GB2526525A - Touch sensing systems - Google Patents

Touch sensing systems

Info

Publication number
GB2526525A
GB2526525A (application GB1406918.1A, also published as GB201406918A)
Authority
GB
United Kingdom
Prior art keywords
pen
touch
optical
message
light
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
GB1406918.1A
Other versions
GB201406918D0 (en)
Inventor
Euan Smith
Paul Richard Routley
Brett Saunders
Julian Hall
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
LIGHT BLUE OPTICS Inc
Original Assignee
LIGHT BLUE OPTICS Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by LIGHT BLUE OPTICS Inc filed Critical LIGHT BLUE OPTICS Inc
Priority to GB1406918.1A
Publication of GB201406918D0
Publication of GB2526525A
Legal status: Withdrawn


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0354 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
    • G06F3/03542 Light pens for emitting or receiving light
    • G06F3/03545 Pens or stylus
    • G06F3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/042 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
    • G06F3/0421 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means by interrupting or reflecting a light beam, e.g. optical touch-screen
    • G06F3/0425 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means using a single imaging device like a video camera for tracking the absolute position of a single or a plurality of objects with respect to an imaged reference surface, e.g. video camera imaging a display or a projection screen, a table or a wall surface, on which a computer generated image is displayed or projected
    • G06F3/0426 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means using a single imaging device like a video camera for tracking the absolute position of a single or a plurality of objects with respect to an imaged reference surface, e.g. video camera imaging a display or a projection screen, a table or a wall surface, on which a computer generated image is displayed or projected, tracking fingers with respect to a virtual keyboard projected or printed on the surface

Abstract

An optical touch sensing system comprises a touch sensor light source to project light above a surface, an optical touch sensor to capture a touch sense signal from a region above said surface, a pen to optically respond to said projected light, and a touch processing system comprising a signal processor. The signal processor identifies a lateral location of the pen on the surface from the optical response. The signal processor comprises a wireless communication system to send a wireless message to the pen, and the pen comprises an electronic controller coupled to a receiver to receive the wireless message from the signal processor. The projected light may be encoded to send the message to the pen. A brightness of the projected light may be modulated to define bright, touch detecting intervals and intervening intervals. The message may be encoded in the bright intervals or in the intervening intervals. The message may comprise an instruction to the pen to perform an action. The instruction may comprise a pen identity instruction. The projected light may define a touch sheet about the surface.

Description

Touch sensing systems
FIELD OF THE INVENTION
This invention relates to pen-based touch sensing systems and methods which, in embodiments, can be used to provide a touch sensing surface just above a whiteboard, a display screen, or the like.
BACKGROUND TO THE INVENTION
Conventional infra-red pen-based electronic whiteboards function on the basis of:
    • A "pen" containing a battery, control electronics and an infra-red LED in the tip, which is turned on when the tip makes contact with the board
    • A camera attached to the board which determines the position of the pen tip LED and converts it to coordinates, which are passed to a computer (for example over a USB interface)
Unfortunately, the need to illuminate the LED continually while the pen is in contact with the board leads to potentially short pen battery life. This is a critical problem since in many environments (e.g. schools) the pens may be used for many hours a day, and battery replacement or recharging may be inconvenient.
In WO2014/029331 we have previously described a new electronic whiteboard architecture using an infra-red pen which enables an extremely long battery life (years) at a low cost. Nonetheless further improvements are still desirable, as well as systems providing greater flexibility.
SUMMARY OF THE INVENTION
According to the present invention there is therefore provided an optical touch sensing system, the system comprising: a touch sensor light source to project light above a surface; an optical touch sensor to capture a touch sense signal from a region above said surface; a pen to optically respond to said projected light; a touch processing system comprising a signal processor, coupled to said optical touch sensor, to process said touch sense signal to identify a lateral location of said pen on said surface from said optical response; wherein said signal processor further comprises a wireless communication system to send a wireless message to said pen; and wherein said pen further comprises an electronic controller coupled to a receiver to receive said wireless message from said signal processor.
In embodiments the optical response is provided by the pen scattering the projected light, but additionally or alternatively the pen may reflect the projected light, in particular retro-reflect it, or detect the projected light and provide an active optical response, for example by illuminating a light emitting device such as an LED (light emitting diode). In the description which follows, references to scattering are intended to encompass reflection, including retro-reflection.
In embodiments, providing a forward channel between the touch processing system and the pen whose location is identified by the system facilitates improved touch detection performance, reduced power consumption and additional functionality, as described further below.
The wireless communication system to send a message to the pen may comprise, but is not limited to, a radio frequency communication system, a near field communication system, an ultrasonic communication system, and an optical communication system (where optical refers to wavelengths in the visible and/or infra-red and/or ultra-violet).
Preferably optical communication is employed. More particularly, in preferred embodiments the communications system is configured to encode the message onto the light projected from the touch sensor light source. The receiver in the pen may then comprise an optical sensor and the electronic controller of the pen may be configured to extract the message from the projected light. Additionally or alternatively, however, a separate optical channel may be provided to the pen, for example by providing an additional light emitting device such as an LED. Depending upon the implementation there can be advantages in decoupling the light curtain (if present) from the forward channel to the pen, for example to provide a more consistent communication path. In such an arrangement the aforementioned touch sensor light source may be viewed as including such an additional light emitting device.
In some preferred embodiments the projected light comprises light defining a touch sheet above the touch surface. This may either be a substantially continuous light sheet or, for example, a touch sheet defined by one or more scanned beams (see, for example, our WO2014/027189). Correspondingly the optical touch sensor may comprise a camera to capture a touch sense image, processed to identify a location of the pen, or a pen location may be determined by extracting a timing signal from a single sensor or group of sensors employed in conjunction with a scanned beam.
In some preferred embodiments the projected light is modulated to define bright, touch detecting intervals and intervening, 'dark' intervals (although as will be described later, these dark intervals may contain signalling information). The optical touch sensor, for example a camera, may then be synchronised with the modulation of the projected light, to improve ambient light rejection - for example the 'shutter' of the camera may be opened selectively whilst light is being projected or, equivalently, light may be projected selectively whilst the camera shutter is open.
In one approach the message sent to the pen is encoded into the bright intervals of the projected light, for example by encoding binary data within a pulse of the projected light. In embodiments this may be achieved by modulating the projected (laser) light on and off to encode a sequence of 1s and 0s. Preferably a DC balanced modulation scheme is employed so that the camera (or other detector) sees the same average brightness from one touch detecting interval to the next (the camera integrating over a touch detection interval). The skilled person will appreciate that there are many different DC balanced modulation schemes which may be employed.
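One well-known DC balanced scheme of the kind contemplated here is Manchester coding, in which every bit spends exactly half its slot with the laser on, so a camera integrating over the whole interval sees constant average power regardless of the data. A minimal sketch (the function names and slot representation are illustrative, not from the patent):

```python
# Sketch: Manchester coding as an example DC balanced modulation.
# Each data bit maps to a pair of half-slot laser states, so the
# on-time is always exactly 50% of the interval.

def manchester_encode(bits):
    """Map each bit to two half-slots: 1 -> (on, off), 0 -> (off, on)."""
    out = []
    for b in bits:
        out.extend([1, 0] if b else [0, 1])
    return out

def manchester_decode(slots):
    """Recover bits from successive half-slot pairs."""
    return [1 if a == 1 and b == 0 else 0
            for a, b in zip(slots[0::2], slots[1::2])]

msg = [1, 0, 1, 1, 0]
slots = manchester_encode(msg)
assert sum(slots) * 2 == len(slots)   # exactly 50% on-time: DC balanced
assert manchester_decode(slots) == msg
```

Because the average on-time is fixed at 50%, the source power can simply be doubled during message-bearing intervals to keep touch detection sensitivity unchanged, as the following paragraph notes.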
Preferably with this modulation scheme the power of the touch sensor (laser) light source is increased to compensate for the average power reduction caused by the modulation.
In another approach, which may be used in conjunction with or instead of the previous approach, the message is encoded in the projected light during the intervening intervals between the touch detecting intervals. In embodiments the pen can sense the projected light and hence can determine the timing of the intervening intervals. These intervening intervals can thus be used to encode a message. The skilled person will be aware that many different encoding techniques may be employed.
In one approach a message is encoded by defining one or more time slots at one or more corresponding fixed time offsets from a touch detecting interval. The presence or absence of projected light during such a time slot may then be employed to communicate information. In another approach, in the intervening times between touch detection intervals a message may be encoded by frequency modulating the projected light, each of a plurality of different frequencies encoding a different respective message. In this context the skilled person will appreciate that references to a message include, at its simplest, a single bit message.
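The time-slot approach above can be sketched as follows. The pen times the end of a bright interval and samples the dark interval at fixed offsets; light present in a slot signals a 1. The specific offsets, slot width and units are illustrative assumptions, not values from the patent:

```python
# Sketch: encoding a message in fixed time slots within the dark
# interval between touch detecting intervals. Offsets are measured
# from the end of the preceding bright interval (hypothetical values).

SLOT_OFFSETS_US = (100, 300, 500)   # illustrative slot offsets, microseconds

def encode_dark_interval(bits, slot_width_us=50):
    """System side: return (start_us, end_us) light pulses to emit."""
    return [(off, off + slot_width_us)
            for off, b in zip(SLOT_OFFSETS_US, bits) if b]

def decode_dark_interval(pulses, slot_width_us=50):
    """Pen side: mark a 1 for each slot in which light was seen."""
    return [1 if any(start <= off < end for start, end in pulses) else 0
            for off in SLOT_OFFSETS_US]

assert decode_dark_interval(encode_dark_interval([1, 0, 1])) == [1, 0, 1]
```

The frequency-modulation variant mentioned above works analogously: the pen measures the pulse rate during the dark interval and maps each of a small set of rates to a different message.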
Preferably the projected light comprises light from a narrowband source such as a light emitting diode or laser; the optical sensor of the pen may then comprise a narrowband filter to selectively detect the projected light.
In some preferred embodiments the system also provides a back channel from the pen to the touch sensing system. This may employ any convenient wireless communication technique including, as previously described, radio frequency, near-field, ultra-sonic and optical communication techniques. The provision of bi-directional communications provides further significant advantages, described later. Thus in embodiments the pen includes a reply transmitter coupled to the electronic controller of the pen and configured to reply to a message sent to the pen by the touch processing system. The skilled person will appreciate that, where references are made to an electronic controller in the pen, this controller may be implemented as one or more separate controllers or integrated circuits in communication with one another.
In embodiments the reply transmitter comprises an optical transmitter, in which case the reply may be synchronised with the modulated light projection, in particular so that the reply is sent during the intervening intervals between the bright, touch detecting intervals. It is preferable in such an arrangement to separate the projected light and back channel light by wavelength and/or time, although this is not essential. In one approach the projected light is at a relatively long wavelength, for example greater than 800nm, 850nm, 875nm, or 900nm (to improve ambient light rejection, because there is less background solar irradiation at these longer wavelengths), whilst the back channel is at a shorter wavelength, in particular at less than 900nm, 875nm, 850nm, or 800nm, because ambient light rejection is less important for the back channel. In embodiments the projected light curtain may be at around 905nm whilst the back channel may be at around 785nm.
In some preferred embodiments a message may comprise an instruction to the pen to perform an action, and the electronic controller of the pen is configured to interpret the message to perform the requested action in response. Such an instruction may comprise an instruction to the pen to reply with status data. Such status data may comprise, for example, data relating to a battery status of the pen and/or data from one or more sensors of the pen, such as a pressure sensor sensing a pressure on the tip of the pen and/or an orientation sensor such as an accelerometer. An orientation sensor may sense the angle at which a pen is being held - for example to enable a pen to be used for electronic calligraphy, such as italics and the like.
Additionally or alternatively to the above described back channel a further back channel may employ a light source associated with the pen at a wavelength visible to the optical touch sensor, in embodiments the camera. In embodiments this may be at substantially the same wavelength as, or at a similar wavelength to, the projected light.
(In principle, the reply transmitter may also operate at such a wavelength, although it is more preferable that a different wavelength is employed). By providing a light source which is visible to the optical touch sensor, the scattered light from the pen can be associated with a signal provided by this additional light source, which can therefore be used to transmit a pen identifying signal, for example a steady illumination, flash or pattern of flashes. In the case of a camera capturing an image of a touch sheet, the image of the scattered light from the pen can be associated with the pen identifying signal in the same touch sense image. In such an arrangement the message sent to the pen may be a message requesting that a pen identify itself. This message may be a message broadcast to all the pens in the system (where multiple pens are present) or a message to be selectively received by a single pen or a group of pens.
Such an arrangement is particularly useful in an optical touch sensing system because the scattered light from a pen can sometimes be occluded by another object closer to the source of projected light than the pen. When a pen reappears from behind such an occlusion there can be ambiguity as to whether the emerging object is the same as the previously occluded object. Whilst continuity of identity is a useful assumption, on occasion another object such as a finger can give rise to a false signal. In a similar manner, when a pen leaves a region defined by the touch sheet, for example because it is moved towards or off the edge of a whiteboard, when the pen re-enters there can again be ambiguity as to whether the re-entering object/pen is the same as the pen which previously left. Thus in embodiments the touch processing system may be configured to send a pen identity instruction in response to loss of detection of a pen and subsequent reacquisition of detection of a pen. This instruction may request the pen, or multiple pens, to transmit their identifying signals, to facilitate tracking of the pen(s) by the touch processing system.
A further difficulty which arises with an optical touch sensing system occurs when a pen is in a 'hover' position. A pen is in hover when the pen is close enough to the touch sheet to be able to detect the adjacent presence of the touch sheet, but not intersecting the touch sheet sufficiently for the pen to scatter light and be detected by the touch sensing system. A pen may announce its presence in hover mode by, for example, flashing every few frames, but in the context of a system attempting to minimise power consumption this can represent a relatively significant power drain.
However without such a signal the pen is invisible to the camera.
The forward communication channel to the pen described above provides a technique which addresses this difficulty: the forward channel can be used to send a message to require a pen to identify its presence and/or a message can be sent to a pen to inform the pen that the touch sensing system has identified that the pen is in hover mode, so that transmission of the signal to the touch sensing system can be stopped. Thus for hover mode a message can request a performance of one or more of the following: start transmitting an optical signal visible to the touch sensing system, stop transmitting an optical signal visible to the touch sensing system, briefly transmit a signal visible to the touch sensing system, send a pen identifying signal to identify the pen to the touch sensing system.
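The hover-mode requests above amount to a small pen-side dispatcher. A minimal sketch of how the pen's electronic controller might act on them; the message codes, class and reply forms are illustrative assumptions, not defined by the patent:

```python
# Sketch: pen-side handling of hover-mode instructions received over
# the forward channel. Message codes here are hypothetical.

START_TX, STOP_TX, PING, SEND_ID = range(4)

class PenController:
    def __init__(self, pen_id):
        self.pen_id = pen_id
        self.transmitting = False   # hover beacon state

    def handle(self, msg):
        """Act on a forward-channel instruction; return any reply."""
        if msg == START_TX:
            self.transmitting = True        # resume hover beacon
        elif msg == STOP_TX:
            self.transmitting = False       # system has seen us: save power
        elif msg == PING:
            return "brief_flash"            # one-shot presence signal
        elif msg == SEND_ID:
            return ("id", self.pen_id)      # identify pen to the system
        return None

pen = PenController(pen_id=0b0101)
pen.handle(STOP_TX)
assert pen.transmitting is False
assert pen.handle(SEND_ID) == ("id", 0b0101)
```

The STOP_TX path captures the power-saving point made above: once the system has confirmed it knows the pen is hovering, the pen can stop its periodic flashing entirely.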
Additionally or alternatively, when a pen has been detected in hover mode, and then two objects are detected intersecting the touch sheet, a message can be sent to query one or both of the objects, in particular requesting that one or both of the objects identify themselves to the touch sensing system. This can be used to distinguish whether/when a pen in hover mode has been applied to the touch sensing surface through the touch sheet, and/or to distinguish a pen from another object, such as a finger. Still further, a request for a pen identifying signal may be employed to determine whether/when a pen has been removed entirely away from the touch sheet (that is, removed further away from the touch sheet than when in hover mode).
In embodiments the message sent to the pen need not identify a particular pen or group of pens for which the message is intended. In other embodiments, however, a message may comprise or consist of a pen identifier. This may be detected by each pen so that the pen for which the message is intended can selectively respond. The skilled person will appreciate that there are many ways in which such a pen identifier may be implemented; in one simple embodiment a binary code is employed to address a pen and the binary code is encoded into the message.
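Such binary addressing can be sketched as a short address field prefixed to the message payload, with a reserved broadcast address for messages intended for all pens. The field widths and broadcast value below are illustrative assumptions:

```python
# Sketch: addressing a pen with a short binary identifier prefixed to
# the message. A reserved address (hypothetical) reaches every pen.

ADDR_BITS = 4
BROADCAST = 0b1111   # illustrative all-pens address

def frame(addr, payload_bits):
    """System side: prepend a 4-bit address (MSB first) to the payload."""
    addr_bits = [(addr >> i) & 1 for i in range(ADDR_BITS - 1, -1, -1)]
    return addr_bits + list(payload_bits)

def accept(frame_bits, my_addr):
    """Pen side: decode the address; return payload if it is for us."""
    addr = 0
    for b in frame_bits[:ADDR_BITS]:
        addr = (addr << 1) | b
    if addr in (my_addr, BROADCAST):
        return frame_bits[ADDR_BITS:]
    return None

f = frame(0b0101, [1, 1, 0])
assert accept(f, 0b0101) == [1, 1, 0]   # addressed pen acts on payload
assert accept(f, 0b0011) is None        # other pens ignore the message
```

With a DC balanced forward channel such as that described earlier, the address bits are simply modulated along with the rest of the message.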
In a related aspect the invention provides a pen for an optical touch sensing system as described above, the pen comprising: a receiver comprising an optical pen sensor; and an electronic controller coupled to said receiver, wherein said electronic controller is configured to extract a message encoded in said projected light.
As previously mentioned, in embodiments the pen may include a reply transmitter, for example a light source, to transmit a reply back to the touch processing system in response to the extracted message. In embodiments the pen may include one or more light sources, such as one or more light emitting diodes. These may be configured to transmit an optical signal in a 'dark' interval between one or more bright, touch-detecting intervals of the touch sensing system and/or to transmit a pen identifier, in particular when the touch sensing system is touch sensing, for example when a camera shutter is open. A reply sent back from the pen may comprise status data as previously described and/or a pen identifying signal, and/or other data.
More generally a pen for use with embodiments of the invention may comprise any of the previously described features of the pen of the optical touch sensing system.
In a further related aspect the invention provides a method of operating an optical touch sensing system, in particular as described above, the method comprising determining information relating to a pen used in the touch sensing system by encoding an instruction to the pen in said projected light.
Again, in embodiments the method may include any of the previously described features/aspects of the invention. Thus the information relating to the pen may comprise an identity of the pen, for example when the pen is in a hover mode and/or has been temporarily lost by the touch sensing system. Additionally or alternatively the information may include status data for the pen, in some preferred embodiments battery status data for the pen.
In a still further related aspect the invention provides a carrier carrying processor control code for an optical touch sensing system, the system comprising: a touch sensor light source to project light above a surface; an optical touch sensor to capture a touch signal from a region above said surface; a pen to optically respond to said projected light; and a touch processing system comprising a signal processor, coupled to said optical touch sensor; and wherein said code comprises stored program code to process said touch sense signal to identify a lateral location of said pen on said surface from said optical response; and wherein the code further comprises code to determine information relating to a pen used in the touch sensing system by encoding an instruction to the pen in said projected light.
Again further features of the above described processor control code may include any of the previously described features of aspects and embodiments of the invention.
The skilled person will appreciate that code and/or data for the invention may comprise source, object or executable code in a conventional programming language, or assembly code, or code for a general purpose computer system or digital signal processor, or code for setting up or controlling an ASIC (application specific integrated circuit) or FPGA (field programmable gate array), or code for a hardware description language such as VHDL (very high speed integrated circuit hardware description language). Additionally or alternatively some or all of the processing may be implemented in hardware (electronic circuitry). As the skilled person will appreciate, code and/or data to implement embodiments of the invention may be distributed between a plurality of coupled components in communication with one another.
The skilled person will appreciate that in this specification references to a 'pen' are to be interpreted broadly. The term 'pen' includes wands and other hand held devices usable for indicating a position on a surface, for example of a white board. Where references are made to the 'nib' of a pen these may correspondingly be replaced by references to a pen tip or equivalents thereof.
BRIEF DESCRIPTION OF THE DRAWINGS
Figures 1a and 1b show, respectively, a vertical cross section view through an example touch sensitive image display device, and details of a sheet of light-based touch sensing system for the device;
Figures 2a and 2b show, respectively, a plan view and a side view of an interactive whiteboard incorporating a touch sensitive image display;
Figures 3a to 3c show, respectively, an embodiment of a touch sensitive image display device, use of a crude peak locator to find finger centroids, and the resulting finger locations;
Figures 4a to 4d show, respectively, a first pen according to an embodiment of the invention showing an LED and photodetector in the pen tip; a second pen according to an embodiment of the invention showing an optical pen design using a waveguide tip; and a third pen according to an embodiment of the invention showing a 905nm IR pen tip with 850nm backchannel LED;
Figure 5 shows an electrical block diagram of a pen according to an embodiment of the invention;
Figures 6a and 6b show, respectively, a functional flow diagram for operation of the nib LED of the pen of Figure 5, and an example state machine for operation of the backchannel LED by the microcontroller (CPU) of the pen of Figure 5;
Figure 7 shows an embodiment of a touch sensitive image display device configured to use the pen of Figures 5 and 6;
Figures 8a to 8c show, respectively, a pulse pattern for the sheet of light of Figure 1, a first example of a modulated pulse pattern for communicating with a pen in the system of Figure 1 according to a first embodiment of the invention, and a second example of a modulated pulse pattern for communicating with a pen according to a second embodiment of the invention;
Figure 9 shows a flow diagram of a procedure implemented by a pen electronic controller according to an embodiment of the invention; and
Figure 10 shows an optical touch sensing system for communicating with a pen, according to an embodiment of the invention.
DETAILED DESCRIPTION OF PREFERRED EMBODIMENTS
We have previously described touch sensing systems employing a plane or sheet of light, for example as shown in Figures 1 and 2. These techniques may be employed for detecting touches or objects proximate to a surface.
Figure 1 shows an example touch sensitive image projection device 100 comprising an image projection module 200 and a touch sensing system 250, 258, 260 in a housing 102. A proximity sensor 104 may be employed to selectively power-up the device on detection of proximity of a user to the device. The image projection module 200 is configured to project downwards and outwards onto a flat surface such as a tabletop; boundaries of the light forming the displayed image 150 are indicated by lines 150a, b.
Suitable image projectors include, but are not limited to, digital micromirror-based projectors such as projectors based on DLP™ (Digital Light Processing) technology from Texas Instruments, Inc., and holographic projectors as described in our previously filed patent applications.
The touch sensing system 250, 258, 260 comprises an infrared laser illumination system 250 configured to project a sheet of infrared light 256 just above the surface of the displayed image 150 (for example 1mm above, although in principle the displayed image could be distant from the touch sensing surface). The laser illumination system 250 may comprise an IR LED or laser 252, collimated then expanded in one direction by light sheet optics 254 such as a cylindrical lens. A CMOS imaging sensor (touch camera) 260 is provided with an IR-pass lens 258 and captures light scattered by touching the displayed image 150, with an object such as a finger, through the sheet of infrared light 256 (the boundaries of the CMOS imaging sensor field of view are indicated by lines 257a, b). The touch camera 260 provides an output to touch detect signal processing circuitry as described further later. These techniques may be employed with any type of image projection system.
Referring now to Figure 2, this shows plan and side views of an example interactive whiteboard touch sensitive image display device 400 incorporating such a system. In the illustrated example there are three IR fan sources 402, 404, 406, each providing a respective light fan 402a, 404a, 406a spanning approximately 120°, together defining a single, continuous sheet of light just above display area 410. The fans overlap on the display area (which is economical as shadowing is most likely in the central region of the display area). Typically such a display area 410 may be of order 1m by 2m. The side view of the system illustrates a combined projector 420 and touch image capture camera 422 either aligned side-by-side or sharing a portion of the projection optics.
The optical path between the projector/camera and display area is folded by a mirror 424. The sheet of light generated by fans 402a, 404a, 406a is preferably close to the display area, for example less than 1cm or 0.5cm above the display area. However the camera and projector 422, 420 are supported on a support 450 and may project light from a distance of up to around 0.5m from the display area.
Example touch sensing system
Referring now to Figure 3a, this shows an embodiment of a touch sensitive image display device 300 according to an aspect of the invention. The system comprises an infra-red laser and optics 250 to generate a plane of light 256 viewed by a touch sense camera 258, 260 as previously described, the camera capturing the scattered light from one or more fingers 301 or other objects interacting with the plane of light. The system also includes an image projector 118, for example a holographic image projector, also as previously described.
In the arrangement of Figure 3a a controller 320 controls the IR laser on and off, controls the acquisition of images by camera 260 and controls projector 118. In the illustrated example images are captured with the IR laser on and off in alternate frames and touch detection is then performed on the difference of these frames to subtract out any ambient infra-red. The image capture optics 258 preferably also include a notch filter at the laser wavelength, which may be around 780-950nm. Because of laser diode process variations and change of wavelength with temperature this notch may be relatively wide, for example of order 20nm, and thus it is desirable to suppress ambient IR. In the embodiment of Figure 3a subtraction is performed by module 302 which, in embodiments, is implemented in hardware (an FPGA).
In embodiments module 302 also performs binning of the camera pixels, for example down to approximately 80 by 50 pixels. This helps reduce the subsequent processing power/memory requirements and is described in more detail later. However such binning is optional, depending upon the processing power available, and even where processing power/memory is limited there are other options, as described further later.
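The ambient subtraction and binning performed by module 302 can be sketched as follows. This is a minimal illustration only: the function name, the 16x16 bin size and the 1280x800 sensor resolution are assumptions chosen so that the output is approximately 80 by 50 blocks, as in the text.

```python
import numpy as np

def difference_and_bin(frame_on, frame_off, bin_y=16, bin_x=16):
    """Subtract the laser-off frame from the laser-on frame to remove
    ambient infra-red, then sum pixels into bins to reduce resolution.
    Frame dimensions are assumed to be multiples of the bin sizes."""
    diff = frame_on.astype(np.int32) - frame_off.astype(np.int32)
    np.clip(diff, 0, None, out=diff)   # negative differences are noise
    h, w = diff.shape
    return diff.reshape(h // bin_y, bin_y, w // bin_x, bin_x).sum(axis=(1, 3))

# e.g. a 1280x800 sensor binned down to an 80x50 grid of blocks
on = np.random.randint(0, 256, (800, 1280), dtype=np.uint8)
off = np.random.randint(0, 256, (800, 1280), dtype=np.uint8)
assert difference_and_bin(on, off).shape == (50, 80)
```

In a real system this step would run in the FPGA; the NumPy version simply shows the arithmetic.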
Following the binning and subtraction the captured image data is loaded into a buffer 304 for subsequent processing to identify the position of a finger or, in a multi-touch system, fingers.
Because the camera 260 is directed down towards the plane of light at an angle it can be desirable to provide a greater exposure time for portions of the captured image further from the device than for those nearer the device. This can be achieved, for example, with a rolling shutter device, under control of controller 320 setting appropriate camera registers.
Depending upon the processing of the captured touch sense images and/or the brightness of the laser illumination system, differencing alternate frames may not be necessary (for example, where 'finger shape' is detected). However where subtraction takes place the camera should have a gamma of substantially unity so that subtraction is performed on a linear signal.
Various different techniques for locating candidate finger/object touch positions will be described. In the illustrated example, however, an approach is employed which detects intensity peaks in the image and then employs a centroid finder to locate candidate finger positions. In embodiments this is performed in software. Processor control code and/or data to implement the aforementioned FPGA and/or software modules shown in Figure 3 (and also to implement the modules described later with reference to Figure 5) may be provided on a disk 318 or another physical storage medium.
Thus in embodiments module 306 performs thresholding on a captured image and, in embodiments, this is also employed for image clipping or cropping to define a touch sensitive region. Optionally some image scaling may also be performed in this module.
Then a crude peak locator 308 is applied to the thresholded image to identify, approximately, regions in which a finger/object is potentially present.
Figure 3b illustrates an example of such a coarse (decimated) grid. In the Figure the spots indicate the first estimate of the centre of mass. We then take a 32x20 (say) grid around each of these. This is preferably used in conjunction with a differential approach to minimise noise, i.e. one frame laser on, the next laser off.
A centroid locator 310 (centre of mass algorithm) is applied to the original (unthresholded) image in buffer 304 at each located peak, to determine a respective candidate finger/object location. Figure 3c shows the results of the fine-grid position estimation, the spots indicating the finger locations found.
The system then applies distortion correction 312 to compensate for keystone distortion of the captured touch sense image and also, optionally, any distortion, such as barrel distortion, from the lens of imaging optics 258. In one embodiment the optical axis of camera 260 is directed downwards at an angle of approximately 70° to the plane of the image and thus the keystone distortion is relatively small, but still significant enough for distortion correction to be desirable.
Because nearer parts of a captured touch sense image may be brighter than further parts, the thresholding may be position sensitive; alternatively position-sensitive scaling may be applied to the image in buffer 304 and a substantially uniform threshold may be applied.
In one embodiment of the crude peak locator 308 the procedure finds a connected region of the captured image by identifying the brightest block within a region (or a block with greater than a threshold brightness), and then locates the next brightest block, and so forth, preferably up to a distance limit (to avoid accidentally performing a flood fill). Centroid location is then performed on a connected region. In embodiments the pixel brightness/intensity values are not squared before the centroid location, to reduce the sensitivity of this technique to noise, interference and the like (which can cause movement of a detected centroid location by more than one pixel).
A simple centre-of-mass calculation is sufficient for the purpose of finding a centroid in a given ROI (region of interest), and the centroid $(x_c, y_c)$ may be estimated thus:

$$x_c = \frac{\sum_{y_s=0}^{Y-1} \sum_{x_s=0}^{X-1} x_s\,R^n(x_s, y_s)}{\sum_{y_s=0}^{Y-1} \sum_{x_s=0}^{X-1} R^n(x_s, y_s)} \qquad y_c = \frac{\sum_{y_s=0}^{Y-1} \sum_{x_s=0}^{X-1} y_s\,R^n(x_s, y_s)}{\sum_{y_s=0}^{Y-1} \sum_{x_s=0}^{X-1} R^n(x_s, y_s)}$$

where $R(x_s, y_s)$ is the pixel brightness, $n$ is the order of the CoM calculation, and $X$ and $Y$ are the sizes of the ROI.
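The centre-of-mass estimate can be expressed directly in code. This is a sketch only: the `centroid` function name and the example ROI are illustrative, and order n=1 matches the un-squared intensities recommended above for noise robustness.

```python
import numpy as np

def centroid(roi, n=1):
    """Order-n centre of mass of a region of interest.
    roi holds the pixel brightnesses R, indexed [y, x]."""
    r = roi.astype(np.float64) ** n
    ys, xs = np.indices(roi.shape)
    total = r.sum()
    return (xs * r).sum() / total, (ys * r).sum() / total

# a 32x20 ROI with a single bright pixel at (x=16, y=10)
roi = np.zeros((20, 32))
roi[10, 16] = 5.0
assert centroid(roi) == (16.0, 10.0)
```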
In embodiments the distortion correction module 312 performs a distortion correction using a polynomial to map between the touch sense camera space and the displayed image space. Say the transformed coordinates from camera space $(x, y)$ into projected space $(x', y')$ are related by the bivariate polynomials:

$$x' = \mathbf{x}^T C_x \mathbf{y} \qquad y' = \mathbf{x}^T C_y \mathbf{y}$$

where $C_x$ and $C_y$ represent polynomial coefficients in matrix form, and $\mathbf{x}$ and $\mathbf{y}$ are the vectorised powers of $x$ and $y$ respectively. Then we may design $C_x$ and $C_y$ such that we can assign a projected space grid location (i.e. memory location) by evaluation of the polynomial:

$$b = \lfloor x' \rfloor + X \lfloor y' \rfloor$$

where $X$ is the number of grid locations in the x-direction in projector space, and $\lfloor \cdot \rfloor$ is the floor operator. The polynomial evaluation may be implemented, say, in Chebyshev form for better precision performance; the coefficients may be assigned at calibration.
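The polynomial mapping and grid-location assignment can be illustrated as follows. This is a sketch under assumed 2x2 coefficient matrices chosen to give an identity-like calibration; real coefficients would be fitted during the calibration procedure, and the evaluation might use Chebyshev form rather than raw powers.

```python
import numpy as np

def camera_to_grid(x, y, Cx, Cy, X_grid):
    """Map a camera-space point (x, y) to a projector-space grid index
    b = floor(x') + X_grid * floor(y') via bivariate polynomials."""
    xv = np.array([x ** i for i in range(Cx.shape[0])])  # vectorised powers of x
    yv = np.array([y ** j for j in range(Cx.shape[1])])  # vectorised powers of y
    x_p = xv @ Cx @ yv    # x' = x^T Cx y
    y_p = xv @ Cy @ yv    # y' = x^T Cy y
    return int(np.floor(x_p)) + X_grid * int(np.floor(y_p))

# identity-like calibration for illustration: x' = x, y' = y
Cx = np.array([[0.0, 0.0], [1.0, 0.0]])   # selects the x term
Cy = np.array([[0.0, 1.0], [0.0, 0.0]])   # selects the y term
assert camera_to_grid(3.2, 2.7, Cx, Cy, X_grid=80) == 3 + 80 * 2
```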
Further background can be found in our published PCT application W02010/073024.
Once a set of candidate finger positions has been identified, these are passed to a module 314 which tracks finger/object positions and decodes actions, in particular to identify finger up/down or present/absent events. In embodiments this module also provides some position hysteresis, for example implemented using a digital filter, to reduce position jitter. In a single touch system module 314 need only decode a finger up/finger down state, but in a multi-touch system this module also allocates identifiers to the fingers/objects in the captured images and tracks the identified fingers/objects.
In general the field of view of the touch sense camera system is larger than the displayed image. To improve robustness of the touch sensing system touch events outside the displayed image area (which may be determined by calibration) may be rejected (for example, using appropriate entries in a threshold table of threshold module 306 to clip the crude peak locator outside the image area).
Ultra-Low-Power Optical Pen for Interactive Whiteboards
The touch technology we describe above employs an infra-red light sheet (of wavelength λ1, e.g. 905nm) disposed above (and substantially parallel to) the board and a camera (with a filter to reject light outside a band around λ1) to detect and locate impingements on the sheet (for example, from fingers), which are translated to touch events. To reject ambient light, the camera exposure period is set very short and the infra-red light sheet is pulsed in synchrony with the camera exposure. As an example, a camera exposure period of 100μs would lead to an infra-red laser duty cycle of 0.6%.
Preferably a duty cycle of the touch detecting interval to the dark interval duration is less than 50%, more preferably less than 10%, 5%, 2% or 1%, shorter duty cycles giving a greater overall power saving for the pen.
Using this touch architecture, one can add improved infra-red pen support as follows.
Using a standard infra-red pen as a base, add to the tip an infra-red photodetector (for example, a photodiode) which detects incident infra-red light (at wavelength λ1), and activates the LED (also at λ1) only if the detected incident IR light level is sufficiently high (in addition to the tip being in contact with the board). In embodiments the light source (LED) operates at the same wavelength as that of the projected light and, using the photodetector, is effectively switched on by the light sheet. Preferably, therefore, this light source is controlled to be extinguished some time after it is initially triggered, for example after a period of one to a few hundred microseconds, to inhibit the system locking up.
The IR photodetector's position relative to the tip should be chosen such that, when the pen is in contact with the board, the infra-red light sheet over the board impinges on the photodetector. As a result, the tip LED will activate not continuously but in synchrony with the infra-red light sheet pulse train, and therefore in synchrony with the camera exposure. The result is that the duty cycle of the LED will be reduced from 100% to around 0.46% (assuming a camera exposure period of 100μs), with no decrease in the signal intensity observed by the camera during its exposure period.
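The duty-cycle saving can be checked with simple arithmetic, using the 100μs exposure mentioned here and the 68 frames-per-second camera rate given later for Figure 8a. Note this gives roughly 0.7%, in the same region as the figures quoted in the text; the exact value depends on the frame rate and exposure chosen.

```python
# Duty cycle of the pen LED when it fires only during the camera
# exposure window, under the assumed figures from the text.
frame_rate = 68.0      # touch camera frames per second
exposure_s = 100e-6    # 100 microsecond camera exposure

frame_period_s = 1.0 / frame_rate            # ~14.7 ms between pulses
duty_cycle = exposure_s / frame_period_s     # fraction of time LED is on
print(f"LED duty cycle: {duty_cycle:.2%}")   # vs 100% for an always-on LED
```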
The pen's power consumption, conventionally dominated by the LED, will fall by several orders of magnitude. As a result, pen battery life will increase concomitantly.
Preferably the pen comprises a tip-touch sensor to sense touch of the tip of the pen onto a surface, for example a whiteboard surface. Then the controller may be configured such that it is activated from a low power state by this signal. Because the touch sheet is close but not coincident with the touch surface, generally 1 to a few mm above this surface, it is preferable that the pen is activated by actual physical touch onto the surface. This is to provide an improved user experience as activation of the pen by the touch sheet slightly away from the physical surface can feel less natural.
Further, this approach facilitates additional power reduction in the pen - for example an interrupt line of the controller, such as a PIC (trade mark) microcontroller, may be employed to wake the controller up from a very low power quiescent state when the pen is touched to the surface. Also for this reason it is preferable that the touch sensor comprises a mechanically-operated electrical switch, as such an approach, by contrast with, say, an active sensor, effectively uses no power. In a simple embodiment such a mechanical switch may comprise a switch to detect touch of the pen tip or tilt of the pen tip away from an axis of the pen - for example the tip of the pen may be mounted on a rocker and/or spring, biased towards a central and/or extended configuration, and thus mechanically operated when the pen tip comes into contact with the surface.
Although in some preferred implementations the pen includes a light source to provide a signal back to the touch detection system, and thus operates as an active signal-sending device, in other arrangements the signal-sending device may be passive, for example comprising a reflector, in particular a diffusing retroreflector in the tip of the pen (diffusing into, say, a 30° cone to facilitate detection). With such an arrangement passive scatter of the touch sheet by the pen signals touch of the pen onto the surface, although preferably the pen also includes a "touch-down" sensor as previously described.
In embodiments the battery lifetime approaches the desired lifetime for the pen itself and the pen may be a sealed, disposable item.
Figure 4a shows one preferred arrangement of a pen 400 in which the pen tip comprises a polished plastic cone with the LED and photodetector attached to the base facet of the cone. In the figure the tip is shown separated from the pen body for clarity - in a real pen they are attached together mechanically. The tip of the cone is roughened by grinding to provide a diffusing surface, so that light from the LED is emitted by the tip over a wide angular range; and/or the tip may comprise a solid (white) plastic diffuser.
Alternatively, an optical waveguide, such as a polished plastic rod, may be employed to couple the light from the IR LED to the tip of a pen 420 (Figure 4b) so that only the tip glows. The tip, as above, may be roughened to provide diffuse emission of the light.
One or more IR photodiodes can be located separately from the waveguide, as shown in Figure 4b, or also coupled into the waveguide. If the photodiode is separate from the tip, then the infra-red light sheet is likely not to impinge upon it directly, but instead the photodiode can be configured to detect scatter of the IR light sheet from the diffusing tip of the waveguide. In embodiments the photodetector sees the light sheet indirectly, seeing scatter from the tip in the light sheet rather than the light sheet directly (potentially via a reflection from the board), and thus the photodetector does not need to be actually in the light sheet.
Doubtless many other possible configurations of IR LED and photodetector within a pen tip will be apparent to those skilled in the art.
In addition to the above, for systems that do not feature an infra-red light sheet (for example, systems that need to support only pen operation, and not touch), a pulsed infra-red LED can be employed in the system in place of the IR light sheet to provide an optical synchronisation signal for the pen, without any hardware changes to the pen.
Optical Backchannel
Instead of just illuminating the LED continuously for, say, 100μs during the camera exposure, a pulse pattern can be employed within (or subsequent to) this 100μs period to encode additional data, to transmit data via an optical backchannel from the pen to an independent photodetector associated with the whiteboard. This additional data can encode, for example, a pen ID, or whether a button on the pen is pressed, to provide additional functionality including multi-pen discrimination.
Additionally or alternatively, instead of the tip LED transmitting the optical backchannel pulse sequence, a second LED (or other signal source such as an RF emitter) can be employed to provide the backchannel - as shown for pen 440 of Figure 4c. If the second LED has, for example, a different wavelength λ2 from the tip LED (λ1), the primary IR position signal (detected by the camera) and the optical backchannel (detected by a separate photodetector) may be separated using appropriate dichroic filters, which may simplify the electronic design and improve robustness. An embodiment of this configuration is shown in Figure 4c.
Thus in some preferred embodiments the pen includes a system for indicating a change of state of the pen back to the touch detection system, for example to indicate one or more of: pen-down, pen-up, operation of a pen button, operation of a pen button in combination with pressing the pen against the surface, and so forth. Preferably the pen comprises a second light source operating at a different wavelength to that of the light sheet and the receiver comprises a corresponding optical detector, for example a photodiode with a narrowband filter. In this way the pen ID/operation/state signal is seen by the photodetector but not by the camera. One example modulation technique encodes a signal for transmission by frequency modulation; this approach automatically provides rejection of ambient light (a DC background) and facilitates use of multiple pens simultaneously, where each pen of a set utilises a distinct frequency.
For example a first set of one or more frequencies may be assigned to a first pen and a second set of one or more frequencies, all different to the first set, may be assigned to a second pen. Conveniently the pen signals may then be substantially simultaneously decoded by conversion to the frequency domain, for example by performing a time-to-frequency domain conversion such as a fast Fourier transform on the encoded signal to read the set of signals from the pens simultaneously.
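The frequency-domain decoding of simultaneously transmitting pens can be sketched as below. This is a minimal illustration: the pen frequencies, sample rate and detection threshold are hypothetical values chosen for the example, not values from the text.

```python
import numpy as np

def decode_pens(signal, sample_rate, pen_freqs, threshold):
    """Identify which pens are transmitting by checking the magnitude of
    each pen's assigned frequency in the photodetector signal spectrum."""
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate)
    active = []
    for pen_id, f in pen_freqs.items():
        bin_idx = np.argmin(np.abs(freqs - f))   # nearest FFT bin
        if spectrum[bin_idx] > threshold:
            active.append(pen_id)
    return active

fs = 100_000                          # 100 kHz sampling of the photodetector
t = np.arange(2048) / fs
sig = np.sin(2 * np.pi * 5000 * t)    # pen "A" transmitting at its 5 kHz tone
pens = {"A": 5000, "B": 8000}
assert decode_pens(sig, fs, pens, threshold=100) == ["A"]
```

Because the frequencies are read from a single transform, several pens can be decoded from the same capture window, as the text describes.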
Optionally the backchannel receiver optics may be configured to accept light only from a narrow angle or region, in particular the space just above the display surface or whiteboard, to improve resilience to other events which may be happening elsewhere in the room. In a similar manner, the backchannel LED(s) in the pens may be configured to direct light preferentially towards the receiver(s), for efficiency. This may be achieved by positioning the LED(s) on one side (the "top" side) of the pen. In another approach a lens is provided over the LED(s) to narrow a cone of emission so that light emission is restricted in a direction away from the display surface or whiteboard and into the room.
Figure 5 shows a schematic block diagram 500 of circuitry for the pens of Figure 4.
Thus this comprises a microcontroller 502, for example a PIC microcontroller, and an electrical switch 504 operated by pressure of the tip onto the surface on which the pen is employed. This switch may comprise, for example, a microswitch and/or one or more pairs of spring contacts. The switch(es) may be operated by direct pressure of the pen tip onto/off the surface of, for example, a whiteboard and/or sideways motion of the pen tip produced by such pressure.
This is connected, in one embodiment, to an interrupt line of controller 502 to wake the controller from an ultra-low power quiescent state. Optionally one or more further user controls 506 are also coupled to controller 502, optionally to one or more additional interrupt lines of the controller. An infrared detector circuit 508 detects light from the infrared touch sheet and provides a corresponding signal to controller 502, which operates to detect this modulated light and to control a first LED 510 to illuminate the pen tip. Optionally a second LED 512 is also provided, coupled to the controller and operable by controls 506, for example using frequency modulation, to provide a back-channel to the touch detection system. A battery 514 powers the system; this may be sealed within the pen and/or rechargeable.
Figure 6 shows a flow diagram of software operating on controller 502 of Figure 5.
Thus when the tip indicates a pen-down condition 602, an interrupt controller circuit/process 604 begins a procedure which identifies whether the light sheet has been detected 606. If the light sheet has not been detected the controller (CPU) is put back into sleep mode 608. If a light sheet is detected then the first LED light source 510 is enabled 610 and, after a timer delay 612, disabled 614, before the controller again returns to sleep mode 608.
In embodiments the pen runs a separate procedure to detect operation of the pen-tip (nib) sensor to identify pen-down/pen-up states and/or operation of one or more user controls (buttons). Thus referring to Figure 6b, if either a change in nib state (650) or a change in button state (652) is detected by an OR function 654, the interrupt controller is alerted, beginning an interrupt handling process which sends a back channel code 660 to the touch detection system. The CPU is then put back to sleep 656.
In a variant of the approach described with reference to Figure 6 the photodiode sensor in the tip of the pen is employed to detect the light curtain and this signal is directed to the interrupt controller to wake up the pen's processor (controller 502). Only at that time are the tip and button states then checked. This approach reduces drain on the pen battery if the tip or button is pressed away from the board, and so improves battery life.
Figure 7 shows a touch detection system 700 with additional features to the block diagram of Figure 3a for processing a signal from a pen of the type shown in Figure 4.
Thus a large area photodetector 702 is provided with a narrow band filter 704 to detect light from the second LED 512 back channel of the pen or pens. This signal is converted into the frequency domain 706 and decoded 708 by identifying the one or more frequencies present to determine pen identifiers and/or pen states, for example a pen-down or pen-up signal. The decode module 708 provides a pen detection signal to the time-correlation block 710, indicatively in communication with touch sense camera 260, although in practice there are many points in the signal processing chain at which the correlation may be implemented (compensating for any processing delays as necessary). The time-correlation module 710 correlates a back channel signal from the pen with identification of a new object in the captured touch sense image to provide a pen detection signal to the touch processing module (touch state machine) 314. In addition the pen ID/state information is also provided to the touch state machine to facilitate providing pen ID/state output data from the system.
Forward channel
Referring to Figure 8a, this shows an example train of pulses 800 of the plane of light 256 of Figure 7. In one embodiment the camera operates at 68 frames per second and thus the pulses have a spacing of approximately 14.7ms. In one embodiment the camera shutter is open for a period of 100μs, and the IR laser is illuminated for a corresponding period.
In preferred embodiments, to send a message to the pen, data encoding the message is modulated onto the pulse train of Figure 8a. In one approach each individual pulse 802 of the pulse train is modulated, as shown schematically in Figure 8b. With this approach preferably a DC-balanced modulation scheme is employed, so that there are equal numbers of ones and zeros in the modulation, and thus the average power of the pulse remains constant from one pulse to the next. The modulation is sufficiently fast to be invisible to the camera, and is averaged over the 100μs period for which the camera shutter is open. In general it is preferable to employ a high peak power pulse with a short exposure window to reduce the effects of ambient light, whilst avoiding powers which would give rise to safety issues. A 100μs window is a useful compromise, short enough to substantially reduce the effects of ambient light, while long enough for the peak power to be sufficiently low to facilitate Class 1 laser safety.
When the pulse is modulated preferably the peak power of the pulse, averaged over the time for which the camera shutter is open, is increased to compensate for the zeros in the modulation.
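One possible DC-balanced scheme of the kind described above is Manchester coding, sketched below. The text does not mandate a particular code; this is merely one example in which every encoded message carries equal numbers of ones and zeros, keeping the average optical power per pulse constant.

```python
def manchester_encode(bits):
    """Each data bit becomes a 10 or 01 chip pair, so the chip stream
    is exactly half ones regardless of the message content."""
    chips = []
    for b in bits:
        chips += [1, 0] if b else [0, 1]
    return chips

def manchester_decode(chips):
    # the first chip of each pair carries the data bit
    return [chips[i] for i in range(0, len(chips), 2)]

msg = [1, 0, 1, 1]
encoded = manchester_encode(msg)
assert sum(encoded) == len(encoded) // 2   # exactly half ones: DC balanced
assert manchester_decode(encoded) == msg
```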
Figure 8c shows another approach to modulating the touch sheet. As illustrated in Figure 8c, a back channel from the pen to the touch system as previously described may be provided by an LED or other light source in a pen. This preferably operates at a different wavelength to the touch sheet, transmitting data from the pen to the touch system during periods 804 between the touch sheet pulses. One or more such periods 804 may be used for the back channel; the pen may derive its timing for the back channel window from the touch sheet pulses 802. In one approach, during one or more back channel windows 804 a first set of bits is encoded by frequency modulation using a first set of frequencies, the first set of bits identifying a pen, and a second set of bits is encoded, optionally in a different back channel time window, by frequency modulation using a second set of frequencies (which may be the same as the first set).
These latter bits encode pen state - for example, using two bits they may encode the states: hover, tip down, button pressed and low battery.
As can be seen from Figure 8c, one or more additional time windows are available between the main touch sheet pulses 802 and the back channel time windows 804, and these can be used for a forward channel to the pen or pens. In one embodiment the time window between a pulse 802 and a window 804 is divided into a number of time slots 806, eight in the illustrated example, and an optical pulse in one or more of these slots is used to encode a message to one or more pens in the system. A message may be broadcast to all pens or one or more individual pens may be addressed - the skilled person will appreciate that with a time slot arrangement as shown in Figure 8c many different types of such addressing may be envisaged. In a similar manner the available time slots also enable one or more messages from a set of messages to be encoded. In one approach one or more binary bits may address a pen and/or define a message; in another approach a message and/or pen may be associated with the presence of a pulse in a defined time slot. The skilled person will appreciate that more complex addressing and message encoding techniques are also possible.
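A slot-based forward-channel encoding of this kind might be sketched as follows. The eight-slot count comes from the illustrated example; the split into three address bits and five message bits is an assumption made purely for illustration, since the text leaves the exact encoding open.

```python
NUM_SLOTS = 8  # slots 806 between a touch-sheet pulse 802 and window 804

def encode_slots(pen_address, message):
    """Pack a 3-bit pen address and a 5-bit message into the eight
    time slots; a '1' means an optical pulse in that slot."""
    bits = [(pen_address >> i) & 1 for i in range(3)]
    bits += [(message >> i) & 1 for i in range(5)]
    return bits

def decode_slots(slots):
    address = sum(b << i for i, b in enumerate(slots[:3]))
    message = sum(b << i for i, b in enumerate(slots[3:]))
    return address, message

slots = encode_slots(pen_address=5, message=17)
assert len(slots) == NUM_SLOTS
assert decode_slots(slots) == (5, 17)
```

A broadcast message could, for instance, reserve one address value for "all pens"; this too is only one of the many addressing schemes the text anticipates.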
The approach of Figure 8c is, in embodiments, preferable to that of Figure 8b.
However the skilled person will recognise that other approaches are also possible -for example using frequency modulation in a similar manner to that previously described for the optical back channel.
A pen to receive the coded message need have no hardware additional to that shown in Figure 5 (noting also that in simple embodiments not all the elements shown in Figure 5 are necessary). Thus, in essence, a pen for use with the system needs some sort of receiver, for example a photodetector (optionally with a narrow band filter matching that of the touch-sensing projected light), and some sort of signal processor, such as controller 502 of Figure 5, to extract the coded message. A suitable controller may, for example, comprise a controller intended for an infra-red remote control product - these can combine significant processing capability with low cost and low power consumption.
Figure 9 illustrates a flow diagram of a procedure which may be implemented in either software or hardware in a pen to extract and respond to a message. Thus at step S900 the procedure/circuitry inputs a light sheet signal (or scanned light signal) and (in a light sheet approach) determines a timing of pulses 802 (step S902). This timing can then be used to extract message data from the light sheet signal (S904), for example as illustrated in Figure 8c. Optionally the pen can then determine whether or not it is the intended recipient of the message (S906) and, if it is, take appropriate action. This action may be, for example, to provide data identifying the pen (S908), for example using an LED operating at or near the same wavelength as the touch sheet as previously described. Additionally or alternatively the pen may reply with status data (S910) using the optical back channel, for example at a different wavelength, again as previously described. Such status data may comprise, for example, battery status data and/or pen orientation data and/or tip pressure data and/or sensor data relating to any other parameter or parameters which may be sensed by the pen either internally or externally.
Referring next to Figure 10, this shows an embodiment of an optical touch sensing system 1000, in which like elements to those of Figure 7 are indicated by like reference numerals. In the system of Figure 10 the controller 320 provides a signal 1004 for modulating the IR laser illumination system 250, for example as described with reference to Figure 8. The controller 320 may also perform additional functions, for example requesting a pen battery status. A reply from the pen may be detected as previously described by photodetector(s) 702. The touch state machine 314 may originate a request for one or more pens to identify themselves, for example where the identity of a tracked pen has temporarily been lost or has become uncertain. This may be, for example, because the pen has been moved away from the writing region of an interactive whiteboard, or has been occluded by an object closer to the IR illumination system 250 than the pen (such as another pen, hand or finger); and/or it may be because two nearby objects appear to the touch sensing system to have merged.
Thus the touch state machine 314 may send a pen identification request 1002 for the controller 320 to send as a message to one or more pens. The skilled person will appreciate that the particular arrangement of touch signal processing of Figure 10 is merely by way of example and that a request for pen identification may in principle originate anywhere in the signal processing chain.
The techniques we have described are particularly useful for implementing an interactive whiteboard but also have advantages in smaller scale touch sensitive displays. No doubt many other effective alternatives will occur to the skilled person. It will be understood that the invention is not limited to the described embodiments and encompasses modifications apparent to those skilled in the art lying within the spirit and scope of the claims appended hereto.

Claims (23)

CLAIMS
1. An optical touch sensing system, the system comprising: a touch sensor light source to project light above a surface; an optical touch sensor to capture a touch sense signal from a region above said surface; a pen to optically respond to said projected light; a touch processing system comprising a signal processor, coupled to said optical touch sensor, to process said touch sense signal to identify a lateral location of said pen on said surface from said optical response; wherein said signal processor further comprises a wireless communication system to send a wireless message to said pen; and wherein said pen further comprises an electronic controller coupled to a receiver to receive said wireless message from said signal processor.
  2. 2. An optical touch sensing system as claimed in claim 1 wherein said communication system is configured to encode said projected light to send said message to said pen, wherein said receiver of said pen comprises an optical pen sensor to sense said encoded projected ight, and wherein said pen is configured to extract said message from said encoded projected light.
  3. 3. An optical touch sensing system as claimed in claim 1 or 2 wherein a brightness of said projected light is modulated to define bright, touch detecting intervals and intervening intervals, and wherein said optical touch sensor and said light projection are synchronised such that said optical touch sensor selectively captures said optical response during said touch detecting intervals and rejects ambient light during said intervening intervals.
  4. 4. An optical touch sensing system as claimed in claim 3 wherein said communication system is configured to encode said message in said bright intervals of said projected light.
  5. 5. An optical touch sensing system as claimed in claim 3 wherein said communication system is configured to encode said message in said projected light during said intervening intervals.
  6. An optical touch sensing system as claimed in claim 3, 4 or 5 wherein said pen further comprises a reply transmitter coupled to said electronic controller and is configured to send a reply back to said touch processing system, in particular during said intervening intervals using said reply transmitter.
  7. An optical touch sensing system as claimed in any preceding claim wherein said message comprises an instruction to said pen to perform an action, and wherein said electronic controller is configured to interpret said message and to perform said action in response to said interpretation.
  8. An optical touch sensing system as claimed in claim 7 when dependent on claim 6 wherein said instruction comprises a pen status instruction, for said pen to send a status reply to said touch processing system, wherein said reply comprises status data for said pen.
  9. An optical touch sensing system as claimed in claim 7 or 8 wherein said pen comprises an identifying light source, visible to said optical touch sensor, coupled to said electronic controller, and wherein said instruction comprises a pen identity instruction, for said pen to send a pen identifying signal to said touch processing system during said touch detecting interval.
  10. An optical touch sensing system as claimed in claim 9 wherein said touch processing system is configured to send said pen identity instruction responsive to loss of detection of a pen and subsequent reacquisition of detection of a pen.
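The loss-then-reacquisition trigger of claim 10 is a small state machine: an identity instruction is issued whenever a detection appears while no pen is currently tracked. A sketch, with hypothetical class and attribute names:

```python
# Illustrative state machine for claim 10: when pen detection is lost and a
# detection later reappears, the touch processing system does not know which
# pen it has reacquired, so it issues a pen-identity instruction.

class TouchTracker:
    def __init__(self):
        self.tracking = False
        self.sent_identity_requests = 0

    def update(self, pen_detected):
        if pen_detected and not self.tracking:
            # Transition from "no pen" to "pen seen": request its identity.
            self.sent_identity_requests += 1
        self.tracking = pen_detected

tracker = TouchTracker()
for detected in [True, True, False, False, True]:
    tracker.update(detected)
print(tracker.sent_identity_requests)  # 2: initial acquisition + reacquisition
```

Continuous tracking generates no further requests; only the edge from lost to detected does, which keeps the optical channel free for touch sensing most of the time.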
  11. An optical touch sensing system as claimed in claim 9 wherein said touch processing system is configured to send said pen identity instruction at intervals to identify when said pen is in a hover position adjacent said projected light but not optically responding to said projected light sufficiently for said touch processing system to determine said lateral location of said pen, and wherein said pen electronic controller is configured to send said pen identifying signal contingent upon said optical pen sensor sensing said projected light.
  12. An optical touch sensing system as claimed in any preceding claim comprising a plurality of said pens each with a respective pen identifier, wherein said message includes a pen identifier, and wherein a pen is configured to selectively respond to a message comprising the pen identifier of the pen.
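The selective addressing of claim 12 can be sketched as a broadcast message that every pen receives but only the addressed pen acts on. The class, field names and message format below are assumptions for illustration:

```python
# Hypothetical sketch of claim 12: each message carries a pen identifier,
# and a pen acts only on messages bearing its own identifier.

class Pen:
    def __init__(self, pen_id):
        self.pen_id = pen_id
        self.last_action = None

    def on_message(self, message):
        """Handle a broadcast message; ignore ones addressed to other pens."""
        if message.get("pen_id") != self.pen_id:
            return False
        self.last_action = message.get("action")
        return True

pens = [Pen(1), Pen(2)]
msg = {"pen_id": 2, "action": "send_status"}
handled = [pen.on_message(msg) for pen in pens]
print(handled)  # [False, True] - only pen 2 responds
```

Because the projected light is inherently a broadcast medium, this per-message identifier is what lets one communication system manage several pens at once.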
  13. An optical touch sensing system as claimed in any preceding claim wherein said signal processor is configured to send a message signalling when said pen has been identified by said touch processing system.
  14. An optical touch sensing system as claimed in claim 13 wherein said pen further comprises a transmitter and is configured to transmit an identifier to a signal receiver of said touch processing system until reception of a message from said touch processing system indicating that said pen has been located by said touch processing system.
  15. An optical touch sensing system as claimed in any preceding claim and wherein said projected light comprises light defining a touch sheet above said surface.
  16. An optical touch sensing system as claimed in claim 15 wherein said optical touch sensor comprises a camera, wherein said touch sense signal comprises a touch sense image, and wherein said touch processing system signal processor is configured to process said touch sense image from said camera to identify said lateral location of said pen.
  17. A pen for the optical touch sensing system of any one of claims 1 to 16, the pen comprising: a receiver comprising an optical pen sensor; and an electronic controller coupled to said receiver, wherein said electronic controller is configured to extract a message encoded in said projected light.
  18. A pen as claimed in claim 17 further comprising a reply transmitter coupled to said electronic controller, wherein said pen is configured to send a reply back to said touch processing system responsive to said extracted message.
  19. A pen as claimed in claim 18 wherein said reply comprises status data and/or a pen identifying signal for said pen.
  20. A method of operating an optical touch sensing system as claimed in any one of claims 1 to 16, the method comprising determining information relating to a pen used in the touch sensing system by encoding an instruction to the pen in said projected light.
  21. A method as claimed in claim 20 wherein said information comprises an identity of the pen, in particular when the pen is in hover mode or has been temporarily lost by the touch sensing system.
  22. A method as claimed in claim 20 or 21 wherein said information comprises status data for the pen, in particular battery status data for the pen.
  23. A carrier carrying processor control code for an optical touch sensing system, the system comprising: a touch sensor light source to project light above a surface; an optical touch sensor to capture a touch sense signal from a region above said surface; a pen to optically respond to said projected light; and a touch processing system comprising a signal processor, coupled to said optical touch sensor; and wherein said code comprises stored program code to process said touch sense signal to identify a lateral location of said pen on said surface from said optical response; and wherein the code further comprises code to determine information relating to a pen used in the touch sensing system by encoding an instruction to the pen in said projected light.
GB1406918.1A 2014-04-17 2014-04-17 Touch sensing systems Withdrawn GB2526525A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB1406918.1A GB2526525A (en) 2014-04-17 2014-04-17 Touch sensing systems


Publications (2)

Publication Number Publication Date
GB201406918D0 GB201406918D0 (en) 2014-06-04
GB2526525A true GB2526525A (en) 2015-12-02

Family

ID=50928904

Family Applications (1)

Application Number Title Priority Date Filing Date
GB1406918.1A Withdrawn GB2526525A (en) 2014-04-17 2014-04-17 Touch sensing systems

Country Status (1)

Country Link
GB (1) GB2526525A (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9977519B2 (en) * 2015-02-25 2018-05-22 Synaptics Incorporated Active pen with bidirectional communication
CN110554810A (en) * 2018-06-01 2019-12-10 佛山市顺德区美的电热电器制造有限公司 touch key method, device and readable storage medium

Citations (4)

Publication number Priority date Publication date Assignee Title
US20130050101A1 (en) * 2011-08-24 2013-02-28 Dexin Corporation Wireless transmission method for touch pen with wireless storage and forwarding capability and system thereof
WO2013108031A2 (en) * 2012-01-20 2013-07-25 Light Blue Optics Limited Touch sensitive image display devices
US20140092069A1 (en) * 2012-09-28 2014-04-03 Izhar Bentov Stylus Communication with Near-Field Coupling
EP2720118A1 (en) * 2012-10-09 2014-04-16 BlackBerry Limited Apparatus and method pertaining to testing a stylus communication path for interference



Similar Documents

Publication Publication Date Title
TWI450159B (en) Optical touch device, passive touch system and its input detection method
CN101663637B (en) Touch screen system with hover and click input methods
US20150248189A1 (en) Touch Sensing Systems
US9703398B2 (en) Pointing device using proximity sensing
KR20110005737A (en) Interactive input system with optical bezel
JP2019512847A (en) Adaptive lighting system for mirror components and method of controlling an adaptive lighting system
CA2722677A1 (en) Interactive input system and pen tool therefor
EP3097466A1 (en) Dynamic assignment of possible channels in a touch sensor
JP2007052025A (en) System and method for optical navigation device having sliding function constituted so as to generate navigation information through optically transparent layer
CA2935282A1 (en) Device and method for operating at mitigated sensitivity in a touch sensitive device
US10558279B2 (en) Dual mode optical navigation device
JP6187067B2 (en) Coordinate detection system, information processing apparatus, program, storage medium, and coordinate detection method
WO2013108031A2 (en) Touch sensitive image display devices
US20160041632A1 (en) Contact detection system, information processing method, and information processing apparatus
US9886105B2 (en) Touch sensing systems
CN102314264A (en) Optical touch screen
GB2526525A (en) Touch sensing systems
US20180039344A1 (en) Coordinate detection apparatus, electronic blackboard, image display system, and coordinate detection method
KR20080007991A (en) Optical pointing apparatus and mobile terminal having the same
TW201337649A (en) Optical input device and input detection method thereof
US10061440B2 (en) Optical touch sensing system, optical touch sensing device and touch detection method thereof
Balaji et al. RetroSphere: Self-Contained Passive 3D Controller Tracking for Augmented Reality
TW201617814A (en) Optical touch screen
CN106325610B (en) Touch control display system, touch device and touch control display method
WO2018214691A1 (en) Optical touch sensing for displays and other applications

Legal Events

Date Code Title Description
732E Amendments to the register in respect of changes of name or changes affecting rights (sect. 32/1977)

Free format text: REGISTERED BETWEEN 20160421 AND 20160428

WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)