GB2499979A - Touch-sensitive image display devices


Info

Publication number
GB2499979A
GB2499979A GB1200968.4A GB201200968A
Authority
GB
United Kingdom
Prior art keywords
image
touch
camera
light
displayed
Prior art date
Legal status
Withdrawn
Application number
GB1200968.4A
Other versions
GB201200968D0 (en)
Inventor
Euan Christopher Smith
Gareth John Mccaughan
Adrian James Cable
Paul Richard Routley
Raul Benet Ballester
Current Assignee
Light Blue Optics Ltd
Original Assignee
Light Blue Optics Ltd
Priority date
Filing date
Publication date
Application filed by Light Blue Optics Ltd filed Critical Light Blue Optics Ltd
Priority to GB1200968.4A priority Critical patent/GB2499979A/en
Publication of GB201200968D0 publication Critical patent/GB201200968D0/en
Priority to PCT/GB2013/050104 priority patent/WO2013108032A1/en
Priority to US14/369,085 priority patent/US20140362052A1/en
Publication of GB2499979A publication Critical patent/GB2499979A/en


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/042 Digitisers characterised by opto-electronic transducing means
    • G06F3/0425 Digitisers using a single imaging device like a video camera for tracking the absolute position of a single or a plurality of objects with respect to an imaged reference surface, e.g. a video camera imaging a display or a projection screen, a table or a wall surface, on which a computer generated image is displayed or projected
    • G06F3/0416 Control or interface arrangements specially adapted for digitisers
    • G06F3/0418 Control or interface arrangements specially adapted for digitisers for error correction or compensation, e.g. based on parallax, calibration or alignment

Abstract

A touch sensitive image display device 100 comprises: an image projector 200 to project a displayed image 150 onto a display surface; a touch sensor light source 250 to project light defining a touch sheet 256 above the displayed image; a camera 258 directed to capture a touch sense image comprising light scattered from the touch sheet by an object approaching the displayed image; and a signal processor to process the touch sense image to identify a location of the object relative to the displayed image. The camera is able to capture a calibration image projected by the image projector, and the device includes a calibration module to calibrate locations in the captured touch sense image with reference to the displayed image. Also disclosed is an image projector configured to project the displayed image as a set of sequential sub-frames. Also disclosed is an image projector that projects onto a surface at an acute angle, and a device that includes distortion correction optics and a dichroic beam splitter. The touch sensor light source may be infrared, and the device may include various filters.

Description

INTELLECTUAL PROPERTY OFFICE
Application No. GB1200968.4
The following terms are registered trademarks and should be read as such wherever they occur in this document:
DLP
Intellectual Property Office is an operating name of the Patent Office www.ipo.gov.uk
TOUCH SENSITIVE IMAGE DISPLAY DEVICES
FIELD OF THE INVENTION
This invention relates to touch sensitive image display devices of the type which project a sheet of light adjacent the displayed image. Embodiments of the invention relate to techniques for calibration and synchronisation between captured touch images and the projected displayed image.
BACKGROUND TO THE INVENTION
Background prior art relating to touch sensing systems employing a plane or sheet of light can be found in US6,281,878 (Montellese), and in various later patents of Lumio/VKB Inc, such as US7,305,368, as well as in similar patents held by Canesta Inc, for example US6,710,770. Broadly speaking these systems project a fan-shaped plane of infrared (IR) light just above a displayed image and use a camera to detect the light scattered from this plane by a finger or other object reaching through to approach or touch the displayed image.
Further background prior art can be found in: WO01/93006; US6650318; US7305368; US7084857; US7268692; US7417681; US7242388 (US2007/222760); US2007/019103; WO01/93006; WO01/93182; WO2008/038275; US2006/187199; US6,614,422; US6,710,770 (US2002021287); US7,593,593; US7599561; US7519223; US7394459; US6611921; USD595785; US6,690,357; US6,377,238; US5767842; WO2006/108443; WO2008/146098; US6,367,933 (WO00/21282); WO02/101443; US6,491,400; US7,379,619; US2004/0095315; US6281878; US6031519; GB2,343,023A; US4384201; DE 41 21 180A; and US2006/244720.
We have previously described techniques for improved touch sensitive holographic displays, for example in our earlier patent applications: WO2010/073024; WO2010/073045; and WO2010/073047.
The inventors have continued to develop and advance touch sensing techniques suitable for use with these and other image display systems. In particular we will describe techniques which synergistically link the camera and image projector.
SUMMARY OF THE INVENTION
According to a first aspect of the invention there is therefore provided a touch sensitive image display device, the device comprising: an image projector to project a displayed image onto a surface in front of the device; a touch sensor light source to project a sheet of light above said displayed image; a camera directed to capture a touch sense image from a region including at least a portion of said sheet of light, said touch sense image comprising light scattered from said sheet of light by an object approaching said displayed image; and a signal processor coupled to said camera, to process a said touch sense image from said camera to identify a location of said object relative to said displayed image; wherein said camera is further able to capture an image projected by said image projector; wherein said image projector is configured to project a calibration image; and wherein said touch sensitive image display device further comprises a calibration module configured to use a camera image, captured by said camera, of said calibration image to calibrate locations in said captured touch sense image with reference to said displayed image.
It is desirable to be able to calibrate positions within a captured touch sense image with respect to the displayed image without the need for user intervention, for example to define particular positions. For example embodiments of the invention address this by enabling the camera to see light from the image projector - although this is not straightforward because in general when capturing a touch sense image it is desirable to suppress both background (typically IR, infrared) illumination from the projector, and background visible light, as much as possible.
In preferred embodiments, therefore, the camera is provided with a filter to suppress light from the displayed image and to allow through only light from the touch sheet. Thus preferably the light defining the touch sheet is substantially monochromatic, for example IR at around 900nm, and this is selected by means of a notch filter. In embodiments, however, this filter is switchable and may be removed from the optical path to the camera, for example mechanically, to enable the camera to "see" the visible light from the image projector and hence auto-calibrate. In preferred implementations, therefore, the system is provided with a calibration module which is configured to control a wavelength-dependent sensitivity of the camera, for example by switching a notch filter in or out, and to control the projector to project a calibration image when the notch filter is removed.
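In software terms, the auto-calibration this enables amounts to fitting a planar homography between camera pixel coordinates and displayed-image coordinates, from correspondences detected between the captured camera image and the known calibration image. The patent does not prescribe an algorithm; the sketch below (Python/NumPy, with purely illustrative point lists) uses the standard direct linear transform:

```python
import numpy as np

def fit_homography(camera_pts, display_pts):
    """Least-squares planar homography (DLT) mapping camera pixel
    coordinates to displayed-image coordinates, given at least four
    non-degenerate point correspondences."""
    A = []
    for (x, y), (u, v) in zip(camera_pts, display_pts):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    # the homography is the right singular vector of the smallest singular value
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]

def camera_to_display(H, pt):
    """Map a touch location from camera pixels into displayed-image
    coordinates using the fitted homography."""
    x, y = pt
    u, v, w = H @ np.array([x, y, 1.0])
    return u / w, v / w
```

Once `H` is fitted during calibration, every touch location found in the touch sense image can be mapped through `camera_to_display` at run time.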
In an alternative approach, described further later, the camera may be controlled so as not to see the displayed image in normal operation by controlling a relative timing of the capturing of the touch sense image and displaying of the projected image. More particularly for many types of projector a colour image is defined by projecting a sequence of colour planes (red, green and blue and potentially white and/or additional colours), modulating these with a common imaging device such as an LCD display or DMD (Digital Micromirror Device). In such a system a natural blanking interval between illumination of the imaging device with the separate colour planes may be exploited to capture a touch sense image and/or such a blanking interval may be extended for a similar purpose. In such a system an IR-selective filter may not be needed although optionally a switchable such filter may nonetheless be incorporated into the optical path to the camera. This can be helpful because in the "blanking intervals" there may still be some IR present.
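The timing relationship can be illustrated by computing camera trigger times that place each exposure inside the blanking gap following a colour plane. The numbers below (a frame split into three equal colour slots) are assumptions for illustration, not timings taken from the patent:

```python
def exposure_schedule(frame_ms, colour_on_ms, n_colours, exposure_ms):
    """Return trigger times (ms from frame start) for touch-camera exposures
    placed in the blanking gap that follows each colour sub-frame."""
    slot = frame_ms / n_colours          # one colour plane plus its gap
    gap = slot - colour_on_ms            # blanking interval per slot
    if exposure_ms > gap:
        raise ValueError(f"{exposure_ms} ms exposure exceeds {gap:.2f} ms gap")
    # trigger at the start of each blanking gap
    return [i * slot + colour_on_ms for i in range(n_colours)]
```

For example, `exposure_schedule(1000 / 60, 4.5, 3, 0.8)` yields three capture triggers per 60 Hz frame, one after each colour plane.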
In a still further approach, again in a system which employs an imaging device sequentially illuminated by different colour planes, for example employing a colour wheel in front of an arc light, the image projector may be modified to include an additional, non-visible (typically IR) illumination option so that if desired the image projector may project a calibration image at substantially the same wavelength as used to generate the touch sheet. In a colour wheel type arrangement this may be achieved by including an additional infrared "colour" but additionally or alternatively the projector may incorporate a switchable IR illumination source able to illuminate the imaging device (and preferably a control arrangement to, at the same time, switch off the visible illumination).
In a still further approach, which may be employed separately or in combination with the above described techniques, the camera may be provided with a spatially patterned wavelength-selective filter so that some portions of the image sensor see visible light for calibration purposes and other portions see non-visible light, typically IR light, scattered from the touch sheet. One example of such a filter is a chequerboard pattern type filter similar to a Bayer filter. This approach is, however, less preferable because there is a loss in both sensitivity and resolution in both the visible and the IR, although potentially the visible-sensitive pixels may also be employed for other purposes, such as ambient light correction. Where a spatially patterned wavelength-selective filter is employed, it can be preferable also to include an anti-aliasing filter before the camera sensor as this helps to mitigate the potential effects of loss of resolution, broadly speaking by blurring small features.
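A chequerboard filter of this kind could be handled in software by splitting each raw frame into its IR and visible samples and filling the missing pixels from a neighbour. The parity convention (IR on even-parity pixels) and the crude left-neighbour fill below are assumptions for illustration:

```python
import numpy as np

def split_checkerboard(raw):
    """Split an interleaved sensor frame into IR and visible images,
    assuming IR-pass filter elements sit on even-parity (x + y even) pixels.
    Missing pixels are filled from the left neighbour (a crude
    nearest-neighbour demosaic; np.roll wraps at the left edge)."""
    h, w = raw.shape
    yy, xx = np.mgrid[0:h, 0:w]
    ir_mask = (yy + xx) % 2 == 0
    ir = np.where(ir_mask, raw, 0.0)
    vis = np.where(ir_mask, 0.0, raw)
    ir_full = ir + np.roll(ir, 1, axis=1) * ~ir_mask
    vis_full = vis + np.roll(vis, 1, axis=1) * ir_mask
    return ir_full, vis_full
```

The halved sampling in each channel is exactly the resolution loss noted above, which the anti-aliasing filter helps to mitigate.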
In some preferred implementations the camera and the image projector share at least part of their front-end image projection/capture optics. This facilitates alignment and helps to maintain calibration, as well as reducing the effects of, for example, different distortion correction being applied to the projected and captured images.
In a related aspect the invention provides a method of calibrating a touch sensitive image display device, the method comprising displaying an image by: projecting a displayed image onto a surface in front of the device using an image projector; projecting a sheet of IR light above said displayed image; capturing a touch sense image from a region including at least a portion of said sheet of light, said touch sense image comprising light scattered from said sheet of light by an object approaching said displayed image using a camera with an IR filter to admit said scattered light and reject light from said displayed image; and processing said touch sense image to identify a location of said object relative to said displayed image; the method further comprising: projecting a calibration image using said image projector; capturing said calibration image using said camera; and calibrating said location of said object with reference to said displayed image using said captured calibration image.
In a further aspect the invention provides a touch sensitive image display device, the device comprising: an image projector to project a displayed image onto a surface in front of the device; a touch sensor light source to project a sheet of light above said displayed image; a camera directed to capture a touch sense image from a region including at least a portion of said sheet of light, said touch sense image comprising light scattered from said sheet of light by an object approaching said displayed image; and a signal processor coupled to said camera, to process a said touch sense image from said camera to identify a location of said object relative to said displayed image; wherein said image projector is configured to project said displayed image as a set of sequential sub-frames, at a sub-frame rate, wherein said sub-frames combine to give the visual impression of said displayed image; and wherein capture of said touch sense images is synchronised to said sub-frame projection.
The sub-frames typically comprise colour planes sequentially illuminating an imaging device such as a liquid crystal display or digital micromirror device (DMD), for example by means of a colour wheel in front of a source of broadband illumination, switched LEDs or lasers or the like. However in the case of an inherently binary imaging device such as a high speed DMD, the sub-frames may include separate binary bit planes for each colour, for example to display sequentially a most significant bit plane down to a least significant bit plane. By synchronising the touch image capture to the sub-frame projection, more particularly so that touch images are captured during a blanking interval between sub-frames, background light interference from the projector can be suppressed.
In a related aspect the invention provides a touch sensitive image display device, the device comprising: an image projector to project a displayed image onto a surface in front of the device; a touch sensor light source to project a sheet of light above said displayed image; a camera directed to capture a touch sense image from a region including at least a portion of said sheet of light, said touch sense image comprising light scattered from said sheet of light by an object approaching said displayed image; and a signal processor coupled to said camera, to process a said touch sense image from said camera to identify a location of said object relative to said displayed image; wherein said image projector is configured to project said displayed image as a set of sequential sub-frames, at a sub-frame rate, wherein said sub-frames combine to give the visual impression of said displayed image; and wherein capture of said touch sense images operates at a frequency different by a factor of at least ten from said sub-frame rate.
In embodiments, by selecting the sub-frame rate and touch image capture rate to be very different frequencies, the detected light interference will vary rapidly at a known frequency dependent on the difference between the two rates. Then, because the frequency of the interference is known, it may be suppressed by filtering, for example during digital signal processing of the captured images.
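One way to realise this suppression is a per-pixel temporal notch at the known (aliased) interference frequency. The minimal FFT-based sketch below takes the beat frequency as a parameter rather than deriving it from real projector timings, which would be device-specific:

```python
import numpy as np

def suppress_interference(frames, capture_rate_hz, beat_hz):
    """frames: (T, H, W) stack of touch sense images. Zero the FFT bin at
    the known interference beat frequency along the time axis, per pixel."""
    n = frames.shape[0]
    spec = np.fft.rfft(frames, axis=0)
    freqs = np.fft.rfftfreq(n, d=1.0 / capture_rate_hz)
    notch = np.abs(freqs - beat_hz) < (capture_rate_hz / n)  # one-bin notch
    spec[notch] = 0.0
    return np.fft.irfft(spec, n=n, axis=0)
```

A real implementation would also notch harmonics and use a window against spectral leakage; the single-bin version shows the principle.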
In a further aspect the invention provides a touch sensitive image display device, the device comprising: an image projector to project a displayed image onto a surface in front of the device; a touch sensor light source to project a plane of light above said displayed image; a camera directed to capture a touch sense image from a region including at least a portion of said plane of light, said touch sense image comprising light scattered from said plane of light by an object approaching said displayed image; and a signal processor coupled to said camera, to process a said touch sense image from said camera to identify a location of said object relative to said displayed image; wherein said image projector is configured to project onto said surface at an acute angle and comprises an imaging device illuminated by a display light source, and distortion correction optics between said imaging device and a light projection output of said image projector; wherein said image capture optics are configured to capture said touch sense image from an acute angle relative to said sheet of light; and wherein an optical path between said imaging device and said distortion correction optics includes a dichroic beam splitter to optically couple said camera into a shared optical path for both said projector and said camera through said distortion correction optics to said light projection output.
In embodiments, sharing part of the front end optical path between the image projector and the camera helps with accurate calibration, although it can potentially increase the level of background light interference from the projector. Thus preferred implementations also include a broadband IR reject filter between the imaging device and the dichroic beam splitter (unless the imaging device is itself illuminated with substantially monochromatic light for each colour). It is further preferable to include a narrowband IR-pass filter between the dichroic beam splitter and the camera. Preferably this latter optical path also includes relay optics comprising a magnifying telescope.
Since in general accurate rendition of the displayed image is more important than precise location of a touch position, preferably the distortion correction optics are optimised, more particularly have a focus optimised, for a visible wavelength, that is in a range 400nm to 700nm. The relay optics, however, may be optimised for the monochromatic IR touch sheet wavelength. For related reasons it may be desirable to duplicate some of the projection optics, in particular the intermediate, aspheric optics between the output distortion correction optics and the imaging device. Thus in embodiments the dichroic beam splitter may be located between these aspheric optics and the output distortion correction optics, and a second set of intermediate, aspheric optics, optimised for the IR touch sheet wavelength, provided between the dichroic beam splitter and the camera.
In some preferred implementations the imaging device is a digital micromirror imaging device (DMD) although other devices, for example a reflective or transmissive LCD display, may also be employed.
In embodiments of the device the image of the scattered light on an image sensor of the camera is defocused. This reduces the effects of laser speckle when laser illumination is used to generate the touch sheet (in embodiments, a plane of light), and also facilitates detection of small touch objects. In embodiments the defocus may be greater along one axis in a lateral plane of the sensor than another, more particularly the defocus may be greater on a vertical axis than on a horizontal axis, where the vertical axis defines a direction of increasing distance from the camera and the horizontal axis a lateral width of the touch sheet. The degree of defocus, that is the extent to which the camera image sensor is displaced away from a focal point or plane, may be greater than 1%, 2%, 5%, 10%, 15% or 20% of the focal length to the camera image sensor. The skilled person will appreciate that this technique may be employed independently of the other, previously described aspects and embodiments of the invention.
Embodiments of each of the above described aspects of the invention may be used in a range of touch-sensing display applications. However embodiments of the invention are particularly useful for large area touch coverage, for example in interactive whiteboard or similar applications.
Embodiments of each of the above described aspects of the invention are not limited to use with any particular type of projection technology. Thus although we will describe later an example of a holographic image projector, the techniques of the invention may also be applied to other forms of projection technology including, but not limited to, digital micromirror-based projectors such as projectors based on DLP™ (Digital Light Processing) technology from Texas Instruments, Inc.
BRIEF DESCRIPTION OF THE DRAWINGS
These and other aspects of the invention will now be further described, by way of example only, with reference to the accompanying figures in which:
Figures 1a and 1b show, respectively, a vertical cross section view through an example touch sensitive image display device suitable for implementing embodiments of the invention, and details of a sheet of light-based touch sensing system for the device;
Figures 2a and 2b show, respectively, a holographic image projection system for use with the device of Figure 1, and a functional block diagram of the device of Figure 1;
Figures 3a to 3e show, respectively, an embodiment of a touch sensitive image display device according to an aspect of the invention, use of a crude peak locator to find finger centroids, and the resulting finger locations;
Figures 4a and 4b show, respectively, a plan view and a side view of an interactive whiteboard incorporating a touch sensitive image display with a calibration system according to an embodiment of the invention; and
Figures 5a to 5d show, respectively, a shared optical configuration for a touch sensitive image display device according to an embodiment of the invention, an alternative shared optical configuration for the device, a schematic illustration of an example of a spatially patterned filter for use in embodiments of the device, and details of a calibration signal processing and control system for the device.
DETAILED DESCRIPTION OF PREFERRED EMBODIMENTS
Figures 1a and 1b show an example touch sensitive holographic image projection device 100 comprising a holographic image projection module 200 and a touch sensing system 250, 258, 260 in a housing 102. A proximity sensor 104 may be employed to selectively power-up the device on detection of proximity of a user to the device.
A holographic image projector is merely described by way of example; the techniques we describe herein may be employed with any type of image projection system.
The holographic image projection module 200 is configured to project downwards and outwards onto a flat surface such as a tabletop. This entails projecting at an acute angle onto the display surface (the angle between a line joining the centre of the output of the projection optics and the middle of the displayed image and a line in a plane of the displayed image is less than 90°). We sometimes refer to projection onto a horizontal surface, conveniently but not essentially non-orthogonally, as "table down projection". A holographic image projector is particularly suited to this application because it can provide a wide throw angle, long depth of field, and substantial distortion correction without significant loss of brightness/efficiency. Boundaries of the light forming the displayed image 150 are indicated by lines 150a, b.
The touch sensing system 250, 258, 260 comprises an infrared laser illumination system (IR line generator) 250 configured to project a sheet of infrared light 256 just above, for example ~1mm above, the surface of the displayed image 150 (although in principle the displayed image could be distant from the touch sensing surface). The laser illumination system 250 may comprise an IR LED or laser 252, preferably collimated, then expanded in one direction by light sheet optics 254, which may comprise a negative or cylindrical lens. Optionally light sheet optics 254 may include a 45 degree mirror adjacent the base of the housing 102 to fold the optical path to facilitate locating the plane of light just above the displayed image.
A CMOS imaging sensor (touch camera) 260, provided with an IR-pass lens 258, captures light scattered by touching the displayed image 150, with an object such as a finger, through the sheet of infrared light 256. The boundaries of the CMOS imaging sensor field of view are indicated by lines 257a, b. The touch camera 260 provides an output to touch detect signal processing circuitry as described further later.
Example holographic image projection system
Figure 2a shows an example holographic image projection system architecture 200 in which the SLM may advantageously be employed. The architecture of Figure 2a uses dual SLM modulation - low resolution phase modulation and higher resolution amplitude (intensity) modulation. This can provide substantial improvements in image quality, power consumption and physical size. The primary gain of holographic projection over imaging is one of energy efficiency. Thus the low spatial frequencies of an image can be rendered holographically to maintain efficiency and the high-frequency components can be rendered with an intensity-modulating imaging panel, placed in a plane conjugate to the hologram SLM. Effectively, diffracted light from the hologram SLM device (SLM1) is used to illuminate the imaging SLM device (SLM2). Because the high-frequency components contain relatively little energy, the light blocked by the imaging SLM does not significantly decrease the efficiency of the system, unlike in a conventional imaging system. The hologram SLM is preferably a fast multi-phase device, for example a pixellated MEMS-based piston actuator device.
In Figure 2a:
    • SLM1 is a pixellated MEMS-based piston actuator SLM as described above, to display a hologram - for example a 160 x 160 pixel device with physically small lateral dimensions, e.g. <5 mm or <1 mm.
• L1, L2 and L3 are collimation lenses (optional, depending upon the laser output) for respective Red, Green and Blue lasers.
    • M1, M2 and M3 are dichroic mirrors implemented as a prism assembly.
    • M4 is a turning beam mirror.
    • SLM2 is an imaging SLM and has a resolution at least equal to the target image resolution (e.g. 854 x 480); it may comprise an LCOS (liquid crystal on silicon) or DMD (Digital Micromirror Device) panel.
    • Diffraction optics 210 comprises lenses LD1 and LD2, forms an intermediate image plane on the surface of SLM2, and has effective focal length f such that fλ/Δ (where λ is the illumination wavelength and Δ the pixel pitch of SLM1) covers the active area of imaging SLM2. Thus optics 210 perform a spatial Fourier transform to form a far field illumination pattern in the Fourier plane, which illuminates SLM2.
    • PBS2 (Polarising Beam Splitter 2) transmits incident light to SLM2, and reflects emergent light into the relay optics 212 (liquid crystal SLM2 rotates the polarisation by 90 degrees). PBS2 preferably has a clear aperture at least as large as the active area of SLM2.
• Relay optics 212 relay light to the diffuser D1.
• M5 is a beam turning mirror.
    • D1 is a diffuser to reduce speckle.
    • Projection optics 214 project the object formed on D1 by the relay optics 212, and preferably provide a large throw angle, for example >90°, for angled projection down onto a table top (the design is simplified by the relatively low étendue from the diffuser).
The different colours are time-multiplexed and the sizes of the replayed images are scaled to match one another, for example by padding a target image for display with zeros (the field size of the displayed image depends upon the pixel size of the SLM not on the number of pixels in the hologram).
A system controller and hologram data processor 202, implemented in software and/or dedicated hardware, inputs image data and provides low spatial frequency hologram data 204 to SLM1 and higher spatial frequency intensity modulation data 206 to SLM2. The controller also provides laser light intensity control data 208 to each of the three lasers. For details of an example hologram calculation procedure reference may be made to WO2010/007404 (hereby incorporated by reference).
Control system
Referring now to Figure 2b, this shows a block diagram of the device 100 of Figure 1. A system controller 110 is coupled to a touch sensing module 112 from which it receives data defining one or more touched locations on the display area, either in rectangular or in distorted coordinates (in the latter case the system controller may perform keystone distortion compensation). The touch sensing module 112 in embodiments comprises a CMOS sensor driver and touch-detect processing circuitry.
The system controller 110 is also coupled to an input/output module 114 which provides a plurality of external interfaces, in particular for buttons, LEDs, optionally a USB and/or Bluetooth (RTM) interface, and a bi-directional wireless communication interface, for example using WiFi (RTM). In embodiments the wireless interface may be employed to download data for display either in the form of images or in the form of hologram data. In an ordering/payment system this data may include price data for price updates, and the interface may provide a backhaul link for placing orders, handshaking to enable payment and the like. Non-volatile memory 116, for example
Flash RAM, is provided to store data for display, including hologram data, as well as distortion compensation data, and touch sensing control data (identifying regions and associated actions/links). Non-volatile memory 116 is coupled to the system controller and to the I/O module 114, as well as to an optional image-to-hologram engine 118 as previously described (also coupled to system controller 110), and to an optical module controller 120 for controlling the optics shown in Figure 2a. (The image-to-hologram engine is optional as the device may receive hologram data for display from an external source). In embodiments the optical module controller 120 receives hologram data for display and drives the hologram display SLM, as well as controlling the laser output powers in order to compensate for brightness variations caused by varying coverage of the display area by the displayed image (for more details see, for example, our WO2008/075096). In embodiments the laser power(s) is(are) controlled dependent on the "coverage" of the image, with coverage defined as the sum of the image pixel values, preferably each raised to a power gamma (where gamma is typically 2.2). The laser power is inversely dependent on (but not necessarily inversely proportional to) the coverage; in preferred embodiments a lookup table is employed to apply a programmable transfer function between coverage and laser power. The hologram data stored in the non-volatile memory, optionally received by interface 114, therefore in embodiments comprises data defining a power level for one or each of the lasers together with each hologram to be displayed; the hologram data may define a plurality of temporal holographic subframes for a displayed image. Preferred embodiments of the device also include a power management system 122 to control battery charging, monitor power consumption, invoke a sleep mode and the like.
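The coverage computation is simple to express in code. The sketch below assumes pixel values already normalised to [0, 1], uses the mean rather than the raw sum (a normalisation assumption so that coverage is image-size independent), and takes a small illustrative lookup table; the real transfer function is programmable and device-specific:

```python
import numpy as np

def laser_drive(image, lut, gamma=2.2):
    """Coverage is the mean of pixel values raised to gamma (a proxy for the
    summed optical output of the displayed image); the LUT maps normalised
    coverage in [0, 1] to a drive level and is chosen to fall as coverage
    rises (inverse dependence, not necessarily inverse proportion)."""
    coverage = float(np.mean(np.asarray(image, dtype=float) ** gamma))
    idx = int(round(coverage * (len(lut) - 1)))
    return lut[idx]
```

A descending table such as `[100, 80, 60, 40, 20]` then yields full drive for a mostly dark image and minimum drive for a fully lit one.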
In operation the system controller controls loading of the image/hologram data into the non-volatile memory, conversion of image data to hologram data where necessary, loading of the hologram data into the optical module, and control of the laser intensities. The system controller also performs distortion compensation, controls which image to display and when, controls how the device responds to different "key" presses, and includes software to keep track of a state of the device. The controller is also configured to transition between states (images) on detection of touch events with coordinates in the correct range, a detected touch triggering an event such as display of another image and hence a transition to another state. The system controller 110 also, in embodiments, manages price updates of displayed menu items, and optionally payment, and the like.
Touch Sensing Systems
Referring now to Figure 3a, this shows an embodiment of a touch sensitive image display device 300 according to an aspect of the invention. The system comprises an infra red laser and optics 250 to generate a plane of light 256 viewed by a touch sense camera 258, 260 as previously described, the camera capturing the scattered light from one or more fingers 301 or other objects interacting with the plane of light. The system also includes an image projector 118, for example a holographic image projector, also as previously described, to project an image typically generally in front of the device, in embodiments generally downwards at an acute angle to a display surface.
In the arrangement of Figure 3a a controller 320 controls the IR laser on and off, controls the acquisition of images by camera 260 and controls projector 118. In the illustrated example images are captured with the IR laser on and off in alternate frames and touch detection is then performed on the difference of these frames to subtract out any ambient infra red. The image capture optics 258 preferably also include a notch filter at the laser wavelength, which may be around 780-800 nm. Because of laser diode process variations and change of wavelength with temperature this notch may be relatively wide, for example of order 20 nm, and thus it is desirable to suppress ambient IR. In the embodiment of Figure 3a subtraction is performed by module 302 which, in embodiments, is implemented in hardware (an FPGA).
In embodiments module 302 also performs binning of the camera pixels, for example down to approximately 80 by 50 pixels. This helps reduce the subsequent processing power/memory requirements and is described in more detail later. However such binning is optional, depending upon the processing power available, and even where processing power/memory is limited there are other options, as described further later. Following the binning and subtraction the captured image data is loaded into a buffer 304 for subsequent processing to identify the position of a finger or, in a multi-touch system, fingers.
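The ambient-subtraction and binning steps above can be sketched in a few lines. This is a software model of processing the description assigns to FPGA hardware; the function names, the clamp at zero, and the example bin factors are illustrative assumptions.

```python
def difference_frame(frame_on, frame_off):
    """Subtract the laser-off frame from the laser-on frame to remove
    ambient IR; clamp at zero since intensities cannot be negative."""
    return [[max(a - b, 0) for a, b in zip(ra, rb)]
            for ra, rb in zip(frame_on, frame_off)]

def bin_pixels(frame, fy, fx):
    """Sum fy x fx blocks of camera pixels into one binned pixel, e.g.
    reducing a full camera frame down towards ~80 x 50 binned pixels."""
    h, w = len(frame), len(frame[0])
    return [[sum(frame[y + dy][x + dx]
                 for dy in range(fy) for dx in range(fx))
             for x in range(0, w, fx)]
            for y in range(0, h, fy)]
```

For subtraction to be meaningful the camera response must be linear, which is why the description asks for a gamma of substantially unity.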
Because the camera 260 is directed down towards the plane of light at an angle it can be desirable to provide a greater exposure time for portions of the captured image
further from the device than for those nearer the device. This can be achieved, for example, with a rolling shutter device, under control of controller 320 setting appropriate camera registers.
Depending upon the processing of the captured touch sense images and/or the brightness of the laser illumination system, differencing alternate frames may not be necessary (for example, where 'finger shape' is detected). However where subtraction takes place the camera should have a gamma of substantially unity so that subtraction is performed with a linear signal.
Various different techniques for locating candidate finger/object touch positions will be described. In the illustrated example, however, an approach is employed which detects intensity peaks in the image and then employs a centroid finder to locate candidate finger positions. In embodiments this is performed in software. Processor control code and/or data to implement the aforementioned FPGA and/or software modules shown in Figure 3 (and also to implement the modules described later with reference to Figure 5) may be provided on a disk 318 or another physical storage medium.
Thus in embodiments module 306 performs thresholding on a captured image and, in embodiments, this is also employed for image clipping or cropping to define a touch sensitive region. Optionally some image scaling may also be performed in this module. Then a crude peak locator 308 is applied to the thresholded image to identify, approximately, regions in which a finger/object is potentially present.
Figure 3b illustrates an example of such a coarse (decimated) grid. In the Figure the spots indicate the first estimation of the centre-of-mass. We then take a 32x20 (say) grid around each of these. This is preferably used in conjunction with a differential approach to minimise noise, i.e. one frame laser on, the next laser off.
A centroid locator 310 (centre of mass algorithm) is applied to the original (unthresholded) image in buffer 304 at each located peak, to determine a respective candidate finger/object location. Figure 3c shows the results of the fine-grid position estimation, the spots indicating the finger locations found.
The system then applies distortion correction 312 to compensate for keystone distortion of the captured touch sense image and also, optionally, any distortion, such as barrel distortion, from the lens of imaging optics 258. In one embodiment the optical axis of camera 260 is directed downwards at an angle of approximately 70° to the plane of the image and thus the keystone distortion is relatively small, but still significant enough for distortion correction to be desirable.
Because nearer parts of a captured touch sense image may be brighter than further parts, the thresholding may be position sensitive (set at a higher level for the nearer image parts); alternatively position-sensitive scaling may be applied to the image in buffer 304 and a substantially uniform threshold may be applied.
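The position-sensitive scaling option can be illustrated as a simple row-wise gain ramp applied before a uniform threshold, on the assumption that camera rows correspond to distance from the device. The function name, the linear ramp, and the gain values are illustrative assumptions only.

```python
def scale_rows(frame, near_gain=1.0, far_gain=2.0):
    """Apply a row-wise gain ramp so dimmer, farther rows are boosted
    before a substantially uniform threshold is applied. Row 0 is taken
    to be nearest the device (an assumption for this sketch)."""
    h = len(frame)
    return [[p * (near_gain + (far_gain - near_gain) * y / max(h - 1, 1))
             for p in row]
            for y, row in enumerate(frame)]
```

An equivalent alternative, per the text, is to keep the image untouched and make the threshold itself position dependent.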
In one embodiment of the crude peak locator 308 the procedure finds a connected region of the captured image by identifying the brightest block within a region (or a block with greater than a threshold brightness), and then locates the next brightest block, and so forth, preferably up to a distance limit (to avoid accidentally performing a flood fill). Centroid location is then performed on a connected region. In embodiments the pixel brightness/intensity values are not squared before the centroid location, to reduce the sensitivity of this technique to noise, interference and the like (which can cause movement of a detected centroid location by more than one pixel).
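One possible reading of the crude peak locator is sketched below: seed at the brightest above-threshold block, grow the connected region to bright neighbours, and stop growing beyond a radius from the seed to avoid an unbounded flood fill. The function name, the 4-connectivity, and the radius limit form are assumptions of this sketch.

```python
def crude_peak_regions(blocks, threshold, max_radius=4):
    """Find connected bright regions in a coarse block grid.

    Repeatedly seeds at the brightest unvisited block above threshold and
    floods outwards to above-threshold neighbours, but only within
    max_radius of the seed (the distance limit mentioned in the text)."""
    h, w = len(blocks), len(blocks[0])
    visited, regions = set(), []
    while True:
        seed = max(((blocks[y][x], (y, x))
                    for y in range(h) for x in range(w)
                    if (y, x) not in visited), default=None)
        if seed is None or seed[0] < threshold:
            break
        _, (sy, sx) = seed
        region, stack = [], [(sy, sx)]
        while stack:
            y, x = stack.pop()
            if (y, x) in visited or blocks[y][x] < threshold:
                continue
            if abs(y - sy) > max_radius or abs(x - sx) > max_radius:
                continue  # distance limit: do not flood-fill indefinitely
            visited.add((y, x))
            region.append((y, x))
            stack.extend((y + dy, x + dx)
                         for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1))
                         if 0 <= y + dy < h and 0 <= x + dx < w)
        regions.append(region)
    return regions
```

Each returned region would then be handed to the centroid locator operating on the unthresholded image.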
A simple centre-of-mass calculation is sufficient for the purpose of finding a centroid in a given ROI (region of interest) of the image R(x, y); the centroid $(\bar{x}, \bar{y})$ may be estimated thus:

$$\bar{x} = \frac{\sum_{y_s=0}^{Y-1}\sum_{x_s=0}^{X-1} x_s\,R^n(x_s, y_s)}{\sum_{y_s=0}^{Y-1}\sum_{x_s=0}^{X-1} R^n(x_s, y_s)} \qquad \bar{y} = \frac{\sum_{y_s=0}^{Y-1}\sum_{x_s=0}^{X-1} y_s\,R^n(x_s, y_s)}{\sum_{y_s=0}^{Y-1}\sum_{x_s=0}^{X-1} R^n(x_s, y_s)}$$

where n is the order of the CoM calculation, and X and Y are the sizes of the ROI.
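The order-n centre-of-mass above translates directly into code; a minimal sketch (function name illustrative) follows. Per the earlier remark that pixel values are preferably not squared, n = 1 is the default here.

```python
def centroid(roi, n=1):
    """Order-n centre-of-mass over a region of interest, per the formulas
    above: each pixel value R(xs, ys) is raised to the power n and used
    as a weight for its coordinates."""
    num_x = num_y = denom = 0.0
    for ys, row in enumerate(roi):
        for xs, r in enumerate(row):
            w = r ** n
            num_x += xs * w
            num_y += ys * w
            denom += w
    return (num_x / denom, num_y / denom)
```

Higher n sharpens the weighting towards bright pixels but, as the text notes, increases sensitivity to noise and interference.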
In embodiments the distortion correction module 312 performs a distortion correction using a polynomial to map between the touch sense camera space and the displayed image space. Say the transformed coordinates from camera space $(x, y)$ into projected space $(x', y')$ are related by the bivariate polynomials $x' = \mathbf{x}\,C_x\,\mathbf{y}^T$ and $y' = \mathbf{x}\,C_y\,\mathbf{y}^T$, where $C_x$ and $C_y$ represent polynomial coefficients in matrix form, and $\mathbf{x}$ and $\mathbf{y}$ are the vectorised powers of $x$ and $y$ respectively. Then we may design $C_x$ and $C_y$ such that we can assign a projected space grid location (i.e. memory location) by evaluation of the polynomial:

$$b = \lfloor x' \rfloor + X \lfloor y' \rfloor$$

where $X$ is the number of grid locations in the x-direction in projector space, and $\lfloor\cdot\rfloor$ is the floor operator. The polynomial evaluation may be implemented, say, in Chebyshev form for better precision performance; the coefficients may be assigned at calibration. Further background can be found in our published PCT application WO2010/073024.
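A direct (monomial-basis) evaluation of this mapping can be sketched as below; a production implementation might use Chebyshev form instead, as the text suggests. The function names and the identity-mapping coefficients in the usage are illustrative assumptions.

```python
from math import floor

def poly_map(x, y, cx, cy):
    """Evaluate the bivariate polynomials x' = x^T Cx y and y' = x^T Cy y,
    where cx/cy are coefficient matrices and the implicit vectors hold
    rising powers of the camera-space coordinates x and y."""
    def bivariate(c):
        return sum(c[i][j] * (x ** i) * (y ** j)
                   for i in range(len(c)) for j in range(len(c[0])))
    return bivariate(cx), bivariate(cy)

def grid_location(xp, yp, grid_width):
    """Projected-space grid/memory index b = floor(x') + X * floor(y')."""
    return floor(xp) + grid_width * floor(yp)
```

With identity coefficients (cx picking out x, cy picking out y) a camera point maps to itself, and the grid index follows from the floor formula.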
Once a set of candidate finger positions has been identified, these are passed to a module 314 which tracks finger/object positions and decodes actions, in particular to identify finger up/down or present/absent events. In embodiments this module also provides some position hysteresis, for example implemented using a digital filter, to reduce position jitter. In a single touch system module 314 need only decode a finger up/finger down state, but in a multi-touch system this module also allocates identifiers to the fingers/objects in the captured images and tracks the identified fingers/objects.
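One simple digital filter that damps position jitter in the spirit of the hysteresis described above is a single-pole smoother: each reported point moves only a fraction of the way towards the newest raw detection. The function name and the value of alpha are illustrative assumptions, not the specified implementation.

```python
def smooth_positions(positions, alpha=0.3):
    """One-pole IIR smoothing of a stream of (x, y) detections: the
    reported position moves a fraction alpha towards each new raw point,
    trading a little lag for much reduced jitter."""
    out, current = [], None
    for p in positions:
        if current is None:
            current = p  # first detection passes straight through
        else:
            current = tuple(c + alpha * (v - c) for c, v in zip(current, p))
        out.append(current)
    return out
```

In a multi-touch system a filter like this would run per tracked identifier.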
In general the field of view of the touch sense camera system is larger than the displayed image. To improve robustness of the touch sensing system, touch events outside the displayed image area (which may be determined by calibration) may be rejected (for example, using appropriate entries in a threshold table of threshold module 306 to clip the crude peak locator outside the image area).
Auto-calibration, synchronisation and optical techniques
We will now describe embodiments of various techniques for use with a touch sensitive display device, for example of the general type described above. The skilled person will appreciate that the techniques we will describe may be employed with any type of image projection system, not just the example holographic image projection system of Figure 2.
Thus referring first to Figure 4a, this shows a plan view of an interactive whiteboard touch sensitive image display device 400 including a movement compensation system according to an embodiment of the invention. Figure 4b shows a side view of the device.
As illustrated there are three IR fan sources 402, 404, 406, each providing a respective light fan 402a, 404a, 406a spanning approximately 120° (for example) and together defining a single, continuous sheet of light just above display area 410. The fans overlap on display area 410, central regions of the display area being covered by three fans and more peripheral regions by two fans or just one fan. This is economical as shadowing is most likely in the central region of the display area. Typical dimensions of the display area 410 may be of order 1m by 2m. The side view of the system illustrates a combined projector 420 and touch image capture camera 422, either aligned side-by-side or sharing at least an output portion of the projection optics. As illustrated, in embodiments the optical path between the projector/camera and display area is folded by a mirror 424. The sheet of light generated by fans 402a, 404a, 406a is preferably close to the display area, for example less than 1cm or 0.5cm above the display area. However the camera and projector 422, 420 are supported on a support 450 and may project light from a distance of up to around 0.5m from the display area.
We first describe auto-calibration using a calibration pattern projected from the projector. The projector itself can project a pattern containing identifiable features in known locations. Examples include a grid of lines, randomly positioned dots, dots in the corners of the image, single dots or lines, crosshairs, and other static or time-varying patterns or structures. If the camera 258, 260 can see this pattern then the system can use this for calibration without any need for manual referencing by the user.
Such auto-calibration may be performed, for example: (1) when an explicit calibration operation is requested by the user; and/or (2) when an explicit calibration operation is
triggered by, for example, system startup or shutdown or a long period of inactivity or some automatically-gathered evidence of poor calibration; and/or (3) at regular intervals; and/or (4) effectively continuously.
When implementing this technique the camera is made able to see the light the projector emits. In normal operation the system aims to remove IR from the projector's output and to remove visible light from the camera's input. One or other of these may be temporarily deactivated for auto-calibration. This may be done (a) by physically moving a filter out of place (and optionally swapping in a different filter instead) when calibration is being done; and/or (b) by having a filter or filters move in and out of use all the time, for example using the projector's colour wheel or a second "colour wheel" applied to the camera; and/or (c) by providing the camera with a Bayer-like filter (Figure 5c) where some pixels see IR and some pixels see visible light. Such a filter may be combined with an anti-aliasing filter, for example similar to those in consumer digital cameras, so that small features are blurred rather than arbitrarily either seen at full brightness or missed depending on their location relative to the IR/visible filter.
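Where a Bayer-like IR/visible filter is used, the processing must first separate the two pixel populations from a single sensor frame. The sketch below assumes a simple checkerboard mosaic (even-parity pixels see IR, odd-parity see visible); the actual layout of filter 530 is not specified by the text and this pattern, like the function name, is an illustrative assumption.

```python
def split_bayer_like(frame):
    """Split a sensor frame behind a checkerboard IR/visible filter into
    two half-resolution images: one of IR-seeing pixels (even parity of
    x+y, assumed) and one of visible-seeing pixels (odd parity)."""
    ir, vis = [], []
    for y, row in enumerate(frame):
        ir.append([p for x, p in enumerate(row) if (x + y) % 2 == 0])
        vis.append([p for x, p in enumerate(row) if (x + y) % 2 == 1])
    return ir, vis
```

The visible half-image is what the auto-calibration would use to locate the projected calibration pattern, while the IR half-image continues to serve touch detection.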
It is also desirable to share at least a portion of the optical path between the imaging optics (projection lens) and the touch camera optics. Such sharing matches distortion between image output and touch input and reduces the need for cross-calibration between input and output, since both (sharing optics) are subject to substantially the same optical distortion.
Referring now to Figure 5a, this shows an embodiment of a touch sensitive image display device 500 arranged to implement an auto-calibration procedure as described above. In the illustrated example an arc lamp 502 provides light via a colour wheel 504 and associated optics 506a, b to a digital micromirror device 508. The colour wheel 504 sequentially selects, for example, red, green, blue and white, but may be modified to include an IR "colour" and/or to increase the blanking time between colours by increasing the width of the separators 504a. In other arrangements switched, substantially monochromatic laser or LED illumination is employed instead. The colour selected by colour wheel 504 (or switched to illuminate the DMD 508) is known by the projector controller but, optionally, a rotation sensor may also be attached to wheel 504 to provide a rotation signal output 504b. A DMD is a binary device and thus each
colour is built up from a plurality of sub-frames, one for each significant bit position of the displayed image.
The projector is configured to illuminate the display surface at an acute angle, as illustrated in Figure 5b, and thus the output optics include front end distortion correction optics 510 and intermediate, aspheric optics 512 (with a fuzzy intermediate image in between). The output optics 510, 512 enable short-throw projection onto a surface at a relatively steep angle.
Although the touch sense camera 258, 260 may simply be located alongside the output optics, preferably the camera is integrated into the projector by means of a dichroic beam splitter 514 located after DMD 508, which dumps IR from lamp 502 and directs incoming IR scattered from the sheet of light into sensor 260 of the touch sense camera via relay optics 516 which magnify the image (because the sensor 260 is generally smaller than the DMD device 508).
The dichroic beam splitter 514 is provided with a substantially non-absorbing dielectric coating, but preferably the system incorporates additional filtering, more particularly a broadband IR reject filter 518 and a notch IR pass filter 520 to filter out unwanted IR from the exterior of the projector/camera system.
Lamp 502 is typically a mercury discharge lamp and thus emits a significant proportion of IR light. This can interfere with the touch detection in two ways: light is transmitted through the projection optics to the screen and reflected back through the camera optics; and IR light is reflected inside the projection optics back to the camera. Both these forms of interference can be suppressed by locating an IR blocking filter before any such light reaches the camera, for example as shown by filter 518 or, alternatively, just before or just after colour wheel 504.
Continuing to refer to Figure 5a, notch filter 520 may be mounted on a mechanical actuator 522 so that the notch filter is switchable into and out of the optical path to sensor 260 under control of the system controller. This allows the camera to see the visible output from the projector when a calibration image is displayed.
Referring to Figure 5b, this shows an alternative arrangement of the optical components of Figure 5a, in which like elements are indicated by like reference numerals. In the arrangement of Figure 5b the aspheric intermediate optics are duplicated 512a, b, which enables optics 512b to be optimised for distortion correction at the infrared wavelength used by the touch sensing system. By contrast, in the arrangement of Figure 5a the optics 510, 512 are preferably optimised for visible wavelengths, since a small amount of distortion in the touch sensing system is generally tolerable.
As illustrated schematically by arrow 524 in Figures 5a and 5b, it can be advantageous to defocus the relay optics 516 slightly so that the image on sensor 260 is defocused to reduce problems which can otherwise arise from laser speckle. Such defocus enables improved detection of small touch objects. In embodiments the relay optics 516 may be modified to add defocus only on the vertical axis of the sensor (the vertical axis in Figure 4a).
Figure 5c illustrates an example Bayer-type spatial filter 530 which may be located directly in front of camera sensor 260 so that some pixels of the sensor see visible light and some IR light. As previously mentioned, if this is done, filter 530 may be combined with an anti-aliasing filter for improved touch detection. Such an anti-aliasing filter may comprise, for example, a pair of layers of birefringent material.
Continuing to refer to the optical configuration and image capture, as previously mentioned the projector may itself be a source of light interference, because the camera is directed towards the image display surface (and because, where the camera shares optics with the projector, there can be other routes for light from the projector to reach the camera). This can cause difficulties, for example, in background subtraction because the light output from the projector varies for several reasons: the projected image varies; the red, green and blue levels may vary even for a fixed image, and in general pass through the filters to the camera in different (small) amounts; and the projector's imaging panel may be a binary device such as a DMD which switches very rapidly within each frame.
These problems can be ameliorated by synchronising the capture of the touch sense image with operation of the projector. For example the camera may be triggered by a
signal which is referenced to the position of the colour wheel (for example derived from the colour wheel or the projector controller). Alternatively the image capture rate of the touch sense camera may be arranged to be substantially different to the rate at which the level of interference from the projected image varies. In this case the interference effectively beats at a known difference frequency, which can then be used to reject this light component by digital filtering. Additionally or alternatively, irrespective of whether the previously described techniques are employed, the system may incorporate feedback, providing a signal related to the amount of light in the image displayed by the projector, to the touch system. The touch system may then apply light interference compensation dependent on a level of this signal.
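The feedback-based compensation just described can be sketched as a per-frame subtraction of an interference estimate scaled from the projector's reported brightness signal. The linear model, the constant k, and the function name are illustrative assumptions, not the specified implementation.

```python
def compensate_interference(touch_frame, projector_brightness, k):
    """Subtract an interference estimate, assumed proportional to the
    projector's reported image brightness, from each touch-sense pixel.
    k is a calibration constant relating brightness to leaked IR counts;
    results are clamped at zero."""
    est = k * projector_brightness
    return [[max(p - est, 0) for p in row] for row in touch_frame]
```

In practice k would be determined at calibration, for example by capturing touch frames with no object present while stepping the projector brightness.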
Referring now to Figure 5d, this shows a system similar to that illustrated in Figure 3a, but with further details of the calibration processing and control system. Thus the system controller incorporates a calibration control module 502 which is able to control the image projector 118 to display a calibration image. In the illustrated embodiment controller 502 also receives a synchronisation input from the projector 118 to enable touch sense image capture to be synchronised to the projector. Optionally, in a system where the projector is able to project an IR image for calibration, controller 502 may suppress projection of the sheet of light during this interval.
A captured calibration image is processed for ambient light suppression and general initial filtering in the usual way and is then provided to a position calibration module 504, which determines the positions of the reference points in the displayed calibration image and is thus able to precisely locate the displayed image and map identified touch positions to corresponding positions within the displayed image. Thus position calibration module 504 provides output data to the object location detection module 314 so that, if desired, this module is able to output position data referenced to a displayed image.
It will be appreciated that for the touch sensing system to work a user need not actually touch the displayed image. The plane or fan of light is preferably invisible, for example in the infrared, but this is not essential - ultraviolet or visible light may alternatively be used. Although in general the plane or fan of light will be adjacent to the displayed image, this is also not essential and, in principle, the projected image could be at some distance beyond the touch sensing surface. The skilled person will appreciate that
whilst a relatively thin, flat sheet of light is desirable this is not essential, and some tilting and/or divergence or spreading of the beam may be acceptable with some loss of precision. Alternatively some convergence of the beam towards the far edge of the display area may be helpful in at least partially compensating for the reduction in brightness of the touch sensor illumination as the light fans out. Further, in embodiments the light defining the touch sheet need not be light defining a continuous plane - instead structured light, such as a comb or fan of individual beams and/or one or more scanned light beams, may be employed to define the touch sheet.
The techniques we have described are particularly useful for implementing an interactive whiteboard, although they also have advantages in smaller scale touch sensitive displays. No doubt many other effective alternatives will occur to the skilled person. It will be understood that the invention is not limited to the described embodiments and encompasses modifications apparent to those skilled in the art lying within the spirit and scope of the claims appended hereto.

Claims (21)

CLAIMS:
1. A touch sensitive image display device, the device comprising:
an image projector to project a displayed image onto a display surface;
a touch sensor light source to project light defining a touch sheet above said displayed image;
a camera directed to capture a touch sense image from a region including at least a portion of said touch sheet, said touch sense image comprising light scattered from said touch sheet by an object approaching said displayed image; and a signal processor coupled to said camera, to process a said touch sense image from said camera to identify a location of said object relative to said displayed image;
wherein said camera is further able to capture an image projected by said image projector;
wherein said image projector is configured to project a calibration image; and wherein said touch sensitive image display device further comprises a calibration module configured to use a camera image, captured by said camera, of said calibration image to calibrate locations in said captured touch sense image with reference to said displayed image.
2. A touch sensitive image display device as claimed in claim 1 wherein said camera has a controllable wavelength-dependent sensitivity, and wherein said calibration module is configured to control said wavelength-dependent sensitivity between a first wavelength-dependent sensitivity for which said camera is sensitive to said projected light defining said touch sheet and rejects light from said displayed image, and a second wavelength-dependent sensitivity for which said camera is sensitive to light from said displayed image.
3. A touch sensitive image display device as claimed in claim 2 comprising a controllable optical notch filter and a controller to apply said notch filter to said camera
for said first wavelength-dependent sensitivity and to remove said notch filter from said camera for said second wavelength-dependent sensitivity.
4. A touch sensitive image display device as claimed in claim 1 wherein said light defining said touch sheet comprises light of a non-visible wavelength, wherein said
camera has a wavelength-selective filter to preferentially pass light of said non-visible
wavelength and reject light from said displayed image, and wherein said projector is configured to project said calibration image using light of a non-visible wavelength within a passband of said wavelength-selective filter.
5. A touch sensitive image display device as claimed in claim 1 wherein said light defining said touch sheet of light comprises light of a non-visible wavelength, and wherein said camera has a spatially patterned wavelength-selection filter, wherein said spatially patterned wavelength-selection filter is configured to preferentially pass light of said non-visible wavelength and reject light from said displayed image for selected spatial regions of said camera image.
6. A touch sensitive image display device as claimed in claim 5 further comprising an anti-aliasing filter for said wavelength-selective filter.
7. A touch sensitive image display device as claimed in any preceding claim wherein said camera and said image projector share at least part of front-end image projection/capture optics of the device.
8. A method of calibrating a touch sensitive image display device, the method comprising displaying an image by:
projecting a displayed image onto a surface using an image projector;
projecting IR light defining a touch sheet above said displayed image;
capturing a touch sense image from a region including at least a portion of said touch sheet, said touch sense image comprising light scattered from said touch sheet by an object approaching said displayed image, using a camera with an IR filter to admit said scattered light and reject light from said displayed image; and processing said touch sense image to identify a location of said object relative to said displayed image; the method further comprising:
projecting a calibration image using said image projector; capturing said calibration image using said camera; and calibrating said location of said object with reference to said displayed image using said captured calibration image.
9. A method as claimed in claim 8 wherein said capturing of said calibration image comprises controlling said IR filter to modify a wavelength sensitivity of said camera.
10. A method as claimed in claim 8 comprising projecting said calibration image using IR wavelength light.
11. A method as claimed in claim 8 comprising spatially patterning said IR filter to enable said camera to detect both said scattered light and said displayed image at different locations within a captured image.
12. A touch sensitive image display device, the device comprising: an image projector to project a displayed image onto a surface;
a touch sensor light source to project light defining a touch sheet above said displayed image;
a camera directed to capture a touch sense image from a region including at least a portion of said touch sheet, said touch sense image comprising light scattered from said touch sheet by an object approaching said displayed image; and a signal processor coupled to said camera, to process a said touch sense image from said camera to identify a location of said object relative to said displayed image;
wherein said image projector is configured to project said displayed image as a set of sequential sub-frames, at a sub-frame rate, wherein said sub-frames combine to give the visual impression of said displayed image; and wherein capture of said touch sense images is synchronised to said sub-frame projection.
13. A touch sensitive image display device as claimed in claim 12 wherein said sequential projection of said sub-frames includes blanking intervals between at least some of said sub-frames; and wherein capture of said touch sense images is synchronised to said blanking intervals.
14. A touch sensitive image display device as claimed in claim 12 or 13 wherein said image projector comprises a digital micromirror imaging device illuminated via a spinning colour wheel, and wherein said image capture is triggered responsive to a rotational position of said colour wheel.
15. A touch sensitive image display device, the device comprising:
an image projector to project a displayed image onto a surface;
a touch sensor light source to project light defining a touch sheet above said displayed image;
a camera directed to capture a touch sense image from a region including at least a portion of said touch sheet, said touch sense image comprising light scattered from said touch sheet by an object approaching said displayed image; and a signal processor coupled to said camera, to process a said touch sense image from said camera to identify a location of said object relative to said displayed image;
wherein said image projector is configured to project said displayed image as a set of sequential sub-frames, at a sub-frame rate, wherein said sub-frames combine to give the visual impression of said displayed image; and wherein capture of said touch sense images operates at a frequency different by a factor of at least ten from said sub-frame rate.
16. A touch sensitive image display device, the device comprising:
an image projector to project a displayed image onto a surface;
a touch sensor light source to project light defining a touch sheet above said displayed image;
a camera directed to capture a touch sense image from a region including at least a portion of said touch sheet, said touch sense image comprising light scattered from said touch sheet by an object approaching said displayed image; and a signal processor coupled to said camera, to process a said touch sense image from said camera to identify a location of said object relative to said displayed
image;
wherein said image projector is configured to project onto said surface at an acute angle and comprises an imaging device illuminated by a display light source, and distortion correction optics between said imaging device and a light projection output of said image projector;
wherein said image capture optics are configured to capture said touch sense image from an acute angle relative to said touch sheet; and wherein an optical path between said imaging device and said distortion correction optics includes a dichroic beam splitter to optically couple said camera into a shared optical path for both said projector and said camera through said distortion
correction optics to said light projection output.
17. A touch sensitive image display device as claimed in claim 16 wherein said light defining said touch sheet comprises monochromatic IR light; wherein said optical path further comprises an IR reject filter between said imaging device and said dichroic
beam splitter; and wherein an optical path between said dichroic beam splitter and said camera comprises an IR transmit notch filter at a wavelength of said monochromatic IR light.
18. A touch sensitive image display device as claimed in claim 17 further comprising relay optics between said dichroic beam splitter and said camera, wherein said distortion correction optics have focus optimised for a wavelength in the range 400nm to 700nm, and wherein said relay optics are optimised for said wavelength of said monochromatic IR light.
19. A touch sensitive image display device as claimed in claims 16, 17 or 18 wherein an image of said scattered light on an image sensor of said camera is defocused.
20. A touch sensitive image display device as claimed in claims 16, 17, 18 or 19 comprising duplicated intermediate optics, wherein a first set of said intermediate optics is located in said optical path between said imaging device and said distortion correction optics, wherein a second set of said intermediate optics is located between said dichroic beam splitter and said camera; and wherein said first and second sets of intermediate optics are optimised for different optical wavelengths.
21. A touch sensitive image display device as claimed in any one of claims 16 to 20 wherein said imaging device is a digital micromirror device (DMD).
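Claim 16's signal processor must identify the touch location from the camera's image of scattered IR light. A generic way to do this (a hedged sketch, not necessarily the patented method) is an intensity centroid of the above-threshold pixels, which is tolerant of the deliberate defocus recited in claim 19:

```python
# Hedged sketch: locate the touching object as the intensity centroid
# of the light it scatters from the IR touch sheet. All values below
# (threshold, frame contents) are hypothetical.

def touch_centroid(frame, threshold=50):
    """Return the (x, y) intensity centroid of above-threshold pixels
    in a 2D list-of-lists camera frame, or None if nothing scatters."""
    total = sx = sy = 0.0
    for y, row in enumerate(frame):
        for x, value in enumerate(row):
            if value > threshold:
                total += value
                sx += x * value
                sy += y * value
    if total == 0.0:
        return None  # no object intersects the touch sheet
    return (sx / total, sy / total)

# Synthetic 8x8 frame: a defocused blob of scattered light near (5, 2).
frame = [[0] * 8 for _ in range(8)]
frame[2][5] = 200
frame[2][4] = frame[2][6] = frame[1][5] = frame[3][5] = 100
print(touch_centroid(frame))  # -> (5.0, 2.0)
```

The centroid is reported in camera-sensor coordinates; a real device would still need to map it into displayed-image coordinates, e.g. through a calibration of the acute-angle camera geometry.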
GB1200968.4A 2012-01-20 2012-01-20 Touch-sensitive image display devices Withdrawn GB2499979A (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
GB1200968.4A GB2499979A (en) 2012-01-20 2012-01-20 Touch-sensitive image display devices
PCT/GB2013/050104 WO2013108032A1 (en) 2012-01-20 2013-01-17 Touch sensitive image display devices
US14/369,085 US20140362052A1 (en) 2012-01-20 2013-01-17 Touch Sensitive Image Display Devices

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB1200968.4A GB2499979A (en) 2012-01-20 2012-01-20 Touch-sensitive image display devices

Publications (2)

Publication Number Publication Date
GB201200968D0 GB201200968D0 (en) 2012-03-07
GB2499979A true GB2499979A (en) 2013-09-11

Family

ID=45840737

Family Applications (1)

Application Number Title Priority Date Filing Date
GB1200968.4A Withdrawn GB2499979A (en) 2012-01-20 2012-01-20 Touch-sensitive image display devices

Country Status (1)

Country Link
GB (1) GB2499979A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3667412B1 (en) * 2013-07-30 2023-11-01 Dolby Laboratories Licensing Corporation Projector display systems having non-mechanical mirror beam steering

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050168448A1 (en) * 2004-01-30 2005-08-04 Simpson Zachary B. Interactive touch-screen using infrared illuminators
EP1607853A2 (en) * 2004-06-16 2005-12-21 Microsoft Corporation Calibration of an interactive display system
US20090116742A1 (en) * 2007-11-01 2009-05-07 H Keith Nishihara Calibration of a Gesture Recognition Interface System
US20100039379A1 (en) * 2008-08-15 2010-02-18 Gesturetek Inc. Enhanced Multi-Touch Detection
US20100079385A1 (en) * 2008-09-29 2010-04-01 Smart Technologies Ulc Method for calibrating an interactive input system and interactive input system executing the calibration method
US20110243380A1 (en) * 2010-04-01 2011-10-06 Qualcomm Incorporated Computing device interface

Also Published As

Publication number Publication date
GB201200968D0 (en) 2012-03-07

Similar Documents

Publication Publication Date Title
WO2013108032A1 (en) Touch sensitive image display devices
US9524061B2 (en) Touch-sensitive display devices
US8947402B2 (en) Touch sensitive image display
WO2013144599A2 (en) Touch sensing systems
CN106716318B (en) Projection display unit and function control method
EP2721468B1 (en) Touch-sensitive display devices
EP3176636B1 (en) Projection-type display device
Hirsch et al. BiDi screen: a thin, depth-sensing LCD for 3D interaction using light fields
US8690340B2 (en) Combined image projection and capture system using on and off state positions of spatial light modulator
WO2013108031A2 (en) Touch sensitive image display devices
JP2013120586A (en) Projector
US20140247249A1 (en) Touch Sensitive Display Devices
US10013116B2 (en) Projection display unit
GB2499979A (en) Touch-sensitive image display devices
US10521054B2 (en) Projection display unit
JP6807286B2 (en) Imaging device and imaging method
JP7125561B2 (en) Control device, projection system, control method, control program
JP5550111B2 (en) Imaging apparatus, imaging method, and program
WO2012172360A2 (en) Touch-sensitive display devices
JP6969606B2 (en) Projector with detection function
JP2020005049A (en) Projection system, projector, control method for projection system, and control method for projector
US20200045274A1 (en) Split aperture projector/camera

Legal Events

Date Code Title Description
732E Amendments to the register in respect of changes of name or changes affecting rights (sect. 32/1977)

Free format text: REGISTERED BETWEEN 20160218 AND 20160224

WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)