GB2536604A - Touch sensing systems - Google Patents

Touch sensing systems

Info

Publication number
GB2536604A
GB2536604A
Authority
GB
United Kingdom
Prior art keywords
touch
sheet
marker
touch sheet
image
Prior art date
Legal status
Withdrawn
Application number
GB1420272.5A
Other versions
GB201420272D0 (en)
Inventor
Smith Euan
Routley Paul
Thornton Shane
Current Assignee
Promethean Ltd
Original Assignee
Promethean Ltd
Priority date
Filing date
Publication date
Application filed by Promethean Ltd filed Critical Promethean Ltd
Priority to GB1420272.5A priority Critical patent/GB2536604A/en
Publication of GB201420272D0 publication Critical patent/GB201420272D0/en
Publication of GB2536604A publication Critical patent/GB2536604A/en


Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 - Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041 - Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0416 - Control or interface arrangements specially adapted for digitisers
    • G06F3/0418 - Control or interface arrangements specially adapted for digitisers for error correction or compensation, e.g. based on parallax, calibration or alignment
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 - Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041 - Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/042 - Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
    • G06F3/0421 - Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means by interrupting or reflecting a light beam, e.g. optical touch-screen
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00 - Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/041 - Indexing scheme relating to G06F3/041 - G06F3/045
    • G06F2203/04109 - FTIR in optical digitiser, i.e. touch detection by frustrating the total internal reflection within an optical waveguide due to changes of optical properties or deformation at the touch location

Abstract

One method of facilitating alignment of a touch sheet to a touch surface in a touch sensing system is described. The touch sensing system includes a touch sheet projector to project light defining a touch sheet 1 above a touch surface, and a sensor, e.g. camera 112, to capture a touch image of light scattered from the touch sheet by an object approaching the touch surface. The method includes: placing a marker on the touch surface to intersect the touch sheet; capturing a marker image comprising the marker; determining a location of the marker with respect to the touch surface from the marker image; capturing a touch image of light scattered from the touch sheet by the marker; determining touch sheet height data from the marker image and the touch image, the touch sheet height data including data representing a height of the touch sheet above the touch surface at the location of the marker; and outputting a touch sheet alignment signal dependent on the touch sheet height data. Other methods are also described, concerning: using an alignment device with particular lateral dimensions; determining the height at a plurality of positions to obtain a touch sheet equation; determining the height with reference to light scattered from the marker and to such scattered light reflected from the surface; an active marker; a method of calibration; and displaying an alignment indication to a user.

Description

Touch sensing systems
FIELD OF THE INVENTION
This invention relates to methods, apparatus, and computer program code for aligning and calibrating touch sensing systems of the type which project light defining a touch sheet just above a surface or display.
BACKGROUND TO THE INVENTION
We have previously described a range of improvements in touch sensing systems of the type which project light defining a touch sheet just above a surface or display, for example to provide an electronic whiteboard. An example of such a system is described in WO2014/049331.
For a good user experience it is generally desirable that the touch sheet is as close as practicable to the whiteboard/display surface. However the infrared light of the touch sheet is invisible to a user, thus complicating the alignment of the system. One simple approach to alignment is to employ an infrared viewer. For example where the system is mounted at the top of a whiteboard, the touch sheet defines a line on the floor which can be seen in the viewer and used to bring the touch sheet towards the surface. However it would be preferable to enable alignment of the system without such a viewer.
A position detection apparatus which can be used for alignment is described in US 2014/0218341, but in practice this system is difficult to use and misalignment is easy. Further, although it is convenient to think of the touch sheet as a substantially planar sheet of light, in practice it is not entirely flat, having more the shape of a cone with a very large radius of curvature; typically the thickness of the sheet also increases slightly away from the laser, thus giving rise to a very slight wedge-shape for the sheet. For simplicity we will nonetheless, on occasion, refer to the sheet as "planar" but this is to be understood as shorthand for the actual sheet configuration.
It is generally desirable to provide a user with improved alignment (and calibration) techniques which address these difficulties.
SUMMARY OF THE INVENTION
According to a first aspect of the invention there is therefore provided a method of facilitating alignment of a touch sheet to a touch surface in a touch sensing system, the touch sensing system comprising: a touch sheet projector to project light defining a touch sheet above a touch surface; and a camera to capture a touch image of light scattered from said touch sheet by an object approaching said touch surface; the method comprising: placing a marker on said touch surface to intersect said touch sheet; capturing a marker image comprising said marker; determining a location of said marker with respect to said touch surface from said marker image; capturing a touch image of light scattered from said touch sheet by said marker; determining touch sheet height data from said marker image and said touch image, said touch sheet height data comprising data representing a height of said touch sheet above said touch surface at said location of said marker; and outputting a touch sheet alignment signal dependent on said touch sheet height data.
Broadly speaking embodiments of the method measure the height of the touch sheet above the touch/display or other surface, so that this can be used to align the touch sheet to be substantially parallel to the surface and, preferably, as close to the surface as possible. In embodiments a line generator assembly generating the touch sheet may be mounted on, for example, a three-point mount, so that both the height and the angle of the touch sheet can be adjusted. This facilitates optimising the alignment of the touch sheet over the whole touch region (defined, say, by a whiteboard or underlying display). However in a simplified approach only a lower "edge" of the touch sheet may be aligned to the surface.
This is complicated by the acute angle view of the touch sensing camera - in effect there is an ambiguity in height since, to the camera, a greater distance of the marker from the camera and a greater height of the touch sheet (scattered light) above the touch surface both result in the scattered light moving upwards in the view of the camera (or downwards, depending upon the disposition of the camera). Thus in embodiments, to determine the height (in the z-direction), the position of the marker is determined (in the x-y plane), preferably after calibration since the height is very sensitive to (x,y) error.
Thus preferably the method further comprises calibrating the touch sensing system to map a captured image space of the camera to an (x,y) position space on the touch surface. Preferably then a common coordinate system is employed for determining a difference in position between the location of the marker (in the visible, marker image) and a location of the light scattered by the marker in the touch image. This difference in position may be used directly in an alignment system or, more preferably, it may be converted to a height of the touch sheet at the (x,y) position of the marker above the touch surface. This procedure is facilitated by employing the same camera for capturing the visible marker image and the touch image. Thus the touch camera may be provided with an infrared filter (a long-wavelength pass or bandpass filter) which may be moved out of and into position in front of the touch camera for capturing the marker and touch images respectively. This may be achieved with a mechanical actuator or motor; optionally a focus correction may be made. Alternatively, however, since cameras are relatively inexpensive, multiple cameras may be employed.
In preferred embodiments of the method one or more markers are placed at a plurality of different locations on the touch surface, either sequentially or concurrently, so that data determining the location of the touch sheet may be established. For convenience the touch sheet may be assumed to be a plane, the location of which may be established at three different positions of the marker(s), though preferably 4, 5 or more different locations are used to over-constrain this plane (which in practice is not precisely planar).
In embodiments a supposed plane of the touch sheet is defined by a touch sheet equation; this may be the equation of a plane (for example defined by a normal to the plane and a point within the plane) or it may be the equation of a line. The equation of a line may suffice since in general, because of the geometry of the touch system as shown in Figure 1b, a line along a light ray from the (laser) light source may be sufficient to define the plane for alignment purposes. Alternatively a more complex equation describing a curved surface (or portion thereof) may be employed.
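By way of illustration only (not part of the original disclosure), a least-squares fit of a plane h(x, y) = a·x + b·y + c to height measurements at several marker positions gives one possible form of touch sheet equation; the coordinate frame, units and example values below are assumptions.

```python
# Sketch: least-squares fit of a touch sheet "plane equation" h(x, y) = a*x + b*y + c
# from height measurements at several marker positions on the touch surface.
import numpy as np

def fit_touch_sheet_plane(marker_xy, heights):
    """marker_xy: (N, 2) array of marker positions on the touch surface (metres).
    heights:   (N,) array of measured touch sheet heights at those positions (metres).
    Returns (a, b, c) such that h = a*x + b*y + c in a least-squares sense."""
    xy = np.asarray(marker_xy, dtype=float)
    h = np.asarray(heights, dtype=float)
    A = np.column_stack([xy[:, 0], xy[:, 1], np.ones(len(h))])
    coeffs, *_ = np.linalg.lstsq(A, h, rcond=None)
    return coeffs  # a, b, c

# Example: four markers over-constraining the plane (illustrative values only).
xy = [(0.2, 0.3), (1.8, 0.3), (0.2, 1.0), (1.8, 1.0)]
h = [0.004, 0.006, 0.009, 0.011]
a, b, c = fit_touch_sheet_plane(xy, h)
print(f"touch sheet: h = {a:.4f}*x + {b:.4f}*y + {c:.4f}")
```

With three marker positions the plane is exactly determined; four or more over-constrain the fit, as suggested above.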
Once the location of the touch sheet has been defined, embodiments of the method then provide a touch sheet alignment signal to the user. This may either be a good/bad alignment signal for one or more degrees of freedom of motion of the touch sheet, or it may be a signal indicating a degree of divergence from a target alignment for the touch sheet, again in one or more degrees of freedom of motion of adjustment of the touch sheet. Such a touch sheet alignment signal may be determined dependent upon a difference between the touch sheet equation and a target touch sheet location. The target touch sheet location may comprise an acceptable range of target touch sheet locations; these may be defined in terms of one or more distances of the touch sheet from the touch surface and/or one or more angles which the touch sheet makes with respect to the touch surface. The target touch sheet location may optionally also be defined by a line, plane or other equation. The touch sheet alignment signal may be displayed, for example, on the touch sensing apparatus and/or on the screen of a computer system associated with the touch sensing/display system and/or on a mobile device such as a smartphone, for example connected to the touch sensing system via an RF (e.g. Bluetooth™) link.
In embodiments the touch sheet projector has a set of adjusters to adjust a position of the touch sheet. For example the touch sheet projector assembly, whether comprising one or more lasers, may be mounted using a set of three screws allowing adjustment of the touch sheet in height, "pitch" and "tilt/roll". The touch adjustment signal may then comprise a set of digital or analogue meter-type displays, one for each adjuster (screw), indicating a direction and/or degree of adjustment to be made to bring the touch sheet towards the target sheet location. In embodiments the target sheet location may be, for example, a location less than 10mm above the display surface and generally parallel to the display surface.
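To illustrate how such meter readouts might be derived from a fitted touch sheet plane, the sketch below converts the plane coefficients into signed height, roll and pitch deviations relative to a target; the board centre, target height and the association of coefficients with particular readouts are assumptions rather than details taken from the text.

```python
# Sketch: derive three "meter" values (height, roll, pitch relative to a target)
# from a fitted touch sheet plane h = a*x + b*y + c. Parameter values are assumed.
def meter_values(a, b, c, board_centre=(1.0, 0.6), target_height=0.003):
    """Returns signed deviations; each should read zero when the sheet is aligned."""
    x0, y0 = board_centre
    height_dev = (a * x0 + b * y0 + c) - target_height  # sheet height error at the centre
    roll_dev = a    # left-right slope of the sheet relative to the surface
    pitch_dev = b   # top-bottom slope of the sheet relative to the surface
    return {"height": height_dev, "roll": roll_dev, "pitch": pitch_dev}

print(meter_values(0.001, 0.004, 0.003))
```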
Thus in a related aspect the invention provides a method of facilitating alignment of a touch sheet to a touch surface in a touch sensing system, the touch sensing system comprising: a touch sheet projector to project light defining a touch sheet above a touch surface; and a camera to capture a touch image of light scattered from said touch sheet by an object approaching said touch surface; the method comprising: placing a marker on said touch surface to intersect said touch sheet; capturing a touch image of light scattered from said touch sheet by said marker; determining touch sheet height data from said touch image, said touch sheet height data representing a height of said touch sheet above said touch surface at a location of said marker; and providing to a user a touch sheet alignment display comprising a set of one or more meters, one for each of a respective position adjuster of a mount of said touch sheet projector; wherein said set of meters displays data derived from said touch sheet height data, indicating to said user a direction or degree of adjustment of said position adjuster to adjust a location of said touch sheet towards a target position.
In a related aspect the invention provides a method of facilitating alignment of a touch sheet to a touch surface in a touch sensing system, the touch sensing system comprising: a touch sheet projector to project light defining a touch sheet above a touch surface; the method comprising: placing a marker on said touch surface to intersect said touch sheet; sensing a location of said marker; using said sensed location to determine a height of said touch sheet above said touch surface at said sensed location; and providing a signal for aligning said touch sheet to said touch surface using said determined height.
The invention further provides a touch sensing system comprising a touch sheet projector to project light defining a touch sheet above a touch surface; a sensor for sensing a location of a marker placed on said touch surface to intersect said touch sheet; and means for using said sensed location to determine a height of said touch sheet above said touch surface at said sensed location, and providing a signal for aligning said touch sheet to said touch surface using said determined height.
The invention further provides a method of facilitating alignment of a touch sheet to a touch surface in a touch sensing system, the touch sensing system comprising: a touch sheet projector to project light defining a touch sheet above a touch surface; and a camera to capture a touch image of light scattered from said touch sheet by an object approaching said touch surface; the method comprising: placing a marker on said touch surface to intersect said touch sheet; capturing a touch image of light scattered from said touch sheet by said marker; determining touch sheet height data from said touch image, said touch sheet height data representing a height of said touch sheet above said touch surface at a location of said marker; determining, from said touch sheet height data at a plurality of different locations, a touch sheet equation defining said touch sheet or a portion thereof with respect to said touch surface; determining a difference between said touch sheet or portion thereof defined by said touch sheet equation and a target touch sheet location defined with respect to said touch surface; and outputting a touch sheet alignment signal dependent upon said difference between said touch sheet or portion thereof defined by said touch sheet equation and said target sheet location.
In conjunction with or independently from the above described methods, a marker may be employed which is configured so that when the marker is placed on the touch surface the height of the touch sheet produces a characteristic variation in the light scattered by the marker. For example, the marker or a reflective or scattering surface on the marker may be tapered either towards or away from the touch surface on which it is placed, so that the higher up the marker the touch sheet intersects, the longer (or shorter) the length of a line or pattern of light scattered by the marker.
In preferred embodiments of this approach a lateral dimension of the marker or a light scattering part of the marker changes substantially linearly with distance away from the touch surface, but it will be appreciated that other height-encoding shapes or configurations may be employed. For example a stepped scattering surface may indicate which of a plurality of height ranges the light sheet falls within by providing a discrete set of different length lines of scattered light. Other variations will be apparent to those skilled in the art.
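As a rough illustration of both variants (the marker dimensions and band widths below are assumed, not specified here), the lit line length can be decoded into a height for a linearly tapered marker, or into a discrete height band for a stepped marker:

```python
# Sketch: recover touch sheet height from the length of the scattered light line on a
# linearly tapered (triangular) marker whose apex rests on the touch surface.
def height_from_line_length(line_length_m, marker_height_m=0.05, marker_base_width_m=0.04):
    """For a triangle with its apex on the surface the lit width grows linearly with
    height: width(h) = base_width * h / marker_height, so h = marker_height * l / base_width."""
    return marker_height_m * line_length_m / marker_base_width_m

# Stepped marker variant: map a measured line length to a discrete height band.
def height_band(line_length_m, step_widths_m=(0.01, 0.02, 0.03), band_height_m=0.005):
    for i, w in enumerate(step_widths_m):
        if line_length_m <= w:
            return (i * band_height_m, (i + 1) * band_height_m)
    return (len(step_widths_m) * band_height_m, float("inf"))

print(height_from_line_length(0.016))   # ~0.02 m for the assumed marker geometry
print(height_band(0.015))               # second height band
```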
Thus the invention also provides a marker for facilitating alignment of a touch sheet to a touch surface in particular using a method as described above, where the marker is configured such that light scattering by the marker is dependent upon distance along the marker in a direction away from the touch surface.
Thus the invention also provides a method of aligning a touch sheet to a touch surface in a touch sensing system, the touch sensing system comprising an optical projector to project light defining a touch sheet above a touch surface; the method comprising: configuring an alignment device such that a lateral dimension of a light scattering property of the alignment device changes with distance away from the touch surface on which the alignment device is placed; placing said alignment device on said touch surface; detecting a location of said alignment device; determining a height of said touch sheet from an image of a line or pattern of light scattered from said alignment device; and providing feedback to a user on the height of said touch sheet at said detected location, for aligning said touch sheet.
The skilled person would appreciate that in all of the above described methods it is preferable but not essential for the system to include a display image projector to project a displayed image onto a touch surface.
The invention further provides processor control code to implement the above-described methods, for example on a general purpose computer system or on a digital signal processor (DSP). The code is provided on a non-transitory physical data carrier such as a disk, CD- or DVD-ROM, programmed memory such as non-volatile memory (e.g. Flash) or read-only memory (firmware). Code (and/or data) to implement embodiments of the invention may comprise source, object or executable code in a conventional programming language (interpreted or compiled) such as C, or assembly code, or code for a hardware description language. As the skilled person will appreciate, such code and/or data may be distributed between a plurality of coupled components in communication with one another.
The invention also provides a touch sensing system comprising: a touch sheet projector to project light defining a touch sheet above a touch surface; and a camera to capture a touch image of light scattered from said touch sheet by an object approaching said touch surface; the system further comprising a processor coupled to program memory storing processor control code to: capture a marker image of a marker placed on said touch surface to intersect said touch sheet; determine a location of said marker with respect to said touch surface from said marker image; capture a touch image of light scattered from said touch sheet by said marker; determine touch sheet height data from said marker image and said touch image, said touch sheet height data comprising data representing a height of said touch sheet above said touch surface at said location of said marker; and output a touch sheet alignment signal dependent on said touch sheet height data.
Broadly speaking embodiments of the apparatus are configured to identify a position of one or more markers from a visual image of the marker and then the apparatus, preferably using the same camera, captures an infrared image when the touch sheet is illuminated. The difference in marker position between these two images is then used to determine the height of the touch sheet above the touch surface, although in principle the position difference could be used to determine an adjustment signal for the touch sheet without converting to an explicit height of the touch sheet above the touch surface.
In embodiments the apparatus is arranged to work in a coordinate space of the displayed image, which is approximately the same as physical space on the touch surface (the two spaces are equivalent if anti-distortion is applied).
This coordinate frame is optimal as the touch position data will, in general, be referenced to the displayed image when the apparatus is in use and thus the preferred reference for the touch image is the projected displayed image space.
As previously described, in preferred embodiments the apparatus is calibrated to match the visible coordinate space of the camera to physical (x,y) coordinate space on the touch surface. There are many ways in which this may be achieved. For example in one approach respective sets of vertical and horizontal stripes are displayed on the touch surface by a display image projector, and the known position of these stripes (in projected, displayed image space) is used to map camera space to projected displayed image space. Preferably, but not essentially, a correction may also be applied because of the different focus and distortions of the touch camera in visible as compared with infrared light. This may be a standard, pre-computed correction for the touch camera/imaging system. This approach thus provides a mapping between the displayed calibration image and the touch image space.
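One possible sketch of such a calibration fit is given below, assuming stripe crossings have already been detected in the camera image and their displayed positions are known; the quadratic polynomial order and the function names are illustrative choices only.

```python
# Sketch: fit a mapping from camera pixel coordinates to displayed-image coordinates
# from detected stripe crossings whose display positions are known, using a low-order
# bivariate polynomial per output axis (requires at least 6 correspondences here).
import numpy as np

def poly_terms(u, v):
    # Terms of a bivariate quadratic; the polynomial order is an assumption.
    return np.column_stack([np.ones_like(u), u, v, u * v, u**2, v**2])

def fit_camera_to_display(cam_uv, disp_xy):
    """cam_uv: (N, 2) detected crossing positions in camera pixels.
    disp_xy: (N, 2) known positions of the same crossings in display coordinates."""
    cam_uv = np.asarray(cam_uv, float)
    disp_xy = np.asarray(disp_xy, float)
    A = poly_terms(cam_uv[:, 0], cam_uv[:, 1])
    coeff_x, *_ = np.linalg.lstsq(A, disp_xy[:, 0], rcond=None)
    coeff_y, *_ = np.linalg.lstsq(A, disp_xy[:, 1], rcond=None)
    return coeff_x, coeff_y

def camera_to_display(u, v, coeff_x, coeff_y):
    # Term order must match poly_terms above.
    t = np.array([1.0, u, v, u * v, u * u, v * v])
    return float(t @ coeff_x), float(t @ coeff_y)
```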
Once the position of the touch sheet has been measured (using any convenient representation) this information may then be fed back to the user. More particularly this information may be used to provide an indication of which way the system should be adjusted to bring the touch sheet towards the touch surface and/or align the touch sheet parallel to the touch surface.
The touch sheet should preferably be close to the touch surface, but not so close that it is intersected by undulations or non-uniformities in the touch surface (which may be the surface of a whiteboard). Thus in embodiments of the described methods/apparatus optionally a check may be performed that the touch sheet does not intersect the touch surface, for example by looking for light scattered from the touch surface. In addition, in more sophisticated approaches one or more additional parameters of the touch sheet may be determined. For example a thickness of the touch sheet may be determined by determining a range of heights for the sheet from the degree to which the scattered light extends (in a direction corresponding to the z-direction from the touch surface) within the touch image.
In addition to or independently from the above described techniques another technique which may be employed for determining the height of a touch sheet is to measure the angular difference between light scattered by the marker and a reflection of light scattered by the marker in the touch (whiteboard) surface. This provides a relatively direct measure of the height of the touch sheet at the location of the marker (which location can be determined from the visible image as previously described).
Thus in a further aspect the invention provides a method of facilitating alignment of a touch sheet to a touch surface in a touch sensing system, the touch sensing system comprising: a touch sheet projector to project light defining a touch sheet above a touch surface; and a camera to capture a touch image of light scattered from said touch sheet by an object approaching said touch surface; the method comprising: placing a marker on said touch surface to intersect said touch sheet; capturing a touch image of light scattered from said touch sheet by said marker; and determining touch sheet height data from said touch image, said touch sheet height data representing a height of said touch sheet above said touch surface at a location of said marker; wherein said determining of said touch sheet height data comprises processing said touch image to identify a difference in position between said light from said touch sheet scattered by said marker and a reflection in said touch surface of said light scattered by said marker.
In a still further approach an active marker may be employed to, in effect, measure the height of the touch sheet above the touch surface by sensing the touch sheet and providing a corresponding signal. The signal may simply comprise an illuminated LED at the height of the touch sheet, or a signal may be transmitted from the marker to the touch sensing apparatus and/or to a smartphone or the like, for display of the height of the touch sheet. This height information may then be used as previously described.
Thus the invention further provides an active marker comprising: a power source; a ladder or array of optical sensors to sense the height of said touch sheet above said display surface; and a marker output to output a marker signal representing said height.
In a related aspect the invention provides a method of facilitating alignment of a touch sheet to a touch surface in a touch sensing system, the touch sensing system comprising a touch sheet projector to project light defining a touch sheet above a touch surface; the method comprising: placing a marker on said touch surface to intersect said touch sheet, wherein said marker is an active marker comprising a ladder or array of optical sensors to sense the height of said touch sheet above said display surface, and having a marker output to output a marker signal representing said height; and providing a user output to indicate said height to a user, wherein said user output is derived from or comprises said marker signal.
A marker may also be employed to calibrate for a difference in (x,y) position as seen by the touch camera as compared with the actual (x,y) position of an object. This difference arises because of the finite distance of the touch sheet above the touch surface, coupled with the acute viewing angle of the touch camera. Broadly speaking the visible marker image can be used to determine the actual marker position and the touch image the marker position as seen by the touch camera. The difference between these two positions can then be used to determine a correction to be applied to the sensed touch position. This correction is position-dependent since it is greater further from the camera where the camera "viewing" angle is more acute.
Thus in a further aspect the invention provides a method of calibrating a touch sensing system, the touch sensing system comprising: a touch sheet projector to project light defining a touch sheet above a touch surface; and a camera to capture a touch image of light scattered from said touch sheet by an object approaching said touch surface; the method comprising: placing a marker on said touch surface to intersect said touch sheet; capturing a touch image of light scattered from said touch sheet by said marker; processing said touch image to determine a touch-identified location of said marker with respect to said touch surface; and comparing said touch-identified location of said marker with a known or sensed location of said marker on said display surface to determine a calibration relating a position of said marker seen by said touch sensing system with an actual location of said marker.
BRIEF DESCRIPTION OF THE DRAWINGS
These and other aspects of the invention will now be further described, by way of example, with reference to the accompanying figures in which: Figures 1a and 1b show, respectively, a plan view and a side view of an interactive whiteboard incorporating a touch sensing system; Figure 2 shows a touch sensing system incorporating a touch sheet projector; Figure 3 shows an embodiment of a touch signal processing system;
DETAILED DESCRIPTION OF PREFERRED EMBODIMENTS
Touch sensing systems
Referring to Figure 1, this shows plan and side views of an example interactive whiteboard touch sensitive image display device 100 incorporating such a system. In the illustrated example a touch sheet projector 110 comprises three IR fan sources 102, 104, 106, each providing a respective light fan 102a, 104a, 106a spanning approximately 120°, together defining a substantially continuous sheet of light 101 just above display area 120. The fans overlap on the display area (which is economical as shadowing is most likely in the central region of the display area). In embodiments a common laser may be employed to generate multiple portions of the touch sheet, that is, a common laser may be employed to generate two or more light fans making up the light sheet.
Typically such a display area 120 may be of order 1m by 2m. The side view of the system in Figure 1b illustrates, in this example, a display image projector 122 and a touch image capture camera 112, either aligned side-by-side or sharing a portion of the display projection optics. In the example the optical path between the projector/camera and display area is folded by a mirror 124. The sheet of light generated by fans 102a, 104a, 106a is preferably close to the display area, for example less than 1cm or 0.5cm above the display area. However the camera and display projector 112, 122 are supported on a support 126 and may project/view light from a distance of up to around 0.5m from the display area. The display image projector is optional; any suitable display image projector may be employed, for example a digital micromirror-based display image projector.
In embodiments, broadly speaking, the touch sheet projector 110 is configured to project a sheet of infrared light just above the surface of the displayed image (or whiteboard), for example at a distance of one to a few millimetres above this surface. Figure 2 shows a simple embodiment of a touch sensing system incorporating such a touch sheet projector 110. Thus the touch sheet projector 110 may comprise an IR LED or laser 252, collimated then expanded in one direction by light sheet optics 254 such as a cylindrical lens; the laser/LED output may optionally be split into n paths to generate n sheets of light. A CMOS imaging sensor (touch camera) 260 is provided with an IR-pass lens 258 and captures light scattered from the sheet of infrared light 101 when the touch sheet is touched by an object such as a finger (the boundaries of the CMOS imaging sensor field of view are indicated by lines 257a, 257b). The touch camera 260 provides an output to a touch signal processing system.
Referring now to Figure 3, this shows an example of a touch signal processing system 300 for use with the touch sensing system. Like elements to those previously described are indicated by like reference numerals.
In the illustrated example of Figure 3 a controller 320 controls the IR laser on and off, controls the acquisition of images by camera 112 and controls projector 122. In this example images are captured with the IR laser on and off in alternate frames and touch detection is then performed on the difference of these frames to subtract out any ambient infra-red. The touch camera preferably also includes a notch filter at the laser wavelength, which may be around 780-950nm. Because of laser diode process variations and change of wavelength with temperature this notch may be relatively wide, for example of order 20nm, and thus it is desirable to suppress ambient IR.
As described later, in embodiments of the invention the IR-pass filter may be switched in and out of the optical path to the camera 112, to allow the camera to view visible wavelengths for capturing a marker image.
In the embodiment of Figure 3 subtraction is performed by module 302 which, in embodiments, is implemented in hardware (an FPGA). In embodiments module 302 also performs binning of the camera pixels, for example down to approximately 80 by 50 pixels. This helps reduce the subsequent processing power/memory requirements and is described in more detail later. However such binning is optional, depending upon the processing power available, and even where processing power/memory is limited there are other options, as described further later.
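A rough software illustration of the two operations attributed to module 302 follows (the sensor resolution, bin factors and data types below are assumptions; the described implementation is an FPGA):

```python
# Sketch: subtract the laser-off frame from the laser-on frame to suppress ambient IR,
# then bin the result down to roughly 80 x 50 pixels.
import numpy as np

def subtract_and_bin(frame_on, frame_off, bin_y=8, bin_x=8):
    """frame_on/frame_off: 2-D arrays captured with the IR laser on and off.
    Bin factors are assumptions; choose them so the output is ~80 x 50."""
    diff = np.clip(np.asarray(frame_on, np.int32) - np.asarray(frame_off, np.int32), 0, None)
    h, w = diff.shape
    h2, w2 = h - h % bin_y, w - w % bin_x    # crop to a multiple of the bin size
    return diff[:h2, :w2].reshape(h2 // bin_y, bin_y, w2 // bin_x, bin_x).sum(axis=(1, 3))

frame_on = np.random.randint(0, 1024, (400, 640))
frame_off = np.random.randint(0, 256, (400, 640))
print(subtract_and_bin(frame_on, frame_off).shape)   # (50, 80)
```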
Following the binning and subtraction the captured image data is loaded into a buffer 304 for subsequent processing to identify the position of a finger or, in a multi-touch system, fingers.
Because the camera 112 is directed down towards the plane of light at an angle it can be desirable to provide a greater exposure time for portions of the captured image further from the device than for those nearer the device. This can be achieved, for example, with a rolling shutter device, under control of controller 320 setting appropriate camera registers.
Depending upon the processing of the captured touch sense images and/or the brightness of the laser illumination system, differencing alternate frames may not be necessary (for example, where 'finger shape' is detected). However where subtraction takes place the camera should have a gamma of substantially unity so that subtraction is performed with a linear signal.
Various different techniques for locating candidate finger/object touch positions will be described. In the illustrated example, however, an approach is employed which detects intensity peaks in the image and then employs a centroid finder to locate candidate finger positions. In embodiments this is performed in software. Processor control code and/or data to implement the aforementioned FPGA and/or software modules may be provided in nonvolatile memory or on a disk 318.
In embodiments module 306 performs thresholding on a captured image and, in embodiments, this is also employed for image clipping or cropping to define a touch sensitive region. Optionally some image scaling may also be performed in this module. Then a crude peak locator 308 is applied to the thresholded image to identify, approximately, regions in which a finger/object is potentially present. Spots in the captured image provide a first estimation of the centre-of-mass; this is preferably used in conjunction with a differential approach to minimize noise, i.e. one frame laser on, next frame laser off.
A centroid locator 310 (centre of mass algorithm) is applied to the original (unthresholded) image in buffer 304 at each located peak, to determine a respective candidate finger/object location. The system then applies distortion correction 312 to compensate for keystone distortion of the captured touch sense image and also, optionally, any distortion, such as barrel or keystone distortion, from the lens of touch camera 112. Because nearer parts of a captured touch sense image may be brighter than further parts, the thresholding may be position sensitive; alternatively, position-sensitive scaling may be applied to the image in buffer 304 and a substantially uniform threshold may be applied.
In one embodiment of the crude peak locator 308 the procedure finds a connected region of the captured image by identifying the brightest block within a region (or a block with greater than a threshold brightness), and then locates the next brightest block, and so forth, preferably up to a distance limit (to avoid accidentally performing a flood fill). Centroid location is then performed on a connected region. In embodiments the pixel brightness/intensity values are not squared before the centroid location, to reduce the sensitivity of this technique to noise, interference and the like (which can cause movement of a detected centroid location by more than one pixel). A simple centre-of-mass calculation is sufficient for the purpose of finding a centroid in a given ROI (region of interest).
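The sketch below illustrates the general idea in simplified form: it claims a fixed region of interest around each bright pixel rather than growing a connected region up to a distance limit, and computes an unsquared centre of mass on the original intensities; the threshold and ROI size are assumptions.

```python
# Sketch: crude peak location by thresholding, then a centre-of-mass centroid on the
# original (unthresholded) intensities within an ROI around each peak.
import numpy as np

def find_touch_centroids(image, threshold, roi_half=4):
    """image: 2-D array (ambient-subtracted touch image).
    Returns a list of (row, col) candidate touch positions."""
    img = np.asarray(image, dtype=float)
    ys, xs = np.nonzero(img > threshold)
    visited = np.zeros(img.shape, dtype=bool)
    centroids = []
    # Visit bright pixels from brightest to dimmest, claiming an ROI around each peak.
    for i in np.argsort(img[ys, xs])[::-1]:
        y, x = ys[i], xs[i]
        if visited[y, x]:
            continue
        y0, y1 = max(0, y - roi_half), min(img.shape[0], y + roi_half + 1)
        x0, x1 = max(0, x - roi_half), min(img.shape[1], x + roi_half + 1)
        visited[y0:y1, x0:x1] = True
        roi = img[y0:y1, x0:x1]               # intensities are NOT squared (see text)
        total = roi.sum()
        if total <= 0:
            continue
        ry, rx = np.indices(roi.shape)
        centroids.append((y0 + (ry * roi).sum() / total, x0 + (rx * roi).sum() / total))
    return centroids
```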
In embodiments the distortion correction module 312 performs a distortion correction using a polynomial to map between the touch sense camera space and the displayed image space. Say the transformed coordinates from camera space (x, y) into projected space (x', y') are related by the bivariate polynomials x' = x_p Cx y_pᵀ and y' = x_p Cy y_pᵀ, where Cx and Cy represent polynomial coefficients in matrix form, and x_p and y_p are the vectorised powers of x and y respectively. Then we may design Cx and Cy such that we can assign a projected-space grid location (i.e. memory location) by evaluation of the polynomial: b = ⌊x'⌋ + X⌊y'⌋, where X is the number of grid locations in the x-direction in projector space, and ⌊·⌋ is the floor operator. The polynomial evaluation may be implemented, say, in Chebyshev form for better precision performance; the coefficients may be assigned at calibration. Further background can be found in our published PCT application WO2010/073024.
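An illustrative evaluation of this mapping is sketched below; the coefficient matrices and polynomial order are placeholders, and the Chebyshev-form evaluation mentioned above is not shown.

```python
# Sketch: evaluate x' = x_p Cx y_p^T, y' = x_p Cy y_p^T and the grid/bin index
# b = floor(x') + X*floor(y'), with x_p and y_p the vectorised powers of x and y.
import numpy as np

def map_point(x, y, Cx, Cy, X_grid):
    """Cx, Cy: (n+1, n+1) coefficient matrices assigned at calibration.
    x, y: camera-space coordinates; X_grid: grid width in projector space."""
    n = Cx.shape[0] - 1
    xv = np.array([x**i for i in range(n + 1)], dtype=float)   # vectorised powers of x
    yv = np.array([y**j for j in range(n + 1)], dtype=float)   # vectorised powers of y
    xp = xv @ Cx @ yv
    yp = xv @ Cy @ yv
    b = int(np.floor(xp)) + X_grid * int(np.floor(yp))
    return xp, yp, b

# Identity-like example with quadratic head-room (placeholder coefficients).
Cx = np.zeros((3, 3))
Cx[1, 0] = 1.0       # x' = x
Cy = np.zeros((3, 3))
Cy[0, 1] = 1.0       # y' = y
print(map_point(12.3, 4.7, Cx, Cy, X_grid=80))
```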
Once a set of candidate finger positions has been identified, these are passed to a module 314 which tracks finger/object positions and decodes actions, in particular to identity finger up/down or present/absent events. In embodiments this module also provides some position hysteresis, for example implemented using a digital filter, to reduce position jitter. In a single touch system module 314 need only decode a finger up/finger down state, but in a multi-touch system this module also allocates identifiers to the fingers/objects in the captured images and tracks the identified fingers/objects.
Alignment and calibration systems We will now describe techniques for aligning the touch sheet at 101 to the touch/display area 120, and some related calibration techniques. These may be used to align the touch sheet projector, also referred to as the line generator (LG) or line generator assembly to, for example, the surface of an interactive white board at installation of the system. We describe various different techniques which may be employed.
Referring to Figure 4a, this illustrates the principle of one approach in which height estimation is based upon the parallax of an object as seen from the known location of the camera 112. In Figure 4a an object 400 such as a marker or finger is seen by camera 112. Line 402 represents the location of the marker or point at which the finger is pointing as seen from the point of view of the camera when a visible image is captured, and line 404 represents the portion of the finger/marker lit by the infra-red touch sheet, also as seen by the camera, when the camera captures an IR image. The height of the touch sheet 101 is estimated based upon the difference between these two locations as seen by camera 112.
We assume, initially, that the camera has no distortion and no wavelength-dependent scaling (that is, the camera sees different wavelengths at the same number of pixels per metre). Although the calculations below are based upon these assumptions, in practice it is straightforward to incorporate distortion/scaling correction by applying such a correction to an image captured from the camera. We also assume that the touch sheet is substantially planar, but again the techniques we describe may be extended to non-planar cases.
Referring to Figure 4b, this shows the geometry of Figure 4a, in particular illustrating the height H of the camera above display surface 120, the apparent distance D1 of the object/marker from the camera along surface 120, as seen by the visible camera image, and the corresponding apparent distance D2 for the infra-red image. The height of the touch sheet, h, is shown at the actual location of the object/marker.
To simplify the explanation we assume that the camera is viewing the touch surface in a direction normal to the touch surface, so that there is a point within the captured image which represents the location of the camera. We define an optical axis along the direction of camera support 126, so that with this approach the camera's position within the image is the image of a ray along this optical axis. (The skilled person will appreciate that it is straightforward to correct an image captured at an acute angle so that it appears to have been captured perpendicular to the surface.) With this definition of the optical axis, D1 is the distance from the actual marker position to the optical axis, and distance D2 is the distance from the part of the marker illuminated by the IR touch sheet to the optical axis. The relationship between the distances in Figure 4b is therefore (H - h)/H = D1/D2, and thus h = H(1 - D1/D2). Typically distance H is defined by the apparatus (in one embodiment H = 320mm), and thus if D1 and D2 can be determined then h can be evaluated. The effective position of the optical axis on the camera sensor is determined by the geometry of the apparatus; the positions of the target in the visible and infra-red images are used to determine D1 and D2, in a common co-ordinate space, preferably the co-ordinate space of the displayed image (or a scaled version of this). This then determines the touch sheet height in corresponding coordinates, that is, with a corresponding scale (which may be defined by a particular number of pixels per metre).
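A direct transcription of this relationship, with all distances expressed in the same display-space units, is given below; the example values of D1 and D2 are illustrative only.

```python
# Sketch: the similar-triangles height estimate h = H*(1 - D1/D2) from the text.
def touch_sheet_height(D1, D2, H=0.32):
    """D1: distance of the marker (visible image) from the optical axis.
    D2: distance of the IR-lit part of the marker (touch image) from the optical axis.
    H:  camera height above the touch surface (0.32 m in the embodiment described)."""
    return H * (1.0 - D1 / D2)

print(touch_sheet_height(D1=0.990, D2=1.000))   # ~0.0032 m, i.e. about 3 mm
```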
Figure 5a shows a set of pairs of calibration points in which square crosses indicate the (actual) marker positions in the visible image and diagonal crosses indicate the detected marker positions in the infra-red image captured from light scattered from the touch sheet. One example pair of corresponding positions has been ringed; the optical axis is at the origin, (0,0). Figure 5a shows the increasing parallax with greater distance from the optical axis.
In one approach the marker positions (square crosses) may be defined by projecting an image onto the touch/display surface 120, but this requires relatively precise positioning of the markers. Preferably, therefore, the marker positions are determined from the visible image captured by the camera.
Figure 5b shows a map of touch sheet heights for calibration points similar to those shown in Figure 5a. In Figure 5b the radius of the circle at each calibration position indicates the height of the touch sheet at that position. In Figure 5b the results for two different touch sheet positions are superimposed, an aligned position and a non-aligned position. The touch sheet heights for the aligned touch sheet are indicated by the inner disks 500; as can be seen these heights are relatively uniform and relatively small, indicating that the touch sheet is close to and substantially parallel to display area 120. The larger disks 502 relate to the misaligned touch sheet; the touch sheet is at a much greater distance from display area 120 and is tilted so that the distance from the display area increases towards the bottom of the map. (The final row of disks also has optical artefacts arising from gross misalignment.)
From Figures 5a and 5b it can be seen that the parallax is smaller towards the optical axis with potentially greater measurement errors in this region so that the most reliable height data is obtained at larger distances from the optical axis, here towards the bottom of the board.
As previously mentioned, the above described height estimation method presumes that the camera is normal to the display area 120 and that the image is substantially undistorted. In practice the image captured from the camera is corrected for lens distortion and keystone based upon an initial calibration procedure, for example as previously outlined. In embodiments this maps the captured image to a "virtual sensor space" which is associated with a scaling factor, for example in pixels per metre, which is preferably dependent on whether visible or infra-red light is employed. This facilitates expressing all distances in equivalent metres; with this approach the errors introduced by keystone distortion are second order. Error analysis of the formula for h identifies that the distance error is proportional to D/H times the height error (where D is of the order of D1 and D2), so that at one metre from the optical axis a 3mm position error results in a 100mm touch sheet height estimation error (with H = 320mm).
It will be appreciated that to define a plane of the touch sheet it is preferable to measure the height of the touch sheet at three or more locations, for example locations 600 schematically illustrated in Figure 6a, although in principle two locations may suffice. For example the two locations may be vertices of a triangle with the third vertex defined by the location of the line generator, for example where the line generator is at the top of the display area and the two heights are measured at the bottom left and bottom right corners since these are the locations where the touch sheet is most likely to intersect/bear away from the touch display surface. Similarly in principle the touch sheet may effectively be defined by the equation of a line between the two measured heights, for similar reasons.
As previously mentioned, in one approach a user may be requested to place markers at specific positions on the display area, but preferably the marker positions are detected/measured by the camera before detecting their position in the infra-red, that is where they are intersected by the touch sheet. This latter approach helps to remove a significant source of operator error. Where multiple positions are measured multiple markers may be placed on the board simultaneously or sequentially.
Figure 6b illustrates a marker 610 which facilitates accurate location of the marker within a (visible) image captured by the camera. Thus, as illustrated, the marker has a triangular reflective/scattering portion 612, preferably pointing towards the board. Figure 6c shows a captured (and edge-detected) image of five such markers, illustrating that the triangular region helps to define the location of the marker on the board. Additionally or alternatively the marker of Figure 6b may be used as a substantially direct measure of the height of the touch sheet: referring to Figure 6b, the length l of scattered touch sheet light in the touch image is a linear function of the height h.
In some preferred implementations the system of Figure 3 is modified to include signal processing modules for implementing the alignment/calibration and, preferably, to control the camera and touch projection system during alignment/calibration. More particularly the camera is switched between infra-red-viewing and visible-viewing modes (although alternatively two different cameras may be employed) and the touch sheet projection is preferably switched off during visible image capture.
Referring to Figure 7a, this shows an alignment/calibration procedure, preferably implemented in software, according to an embodiment of the invention. Thus at step S700 the procedure is initialised by displaying to the user (for example using the display system projector) a request to place markers onto the touch/display area 120. For an interactive whiteboard these markers may be magnetic to retain them in place. The system then waits for the user to confirm that this has been done and then proceeds to determine the touch sheet height for the multiple markers (S701). Thus the procedure first switches off the touch sheet and places the camera in visible image capture mode (for example by removing an IR filter), captures a visible image and processes this to identify the locations of the markers (S702). This may be performed only once during the alignment procedure if the markers are not moved.
The system then turns the touch sheet projector on and reinstates the IR filter for the camera, then capturing an IR "touch" image and processing this to identify the locations of the markers (S704). At this stage the system has an internal representation of information similar to that depicted in Figure 5a. The system then resolves the marker locations in the visible and IR images, for example using the above described equations, to determine the touch sheet heights at the locations of the markers (S706). Optionally the system then determines a notional plane of the touch sheet (S708), for example by determining the equation of a line or plane representing the sheet. In some preferred embodiments the touch sheet position is then compared with a target position or range of positions (S710) and this information is preferably then used to determine information representing a difference between the measured position of the touch sheet and a desired position of the touch sheet. This information may be displayed in many different ways, for example as a good/bad position indication, but in one preferred approach, described later, the system displays a set of meters which provide the user with an indication of both a direction of a discrepancy between the actual and desired position and, preferably, an indication of the degree of discrepancy. Preferably this is provided for each of three degrees of freedom of the touch sheet (although as previously described two degrees of freedom may suffice), for example height, pitch and roll of the touch sheet defined with respect to the line generator assembly. The procedure then loops back to capture and process a further IR image so that, in embodiments, the meters display the discrepancy between the actual and desired touch sheet position in real time.
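A compact sketch of this loop is given below; every camera, projector and display call is a hypothetical placeholder rather than a real API, and the helper functions stand in for the steps identified above.

```python
# Sketch of the Figure 7a loop. All objects and methods here (camera, sheet_projector,
# display, target, and the two helper callables) are hypothetical placeholders used
# only to show the sequence of steps.
def run_alignment(camera, sheet_projector, display, target, estimate_heights, fit_plane):
    display.show("Place the markers on the board, then confirm.")     # step S700
    display.wait_for_confirmation()

    sheet_projector.off()
    camera.set_mode("visible")                 # e.g. move the IR filter out of the path
    marker_xy = camera.locate_markers()        # step S702, done once if markers stay put

    sheet_projector.on()
    camera.set_mode("infrared")                # reinstate the IR filter
    while not display.user_finished():
        lit_xy = camera.locate_scattered_light()              # step S704
        heights = estimate_heights(marker_xy, lit_xy)         # step S706
        plane = fit_plane(marker_xy, heights)                 # step S708 (optional)
        display.update_meters(target.difference(plane))       # step S710, in real time
```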
With this information the user may adjust the touch sheet position by adjusting a set of adjusters such as screws mounting the line generator assembly to bring the touch sheet into alignment. Once this has been done the user may signal the end of a calibration/alignment process to the system, at which point the system may optionally log data defining the final touch sheet position. This data may be used, for example, to correct for the parallax illustrated in Figure 4 between the touch sheet position and actual position of an object such as a finger or pen when the system is in use.
Although embodiments of the method have been described for use in aligning the touch sheet to the display surface, in principle the method may be used independently of this for determining a calibration or mapping between the location of an object such as a pen or finger seen by the touch sheet, location 792 in Figure 7c, and the actual location of the object, location 790 in Figure 7c.
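As an illustration of such a parallax correction (assuming the similar-triangles geometry of Figures 4a and 4b and an example camera height), the sensed position can be scaled towards the optical axis by (H - h)/H to recover the actual position:

```python
# Sketch: correct a sensed touch position for the parallax caused by the touch sheet
# sitting a height h above the surface, given camera height H and the optical-axis
# position on the surface. Parameter values are assumed.
def correct_touch_position(sensed_xy, sheet_height_m, camera_height_m=0.32,
                           axis_xy=(0.0, 0.0)):
    """Scales the sensed position towards the optical axis by (H - h)/H,
    i.e. D1 = D2 * (H - h) / H from the similar-triangles relation."""
    scale = (camera_height_m - sheet_height_m) / camera_height_m
    ax, ay = axis_xy
    sx, sy = sensed_xy
    return (ax + (sx - ax) * scale, ay + (sy - ay) * scale)

print(correct_touch_position((1.0, 0.5), sheet_height_m=0.005))
```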
Referring to Figure 7b, this shows a touch sensing system 750 similar to that of Figure 3 but configured to implement the above described alignment/calibration procedure. Thus controller 720 is configured to be able to put the system 750 into a calibration mode, by controlling IR laser illumination system 110 on/off and by controlling a motor, actuator or the like to move an infra-red filter 722 into/out of an optical path to touch sense camera 112.
When calibration mode is selected signal processing as previously described is performed. Figure 7b illustrates one example of data pathways for such signal processing but the skilled person will appreciate that there are many variations possible. Thus in the illustrated example, a visible image processing module 752 receives an image from the touch sense camera 112 after optional ambient light suppression and binning, for marker detection/location. This module provides marker location data 754 to a comparison module 756 which compares visible and infra-red marker locations. Comparison module 756 receives data defining locations of the markers as seen by the touch sheet from a point in the touch image processing chain, for example following distortion correction module 312. The marker position comparison module 756 compares the marker positions in the visible and IR images and from this determines heights of the touch sheet at the marker locations, providing a touch sheet height data output 758. The touch sheet height data is processed by touch sheet position module 760 which, in embodiments, determines the equation of a line or plane of the touch sheet and matches this to a target position or range of positions. The target matching data is provided to the user by user feedback module 762, for example displaying this data as a set of meters on a graphical user interface.
Figure 8a shows an example set of meters for such a graphical user interface, as illustrated showing an indication of touch sheet height, roll (left-right) and pitch (top/bottom). For each of these the position of the touch sheet is indicated with respect to a target position, at "0".
Figure 8b illustrates schematically, a geometry of the line generator assembly 800 in such a system. As illustrated the line generator assembly is provided with three adjustable mounts 802a, b, c which may be employed to adjust the touch sheet in three degrees of freedom to align the touch sheet to a target position. Again, in principle, just two adjusters may be employed if, say, the height of the touch sheet at the location of the line generator assembly is well-defined.
Figure 8c shows a side view of the system of Figure 8b, illustrating the use of two passive markers 810.
Figure 9 shows an embodiment of an active marker 900. Such a marker may be used in conjunction with the previously described touch sheet alignment procedure, for example to provide a more direct measure of touch sheet height at different marker positions. Alternatively active marker 900 may be used to align a system of the type illustrated in Figure 3 without the touch sensing system being adapted to perform any particular type of calibration procedure, as described further below.
In one embodiment active marker 900 is fabricated on a printed circuit board 902 and comprises a ladder of photodetectors 904 and a corresponding ladder of optical indicators 906. Photodetectors 904 and indicators 906, for example LEDs, are coupled to control circuitry 908, for example a microcontroller, powered by a battery 910. In one embodiment each indicator 906 lights up when infrared light is detected by the corresponding photodetector 904. In such an embodiment, in use, a base 912 of the printed circuit board is placed on the display area 120 and the height of the illuminated ladder of LEDs 906 indicates the height of the touch sheet at that location. This provides a simple tool which may be used to align the touch sheet.
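The behaviour of such an active marker could be summarised as below; this is an illustrative model only, a real device would run equivalent logic on the microcontroller, and the rung spacing and detection threshold are assumptions.

```python
# Sketch: light each LED whose paired photodetector sees the IR sheet, and report the
# sheet height from the highest triggered rung of the ladder.
def active_marker_update(detector_readings, rung_spacing_m=0.002, threshold=0.5):
    """detector_readings: list of photodetector values, index 0 nearest the surface.
    Returns (led_states, estimated_sheet_height_m or None)."""
    led_states = [r > threshold for r in detector_readings]
    lit = [i for i, on in enumerate(led_states) if on]
    height = (max(lit) + 0.5) * rung_spacing_m if lit else None
    return led_states, height

print(active_marker_update([0.1, 0.2, 0.9, 0.8, 0.1]))
```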
The skilled person will appreciate that many variations of the user output are possible. For example in one alternative approach the marker 900 is provided with an RF communications link 914, such as a Bluetooth link, additionally or alternatively to LEDs 906. This may be used to communicate with the display system and/or a mobile device such as a smart phone, to provide an alternative display of touch sheet height and, optionally, to map touch sheet height at a plurality of different locations.
Figure 10 illustrates a still further approach which may be employed additionally or alternatively to those described above to identify the height of the touch sheet at a location on the display area 120. Thus in the arrangement of Figure 10 camera 112 captures an image comprising a first ray 1002 of light scattered by marker 1000 and a second ray 1004 comprising light scattered by the marker and reflected off surface 120. Thus the camera effectively sees the marker 1000 at two different locations within the touch image, one defined by the direct ray 1002 and one from the reflected ray 1004. The reflected ray 1004 sees a reflection 1010 of marker 1000 in surface 120 and thus, as illustrated, the angle between rays 1002 and 1004 defines a distance which is twice the height of the touch sheet at the location of the marker. Thus the effective angle between the rays can be substantially directly translated into a measure of the height of the touch sheet at the location of the marker, using simple geometry.
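One way to turn the two apparent positions into a height is sketched below; the closed form is derived here from the Figure 4b similar-triangles geometry (with the reflection appearing as a mirror image below the surface) and is an assumption rather than a formula stated in the text.

```python
# Sketch: with the direct and surface-reflected images of the scattered light mapped to
# surface coordinates, s_direct = D*H/(H - h) and s_reflected = D*H/(H + h), which gives
# h = H * (s_direct - s_reflected) / (s_direct + s_reflected).
def height_from_reflection(s_direct, s_reflected, H=0.32):
    """s_direct/s_reflected: apparent distances (from the optical axis) of the direct and
    reflected light in the touch image, in surface coordinates; H: camera height."""
    return H * (s_direct - s_reflected) / (s_direct + s_reflected)

print(height_from_reflection(1.010, 0.990))   # ~0.0032 m, i.e. about 3 mm
```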
Although example embodiments have been described with reference to a touch sensing system which employs a substantially continuous sheet of light, the techniques we describe are also applicable to other types of optical touch sensing system, in particular those using scanned beams to define a touch sheet as described, for example, in our published patent applications WO2014/027189 and WO2014/125272.
The techniques we have described are particularly useful with large touch-sensitive regions (say >0.5m in one direction), for example for large touch-sensitive flat screen displays or interactive whiteboards; but they also have advantages in smaller scale touch sensitive displays.
No doubt many other effective alternatives will occur to the skilled person. It will be understood that the invention is not limited to the described embodiments and encompasses modifications apparent to those skilled in the art lying within the spirit and scope of the claims appended hereto.

Claims (27)

1. A method of facilitating alignment of a touch sheet to a touch surface in a touch sensing system, the touch sensing system comprising: a touch sheet projector to project light defining a touch sheet above a touch surface; and a camera to capture a touch image of light scattered from said touch sheet by an object approaching said touch surface; the method comprising: placing a marker on said touch surface to intersect said touch sheet; capturing a marker image comprising said marker; determining a location of said marker with respect to said touch surface from said marker image; capturing a touch image of light scattered from said touch sheet by said marker; determining touch sheet height data from said marker image and said touch image, said touch sheet height data comprising data representing a height of said touch sheet above said touch surface at said location of said marker; and outputting a touch sheet alignment signal dependent on said touch sheet height data.
2. A method as claimed in claim 1 wherein said capturing of said marker image comprises capturing a visible image using said camera, and wherein said capturing of said touch image comprises capturing an infrared (IR) image using said camera.
3. A method as claimed in claim 1 or 2 further comprising calibrating said touch sensing system to map a captured image space to a coordinate space on said touch surface.
4. A method as claimed in claim 1, 2 or 3 wherein said determining of said touch sheet height data comprises determining a difference in position, in a common coordinate system, between said location of said marker and a location of said light scattered by said marker.
5. A method as claimed in claim 4 wherein said determining of said touch sheet height data further comprises determining a height of said touch sheet above said touch surface from said difference in position.
6. A method as claimed in any preceding claim comprising placing said marker and determining said touch sheet height data at a plurality of different locations of said marker on said touch surface, and determining touch plane data defining a plane of said touch sheet from said touch sheet height data at said plurality of different locations.
7. A method as claimed in any preceding claim further comprising: determining, from said touch sheet height data at a plurality of different locations, a touch sheet equation defining said touch sheet or a portion thereof with respect to said touch surface; determining a difference between said touch sheet or portion thereof defined by said touch sheet equation and a target touch sheet location defined with respect to said touch surface; and determining said touch sheet alignment signal dependent upon said difference between said touch sheet or portion thereof defined by said touch sheet equation and said target sheet location.
8. A method as claimed in any preceding claim, wherein said touch sheet alignment signal comprises a graphical display of a set of meters comprising touch sheet height, pitch, and tilt/roll meters for enabling a user to perform a mechanical adjustment to said projector to bring said touch sheet towards a target, aligned position, wherein said target aligned position is defined by stored target alignment data.
9. A method as claimed in any preceding claim wherein said projector is mounted on an adjustable mount comprising a plurality of adjusters for adjusting a position of said touch sheet; and wherein said touch sheet alignment signal comprises a signal informing said user how to adjust said adjusters to bring said touch sheet towards said aligned position.
10. A method as claimed in any preceding claim wherein said marker is configured such that light scattering by said marker is dependent upon distance along said marker in a direction away from said touch surface, the method further comprising determining second touch sheet height data from a characteristic of the light scattered by said marker in said captured touch image, said second touch sheet height data defining a distance of said touch sheet along said marker in said direction away from said touch surface; and combining said touch sheet height data and said second touch sheet height data to determine said touch sheet alignment signal.
11. A method of facilitating alignment of a touch sheet to a touch surface in a touch sensing system, the touch sensing system comprising: a touch sheet projector to project light defining a touch sheet above a touch surface; the method comprising: placing a marker on said touch surface to intersect said touch sheet; sensing a location of said marker; using said sensed location to determine a height of said touch sheet above said touch surface at said sensed location; and providing a signal for aligning said touch sheet to said touch surface using said determined height.
12. A method as claimed in claim 11 comprising capturing a touch image of light scattered from said touch sheet by said marker, and using said sensed location in combination with said touch image to determine said height of said touch sheet.
13. A method as claimed in claim 11 or 12 comprising capturing a touch image of light scattered from said touch sheet by said marker, and wherein said determining of said height comprises processing said touch image to identify a difference in position between said light from said touch sheet scattered by said marker and a reflection in said touch surface of said light scattered by said marker.
14. A method of facilitating alignment of a touch sheet to a touch surface in a touch sensing system, the touch sensing system comprising: a touch sheet projector to project light defining a touch sheet above a touch surface; and a camera to capture a touch image of light scattered from said touch sheet by an object approaching said touch surface; the method comprising: placing a marker on said touch surface to intersect said touch sheet; capturing a touch image of light scattered from said touch sheet by said marker; determining touch sheet height data from said touch image, said touch sheet height data representing a height of said touch sheet above said touch surface at a location of said marker; determining, from said touch sheet height data at a plurality of different locations, a touch sheet equation defining said touch sheet or a portion thereof with respect to said touch surface; determining a difference between said touch sheet or portion thereof defined by said touch sheet equation and a target touch sheet location defined with respect to said touch surface; and outputting a touch sheet alignment signal dependent upon said difference between said touch sheet or portion thereof defined by said touch sheet equation and said target sheet location.
15. A method as claimed in claim 14 wherein said touch sheet equation comprises an equation of a line or plane.
16. A method as claimed in claim 15 wherein said determining of said difference comprises determining a difference between said equation of said line or plane and a target line or plane defined by a target line or plane equation.
17. A non-transitory data carrier carrying processor control code to, when running, implement the method of any preceding claim.
18. A method of aligning a touch sheet to a touch surface in a touch sensing system, the touch sensing system comprising an optical projector to project light defining a touch sheet above a touch surface; the method comprising: configuring an alignment device such that a lateral dimension of a light scattering property of the alignment device changes with distance away from the touch surface on which the alignment device is based; placing said alignment device on said touch surface; detecting a location of said alignment device; determining a height of said touch sheet from an image of a line or pattern of light scattered from said alignment device; and providing feedback to a user on the height of said touch sheet at said detected location, for aligning said touch sheet.
19. A method as claimed in claim 18 wherein said detecting of said location of said alignment device comprises imaging light scattered from an intersection of said alignment device with said touch sheet.
20. A touch sensing system comprising: a touch sheet projector to project light defining a touch sheet above a touch surface; and a camera to capture a touch image of light scattered from said touch sheet by an object approaching said touch surface; the system further comprising a processor coupled to program memory storing processor control code to: capture a marker image of a marker placed on said touch surface to intersect said touch sheet; determine a location of said marker with respect to said touch surface from said marker image; capture a touch image of light scattered from said touch sheet by said marker; determine touch sheet height data from said marker image and said touch image, said touch sheet height data comprising data representing a height of said touch sheet above said touch surface at said location of said marker; and output a touch sheet alignment signal dependent on said touch sheet height data.
21. A method of facilitating alignment of a touch sheet to a touch surface in a touch sensing system, the touch sensing system comprising: a touch sheet projector to project light defining a touch sheet above a touch surface; and a camera to capture a touch image of light scattered from said touch sheet by an object approaching said touch surface; the method comprising: placing a marker on said touch surface to intersect said touch sheet; capturing a touch image of light scattered from said touch sheet by said marker; and determining touch sheet height data from said touch image, said touch sheet height data representing a height of said touch sheet above said touch surface at a location of said marker; wherein said determining of said touch sheet height data comprises processing said touch image to identify a difference in position between said light from said touch sheet scattered by said marker and a reflection in said touch surface of said light scattered by said marker.
22. A method as claimed in claim 21 comprising determining a value for said height of said touch sheet from an angle subtended between said light and said reflected light at said camera.
23. A method as claimed in claim 22 further comprising adjusting said height dependent upon a defined or sensed location of said marker.
24. A method of facilitating alignment of a touch sheet to a touch surface in a touch sensing system, the touch sensing system comprising a touch sheet projector to project light defining a touch sheet above a touch surface; the method comprising: placing a marker on said touch surface to intersect said touch sheet, wherein said marker is an active marker comprising a ladder or array of optical sensors to sense the height of said touch sheet above said display surface, and having a marker output to output a marker signal representing said height; and providing a user output to indicate said height to a user, wherein said user output is derived from or comprises said marker signal.
25. An active marker for use in the method of claim 24, the active marker comprising: a power source; a ladder or array of optical sensors to sense the height of said touch sheet above said display surface; and a marker output to output a marker signal representing said height.
26. A method of calibrating a touch sensing system, the touch sensing system comprising: a touch sheet projector to project light defining a touch sheet above a touch surface; and a camera to capture a touch image of light scattered from said touch sheet by an object approaching said touch surface; the method comprising: placing a marker on said touch surface to intersect said touch sheet; capturing a touch image of light scattered from said touch sheet by said marker; processing said touch image to determine a touch-identified location of said marker with respect to said touch surface; and comparing said touch-identified location of said marker with a known or sensed location of said marker on said display surface to determine a calibration relating a position of said marker seen by said touch sensing system with an actual location of said marker.
27. A method of facilitating alignment of a touch sheet to a touch surface in a touch sensing system, the touch sensing system comprising: a touch sheet projector to project light defining a touch sheet above a touch surface; and a camera to capture a touch image of light scattered from said touch sheet by an object approaching said touch surface; the method comprising: placing a marker on said touch surface to intersect said touch sheet; capturing a touch image of light scattered from said touch sheet by said marker; determining touch sheet height data from said touch image, said touch sheet height data representing a height of said touch sheet above said touch surface at a location of said marker; and providing to a user a touch sheet alignment display comprising a set of one or more meters, one for each of a respective position adjuster of a mount of said touch sheet projector; wherein said set of meters displays data derived from said touch sheet height data, indicating to said user a direction or degree of adjustment of said position adjuster to adjust a location of said touch sheet towards a target position.
GB1420272.5A 2014-11-14 2014-11-14 Touch sensing systems Withdrawn GB2536604A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB1420272.5A GB2536604A (en) 2014-11-14 2014-11-14 Touch sensing systems

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB1420272.5A GB2536604A (en) 2014-11-14 2014-11-14 Touch sensing systems

Publications (2)

Publication Number Publication Date
GB201420272D0 GB201420272D0 (en) 2014-12-31
GB2536604A true GB2536604A (en) 2016-09-28

Family

ID=52248368

Family Applications (1)

Application Number Title Priority Date Filing Date
GB1420272.5A Withdrawn GB2536604A (en) 2014-11-14 2014-11-14 Touch sensing systems

Country Status (1)

Country Link
GB (1) GB2536604A (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110291990A1 (en) * 2010-05-26 2011-12-01 Seiko Epson Corporation Projection display system and attaching device
WO2013104060A1 (en) * 2012-01-11 2013-07-18 Smart Technologies Ulc Interactive input system and method
WO2013108032A1 (en) * 2012-01-20 2013-07-25 Light Blue Optics Limited Touch sensitive image display devices
CN103091964A (en) * 2013-01-10 2013-05-08 苏州佳世达光电有限公司 Adjusting auxiliary tool and projection system
CN204390218U (en) * 2015-01-21 2015-06-10 江苏天绘智能科技有限公司 A kind of infrared laser curtain position indicator

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102018111165B4 (en) 2017-09-18 2024-04-25 Lenovo (Beijing) Limited METHOD, ELECTRONIC DEVICE AND APPARATUS FOR CALIBRATING A TOUCH AREA

Also Published As

Publication number Publication date
GB201420272D0 (en) 2014-12-31

Legal Events

Date Code Title Description
COOA Change in applicant's name or ownership of the application

Owner name: PROMETHEAN LIMITED

Free format text: FORMER OWNER: LIGHT BLUE OPTICS LTD

WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)