WO2015185964A1 - System and method for multi-touch event detection - Google Patents
- Publication number
- WO2015185964A1 (application PCT/IB2014/062041)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- detector
- computing unit
- light sources
- touch event
- touch
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/042—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
- G06F3/0425—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means using a single imaging device like a video camera for tracking the absolute position of a single or a plurality of objects with respect to an imaged reference surface, e.g. video camera imaging a display or a projection screen, a table or a wall surface, on which a computer generated image is displayed or projected
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/0416—Control or interface arrangements specially adapted for digitisers
- G06F3/04166—Details of scanning methods, e.g. sampling time, grouping of sub areas or time sharing with display driving
- G06F3/041661—Details of scanning methods, e.g. sampling time, grouping of sub areas or time sharing with display driving using detection at multiple resolutions, e.g. coarse and fine scanning; using detection within a limited area, e.g. object tracking window
Definitions
- Touch detection systems and methods are described.
- The system disclosed provides multi-touch detection on potentially any surface by using an emitter that creates a virtual touchable surface overlying it, coupled with imaging light detectors, a controller and a processor that compute the touch event and either handle it or relay it to another device.
- Interaction with multi-touch surfaces takes the form of gestures. These can be complex, drawing on a large vocabulary involving multiple fingers, taps, and movement in many directions: up, down, left, right and circular, with anywhere from one to fifty fingers and more than one user touching simultaneously.
- Interactive situations may comprise individuals interacting as well as groups interacting among themselves within the context of group software.
- Safety: it should not use potentially harmful lasers, and it should minimize the risk of breakage and of injuries associated with broken parts;
- Minimum size of the touchable area: since the goal is to cater to groups as well as single users, the touchable area should be larger than that of current tablet computers, to comfortably accommodate at least a small group of three people;
- Multi-touch: gestures, as well as groups interacting with the surface, may require detection of multiple touches at any given time;
- Touch resolution: tasks such as signing a document, handwriting and freehand drawing should be supported;
- Integration: one structure should house both the touch sensor and the system for projecting images on the touchable surface, and touch detection should neither interfere with nor be impaired by the projected image;
- Ruggedness: it should be contained in a single-body structure with no movable parts, allowing it to be lugged around and used in industrial environments;
- Some devices base detection on a resistive wire grid installed in front of the display. Other types of devices with small displays, such as smartphones and tablets, implement capacitive touch-sensing technology.
- LLP: laser light plane. Laser light planes are generated by one or more lasers placed around the surface.
- LED: light-emitting diode.
- When one or more fingers touch the laser light plane, light is scattered from the fingertips and a camera detects this light.
- The invention described herein proposes the use of two noncoherent light sources, typically LEDs, instead of the LLP technique.
- Laser-light touch sensor devices can be harmful to the human eye due to the characteristics of laser light, which is coherent and concentrated.
- the system can include an emitter comprising at least two light sources synchronized with a detector by a controller and at least two beam shapers arranged to generate at least two partially overlapping illuminated areas over a touchable surface.
- the controller can synchronize the at least two light sources with the detector, and can activate each of the at least two light sources in sequence and in a repeating loop.
- the detector can detect light reflected responsive to a touch event associated with the at least two light sources.
- a computing unit can process signals received from the detector using an image processing algorithm.
- the computing unit can include a processor that can execute instructions stored in a memory bank to perform image analysis on the signals received from the detector to identify the touch event, can compute a difference of the signals received by the detector from the at least two light sources, and can convert the signals into a binary image using an adaptive threshold.
- a memory bank can store instructions and record the binary image corresponding to activation of each light source, and a first communication interface can communicate with the controller.
- a second communication interface can communicate with the detector.
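The controller's sequencing described above (each light source activated in sequence, in a repeating loop, synchronized with the detector) can be sketched as follows. This is a minimal illustration, not the controller's actual firmware; the `LightSource` class and the `capture` callback are assumptions introduced here, not part of the disclosure.

```python
import itertools

class LightSource:
    """Minimal stand-in for a controllable light source (illustrative)."""
    def __init__(self, name):
        self.name = name
        self.lit = False
    def on(self):
        self.lit = True
    def off(self):
        self.lit = False

def acquisition_loop(light_sources, capture, n_frames):
    """Activate each light source in sequence, in a repeating loop,
    capturing one detector frame per activation so that each frame is
    attributable to exactly one source (the controller's role)."""
    frames = []
    for source in itertools.islice(itertools.cycle(light_sources), n_frames):
        source.on()                          # only this source is lit
        frames.append((source.name, capture()))
        source.off()
    return frames
```

Pairing each frame with the only source lit during its capture is what later allows the computing unit to subtract per-source signals.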
- Figure 1 is a schematic diagram of a touch detection system
- Figures 2A and 2B present the light propagation pattern of the light sources at two different moments
- Figures 3A and 3B present a touch event sequence
- Figures 4A to 4D present an example output of image processing using the filtering and thresholding in which two white regions appear where two fingertips were in contact with the touchable surface;
- Figures 5A and 5B present a locally adaptive threshold to correct effects due to non-uniform illumination
- Figure 6 is the flow diagram of the sequential operations carried out by the main components of the system.
- Figure 7 is the flow diagram in which the image processing algorithm is performed.
- Figure 1 is a schematic diagram of a touch detection system, which comprises an emitter 101, a detector 108, a controller 106 and a computing unit 107.
- The emitter comprises two or more light sources 102 and 103, each coupled with its own beam shaper (104 and 105), arranged to create two overlapping illuminated areas above the touchable surface 109.
- The power of each source may be selected, in combination with the arrangement of the sources, such that the touchable area is thoroughly illuminated.
- The detector 108 is placed above the touchable surface 109 and acquires the signal generated by the touch event.
- The detector 108 may be composed of a single light sensor or of multiple sensors arranged at different positions and angles to cover the area of interest and provide a unique solution for locating the x,y touch coordinates.
- the term 'light' herein is used to refer to electromagnetic radiation of any wavelength, including, but not limited to, visible light, infra-red (IR) and ultra-violet (UV).
- IR infra-red
- UV ultra-violet
- The emitter can comprise infrared light sources, with the detector comprising an infra-red-sensitive detector and an optical infra-red filter.
- the controller 106 synchronizes the light sources with the detector in order to capture the signal received by the detector for each light source.
- The computing unit 107 comprises a processor 110 that runs instructions stored in a memory bank 111, a communication interface 112 with the controller, and a communication interface 113 with the detector.
- inputs can modify the internal instructions, and the result can be presented on an output display partially or totally integrated to the touchable surface;
- The processor 110 and the controller interface 112 can be or can include, for example, one or more programmable microprocessors, microcontrollers, application-specific integrated circuits (ASICs), field-programmable gate arrays (FPGAs), programmable logic devices (PLDs), or other similar devices or combinations of such devices.
- ASICs application-specific integrated circuits
- FPGAs field programmable gate arrays
- PLDs programmable logic devices
- The memory 111 provides temporary and/or long-term storage of instructions and/or data for use by the processor, the controller and/or the detector.
- Memory can be or include one or more volatile and/or non-volatile storage devices, such as random access memory (RAM), read-only memory (ROM), flash memory, solid-state drives, or the like.
- the detector 108 may comprise any suitable means of detecting light reflected by a touch event. The choice of detector is therefore dependent on the particular implementation of the embodiment.
- the detector may include a filter to prevent wavelengths other than the wavelength of operation from being detected or from saturating the detector. Examples of suitable means include, but are not limited to, cameras and photo-sensors.
- the detector may be orientated in any suitable manner relative to the touchable surface so that it can capture light reflected inside the area of interest.
- The computing unit 107 configures the controller using the controller interface 112, which defines the timing of the emission of light sources 102 and 103 of emitter 101.
- The controller 106 continuously activates the light sources of emitter 101 one at a time, and each activation lasts for the duration configured for the arrangement.
- The computing unit 107 configures the detector 108 using the detector interface 113. This step enables a synchronism signal output from detector 108 to the controller 106, which guarantees that the signal is acquired during the emission of only one light source.
- Figures 2A and 2B present the light propagation pattern of light sources 202 and 203 at two different moments.
- At moment M1, only light source 202 of the emitter 201 emits light, which is oriented by beam shaper 204, creating an illuminated area 206 over the touchable surface 209.
- The detector 208 acquires the raw signal S1.
- At moment M2, light source 203 of the emitter 201 emits light, which is oriented by beam shaper 205, creating another illuminated area 207 over the touchable surface 209.
- The detector 208 acquires another raw signal S2.
- The beam propagation of each light source is controlled by beam shapers 204 and 205, which allow horizontal propagation to cover the touchable surface 209 and limit vertical propagation to reduce the hovering effect and optimize the touch detection algorithm.
- The computing unit processes the signals S1 and S2 acquired by the detector during these two moments to determine a touch event.
- The resulting signal S is an arithmetic point-by-point subtraction of the two raw signals S1 and S2, clamped below at a minimum value: S(x,y) = max(S1(x,y) − S2(x,y), Smin), where
- S(x,y) stands for the signal level at coordinates x,y;
- Smin is the minimum value acceptable as the result of the subtraction; here, Smin is taken as zero.
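A minimal sketch of this point-by-point subtraction, assuming the clamping interpretation (results below Smin are set to Smin, zero here) and that the raw signals arrive as 2-D NumPy arrays; the array representation is an assumption for illustration:

```python
import numpy as np

def subtract_signals(s1, s2, s_min=0):
    """Point-by-point subtraction of the two raw signals S1 and S2,
    with the result clamped below at s_min (zero in this embodiment)."""
    # Cast to a signed type first so uint8 camera frames don't wrap around.
    diff = s1.astype(np.int32) - s2.astype(np.int32)
    return np.maximum(diff, s_min)
```

A fingertip that reflects only the second source yields S2 > S1 at those pixels, so the clamped result there is zero, matching the behaviour described for figures 3A and 3B.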
- Figures 3A and 3B disclose a touch event sequence.
- The user's finger 310 is close to the touchable surface 309, yet this is not considered a touch event because the fingertip reflects light only from light source 303.
- The signal S2, due to light reflection on the finger, is greater than S1, since there is no reflection caused by light source 302, so the result of the subtraction is zero.
- Figure 4A represents signal S1, the finger reflection 402 detected at moment M1.
- Figure 4B represents signal S2, the finger reflection 403 captured at moment M2.
- In Figure 4C, the area indicated on fingertip 404 is the result of the subtraction of the signals, and Figure 4D represents the binary result, a binary image 405, after binary conversion is applied to signal 404.
- This binary conversion uses a threshold light level to convert the detected signal into the binary image 405.
- Figure 4D presents an example binary output image after threshold conversion: the two white regions are the signals after conversion, while the rest of the working area, apart from the two fingertips, is black.
- The embodiment uses a locally adaptive threshold, illustrated in figures 5A and 5B, to correct effects due to non-uniform illumination.
- This approach uses a different threshold level 602 for different regions of the signal.
- The straightforward technique, global thresholding, is a noncontextual approach that uses a single selected threshold value 601.
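One common way to realize such a locally adaptive threshold, sketched here under the assumption that it compares each pixel against the mean of its neighbourhood plus an offset; the window radius and offset values are illustrative parameters, not values from the disclosure:

```python
import numpy as np

def box_mean(img, r):
    """Mean over a (2r+1) x (2r+1) window via an integral image (edge-padded)."""
    padded = np.pad(img.astype(np.float64), r + 1, mode="edge")
    ii = padded.cumsum(axis=0).cumsum(axis=1)
    h, w = img.shape
    k = 2 * r + 1
    # Window sum at each pixel from four corners of the integral image.
    total = ii[k:k + h, k:k + w] - ii[:h, k:k + w] - ii[k:k + h, :w] + ii[:h, :w]
    return total / (k * k)

def adaptive_threshold(signal, r=15, offset=5):
    """Locally adaptive threshold: each pixel is compared against its
    local mean, compensating for non-uniform illumination."""
    return (signal > box_mean(signal, r) + offset).astype(np.uint8)

def global_threshold(signal, level):
    """Noncontextual global threshold with a single selected level."""
    return (signal > level).astype(np.uint8)
```

With a single global level, a fingertip in a dimly lit corner can fall below a threshold tuned for the bright centre; the local comparison keeps the decision relative to each region's own illumination.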
- The proposed invention solves two common issues of touch detection using light emitters and detectors: removal of the background noise 406 and the hovering effect of the finger over the touchable surface.
- The background noise 406 is generated by sunlight and artificial environment light, and it impairs the performance of touch detection.
- The choice of a specific wavelength in conjunction with a filter, combined with the technique of signal subtraction and threshold conversion, removes the background noise.
- The hovering effect may affect the user experience: it is important to trigger the touch event only close to the touchable surface, so that the interaction is perceived as "real touch". This is achieved by accepting as intended touches only those events in which the finger is placed within a rather thin virtual touchable space delimited by light source 302 in conjunction with beam shaper 304. This thin touchable space is created just above the touchable surface to emulate true-touch interaction, allowing a high-quality multi-touch user experience.
- Otherwise, touch detection performance would be compromised, and touch events would be generated far from the surface due to the light propagation pattern.
- the user may not have to apply pressure to create a touch event.
- Image-processing techniques may be applied to detect the pressure applied by the user.
- Physical objects may be used to trigger the touch event. The object has to reflect the light emitted by the light source and be calibrated in order to be correctly detected.
- An example of a detectable object is a white pen.
- Figure 6 is a simplified flow diagram of the sequential operation steps carried out by the main components of the system.
- Signal masking 701 involves masking the detected image to remove detections outside the working area, eliminating stray signals that may have been detected outside the area of interest.
- Signal subtraction 702 removes background interference and reduces the hovering effect.
- Adaptive thresholding 703 is applied to correct effects due to non-uniform illumination, followed by feature detection 704.
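The steps 701 to 704 above can be sketched as a single pass. This is an illustrative assumption, not the disclosed implementation: a simple global threshold stands in for the adaptive threshold, and a tiny flood-fill labeller stands in for the feature detector.

```python
import numpy as np

def detect_touches(s1, s2, mask, threshold):
    """Sketch of the flow: subtract signals (702), mask out the area
    outside the working region (701), threshold (703), then label
    connected regions as touch candidates (704)."""
    s = np.maximum(s1.astype(np.int32) - s2.astype(np.int32), 0)  # 702
    s[~mask] = 0                                                  # 701
    binary = s > threshold                                        # 703
    return label_regions(binary)                                  # 704

def label_regions(binary):
    """Tiny 4-connected flood-fill labelling of touch blobs."""
    labels = np.zeros(binary.shape, dtype=int)
    count = 0
    for i, j in zip(*np.nonzero(binary)):
        if labels[i, j]:
            continue
        count += 1
        stack = [(i, j)]
        while stack:
            y, x = stack.pop()
            if (0 <= y < binary.shape[0] and 0 <= x < binary.shape[1]
                    and binary[y, x] and not labels[y, x]):
                labels[y, x] = count
                stack += [(y + 1, x), (y - 1, x), (y, x + 1), (y, x - 1)]
    return labels, count
```

Each labelled region then yields one candidate x,y touch coordinate, e.g. the centroid of its pixels.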
- The touchable surface 804 may be integrated with a display surface, as shown in figures 8A and 8B.
- The display presents images with which the user can interact using touch events detected by the system.
- The display can be a projector 803, in which case the touchable surface is a projection screen, a pad or any reflective surface.
- The display may be an LCD screen 805, where the LCD becomes the touchable surface.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
BR112016028470A BR112016028470A2 (en) | 2014-06-06 | 2014-06-06 | multi-touch event detection system and method |
PCT/IB2014/062041 WO2015185964A1 (en) | 2014-06-06 | 2014-06-06 | System and method for multi-touch event detection |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/IB2014/062041 WO2015185964A1 (en) | 2014-06-06 | 2014-06-06 | System and method for multi-touch event detection |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2015185964A1 true WO2015185964A1 (en) | 2015-12-10 |
Family
ID=54766211
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/IB2014/062041 WO2015185964A1 (en) | 2014-06-06 | 2014-06-06 | System and method for multi-touch event detection |
Country Status (2)
Country | Link |
---|---|
BR (1) | BR112016028470A2 (en) |
WO (1) | WO2015185964A1 (en) |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20040108990A1 (en) * | 2001-01-08 | 2004-06-10 | Klony Lieberman | Data input device |
US20100231532A1 (en) * | 2009-03-12 | 2010-09-16 | Samsung Electronics Co., Ltd. | Touch sensing system and display apparatus employing the same |
US20100302191A1 (en) * | 2009-06-01 | 2010-12-02 | Beijing Irtouch Systems Co., Ltd. | Touch Detection Apparatus |
US20110050639A1 (en) * | 2009-09-02 | 2011-03-03 | Lenovo (Singapore) Pte, Ltd. | Apparatus, method, and system for touch and gesture detection |
US20140204061A1 (en) * | 2013-01-18 | 2014-07-24 | Wistron Corporation | Optical touch system, method of touch detection and computer program product |
- 2014-06-06 BR BR112016028470A patent/BR112016028470A2/en not_active IP Right Cessation
- 2014-06-06 WO PCT/IB2014/062041 patent/WO2015185964A1/en active Application Filing
Also Published As
Publication number | Publication date |
---|---|
BR112016028470A2 (en) | 2017-08-22 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10324566B2 (en) | Enhanced interaction touch system | |
US10001845B2 (en) | 3D silhouette sensing system | |
US8837780B2 (en) | Gesture based human interfaces | |
JP5693972B2 (en) | Interactive surface computer with switchable diffuser | |
US8446376B2 (en) | Visual response to touch inputs | |
US20090231281A1 (en) | Multi-touch virtual keyboard | |
US20140237408A1 (en) | Interpretation of pressure based gesture | |
EP2107446A1 (en) | System and a method for tracking input devices on LC-displays | |
US10019115B2 (en) | Method and apparatus for contactlessly detecting indicated position on reproduced image | |
US20140237401A1 (en) | Interpretation of a gesture on a touch sensing device | |
US20120223909A1 (en) | 3d interactive input system and method | |
JP3201426U (en) | Virtual two-dimensional positioning module of input device and virtual input device | |
WO2016119906A1 (en) | Multi-modal gesture based interactive system and method using one single sensing system | |
JP2017509955A (en) | Dynamic allocation of possible channels in touch sensors | |
KR20090060283A (en) | Multi touch sensing display through frustrated total internal reflection | |
US20230057020A1 (en) | Meeting interaction system | |
Izadi et al. | ThinSight: integrated optical multi-touch sensing through thin form-factor displays | |
RU2014150517A (en) | INPUT SYSTEM | |
US20150227261A1 (en) | Optical imaging system and imaging processing method for optical imaging system | |
KR101071864B1 (en) | Touch and Touch Gesture Recognition System | |
US20180059806A1 (en) | Information processing device, input control method for controlling input to information processing device, and computer-readable storage medium storing program for causing information processing device to perform input control method | |
TWI582671B (en) | Optical touch sensitive device and touch sensing method thereof | |
US20140267193A1 (en) | Interactive input system and method | |
WO2015185964A1 (en) | System and method for multi-touch event detection | |
KR101481082B1 (en) | Apparatus and method for infrared ray touch by using penetration screen |
Legal Events
- 121: the EPO has been informed by WIPO that EP was designated in this application (ref document number 14893733, country EP, kind code A1)
- NENP: non-entry into the national phase (ref country code DE)
- REG: reference to national code (ref country code BR, legal event code B01A, ref document number 112016028470)
- 122: PCT application non-entry in the European phase (ref document number 14893733, country EP, kind code A1)
- ENP: entry into the national phase (ref document number 112016028470, country BR, kind code A2, effective date 2016-12-05)