WO2015185964A1 - System and method for multi-touch event detection - Google Patents

System and method for multi-touch event detection

Info

Publication number
WO2015185964A1
Authority
WO
WIPO (PCT)
Prior art keywords
detector
computing unit
light sources
touch event
touch
Prior art date
Application number
PCT/IB2014/062041
Other languages
French (fr)
Inventor
Marcelo Ferreira GUIMARÃES
Renato Parenti Turcato
Paulo Ricardo Fonseca Blank
Eduardo Matos DE BRITO JÚNIOR
Original Assignee
Sabia Experience Tecnologia S.A.
Priority date
Filing date
Publication date
Application filed by Sabia Experience Tecnologia S.A.
Priority to BR112016028470A2
Priority to PCT/IB2014/062041
Publication of WO2015185964A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/042 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
    • G06F3/0425 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means using a single imaging device like a video camera for tracking the absolute position of a single or a plurality of objects with respect to an imaged reference surface, e.g. video camera imaging a display or a projection screen, a table or a wall surface, on which a computer generated image is displayed or projected
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0416 Control or interface arrangements specially adapted for digitisers
    • G06F3/04166 Details of scanning methods, e.g. sampling time, grouping of sub areas or time sharing with display driving
    • G06F3/041661 Details of scanning methods, e.g. sampling time, grouping of sub areas or time sharing with display driving using detection at multiple resolutions, e.g. coarse and fine scanning; using detection within a limited area, e.g. object tracking window

Abstract

A system and method for multi-touch event detection are disclosed. One aspect is directed to a system that can include an emitter comprising at least two light sources synchronized with a detector by a controller and at least two beam shapers arranged to generate at least two partially overlapping illuminated areas over a touchable surface. The controller can synchronize the at least two light sources with the detector, and can activate each of the at least two light sources in sequence and in a repeating loop. The detector can detect light reflected responsive to a touch event associated with the at least two light sources. A computing unit can process signals received from the detector using an image processing algorithm.

Description

"SYSTEM AND METHOD FOR MULTI-TOUCH EVENT DETECTION"
BACKGROUND
[0001] Advancements in touch sensing technology create numerous opportunities for computer interaction by both individuals and groups. The use of interactive touch surfaces varies with context, encompassing both work and entertainment activities. Challenges arise in creating resilient, affordable and safe solutions that provide interactive touch surfaces.
SUMMARY
[0002] Touch detection systems and methods are described. The disclosed system provides multi-touch detection on potentially any surface by using an emitter that creates a virtual touchable surface overlying it, coupled with light detectors, a controller and a processor that compute the touch event and either handle it or relay it to another device.
[0003] Advancements in touch sensing technology create numerous opportunities for computer interaction by both individuals and groups. The use of interactive touch surfaces varies with context, encompassing both work and entertainment activities. Systems and methods of the present disclosure provide resilient, affordable and safe interactive touch surfaces that turn surfaces of any dimension and nature into interactive ones. In some embodiments, the technology is resilient by being dust, spill and drop resistant; cost effective due to the arrangement of parts and the processing method implemented in software; and safe because there is minimal or no risk of breakage-related injuries such as cuts from cracked glass or electrical shocks, and because potentially harmful light sources such as lasers are avoided.
[0004] Interaction with multi-touch surfaces takes the form of gestures. These can be complex, drawing on a large vocabulary of multiple fingers, taps, and movements in all directions: up, down, left, right and circular, with one, two, three or up to fifty fingers and more than one user touching simultaneously. Interactive situations may involve individuals interacting as well as groups interacting among themselves within the context of group software.
[0005] In essence, there is a recognized need for robust, inexpensive and secure mechanisms to support multi-touch interactive surfaces for use in work and entertainment settings by individuals and groups.
[0006] The following factors, relating to the intended context of use, guided this invention:
1) Safety: the system should not use potentially harmful lasers, and it should minimize the risk of breakage and the associated injuries from broken parts;
2) Near-touch with true-touch feel: even if near-touch detection is adopted because of the other factors, the system should behave as true touch from the user's viewpoint;
3) Minimum size of the touchable area: since the goal is to cater to groups as well as single users, the touchable area should be larger than that of current tablet computers, comfortably accommodating at least a small group of three people;
4) Multi-touch: gestures, as well as groups interacting with the surface, may require detection of multiple touches at any given time;
5) Touch resolution: tasks such as signing a document, handwriting and freehand drawing should be supported;
6) Touch on a surface with a projected image: one structure should house both the touch sensor and the system for projecting images onto the touchable surface. Touch detection should not interfere with or be impaired by the projected image;
7) Ambient lighting: artificial light or sunlight should not interfere with touch detection;
8) Ruggedness: the system should be contained in a single-body structure with no moving parts, so that it can be carried around and used in industrial environments.
[0007] Some devices detect touch using a resistive wire grid installed in front of the display. Other devices with small displays, such as smartphones and tablets, implement capacitive touch-sensing technologies.
[0008] Devices with large displays, such as interactive tables and whiteboards, usually implement optical sensing systems. They can use frustrated total internal reflection (FTIR) technology to detect touching objects. In this configuration, light is emitted into the edge of a transparent sheet that serves as the touchable surface.
[0009] Another technique for detecting touch events is the laser light plane (LLP). In this technique, laser light planes are generated by one or more lasers placed around the surface. When one or more fingers touch the laser light plane, light is scattered from the fingertips and a camera detects that light. The invention described herein instead proposes the use of two non-coherent light sources, typically LEDs, rather than the LLP technique. Laser-based touch sensor devices can be harmful to the human eye because laser light is coherent and concentrated.
[0010] One aspect is directed to a system for multi-touch event detection. The system can include an emitter comprising at least two light sources synchronized with a detector by a controller and at least two beam shapers arranged to generate at least two partially overlapping illuminated areas over a touchable surface. The controller can synchronize the at least two light sources with the detector, and can activate each of the at least two light sources in sequence and in a repeating loop. The detector can detect light reflected responsive to a touch event associated with the at least two light sources. A computing unit can process signals received from the detector using an image processing algorithm. The computing unit can include a processor that can execute instructions stored in a memory bank to perform image analysis on the signals received from the detector to identify the touch event, can compute a difference of the signals received by the detector from the at least two light sources, and can convert the signals into a binary image using an adaptive threshold. A memory bank can store instructions and record the binary image corresponding to activation of each light source, and a first communication interface can communicate with the controller. A second communication interface can communicate with the detector.
BRIEF DESCRIPTION OF THE DRAWINGS
[0011] The accompanying drawings are not intended to be drawn to scale. In the drawings, each identical or nearly identical component that is illustrated in various figures is represented by a like numeral. For purposes of clarity, not every component may be labelled in every drawing. In the drawings:
[0012] Figure 1 is a schematic diagram of a touch detection system;
[0013] Figures 2A and 2B present the light propagation patterns of the light sources at two different moments;
[0014] Figures 3A and 3B present a touch event sequence;
[0015] Figures 4A to 4D present an example output of image processing using filtering and thresholding, in which two white regions appear where two fingertips were in contact with the touchable surface;
[0016] Figures 5A and 5B present a locally adaptive threshold used to correct the effects of non-uniform illumination;
[0017] Figure 6 is a flow diagram of the sequential operations carried out by the main components of the system; and
[0018] Figure 7 is a flow diagram of the image processing algorithm.
DETAILED DESCRIPTION
[0019] This disclosure is not limited in its application to the constructional details and component arrangements set forth in the description below or illustrated in the drawings. The present disclosure can be carried out with other configurations and practiced in several manners.
[0020] References in this disclosure to "an embodiment", "one embodiment", "some embodiments" or the like, mean that the particular feature, structure or characteristic being described is included in at least one embodiment of the present invention. Occurrences of such phrases in this specification do not necessarily all refer to the same embodiment.
[0021] The following disclosure, taken in connection with the appended drawings, describes the invention.
[0022] Figure 1 is a schematic diagram of a touch detection system, which comprises an emitter 101, a detector 108, a controller 106 and a computing unit 107.
[0023] The emitter comprises two or more light sources 102 and 103, each coupled with a respective beam shaper 104 and 105, which are arranged to create two overlapping illuminated areas above the touchable surface 109. The power of each source may be selected in combination with the arrangement of the sources such that the touchable area is fully illuminated.
[0024] The detector 108 is placed above the touchable surface 109 and acquires the signal generated by the touch event.
[0025] The detector 108 may be composed of a single light sensor or of multiple sensors arranged at different positions and angles to cover the area of interest and provide a unique solution for locating the x, y touch coordinates.
[0026] The term 'light' herein is used to refer to electromagnetic radiation of any wavelength, including, but not limited to, visible light, infra-red (IR) and ultra-violet (UV).
[0027] The emitter can comprise infrared light sources, and the detector can comprise an infrared-sensitive sensor and an optical infrared filter.
[0028] The operation of touch detection system can be described with reference to the flow diagram shown in figure 6.
[0029] The controller 106 synchronizes the light sources with the detector in order to capture the signal received by the detector for each light source.
[0030] The computing unit 107 comprises a processor 110 that runs instructions stored in a memory bank 111, a communication interface 112 with the controller, and a communication interface 113 with the detector.
[0031] In the multi-touch detection system of the present invention, inputs can modify the internal instructions, and the result can be presented on an output display partially or totally integrated with the touchable surface.
[0032] The processor 110 and the controller interface 112 can be or can include, for example, one or more programmable microprocessors, microcontrollers, application-specific integrated circuits (ASICs), field programmable gate arrays (FPGAs), programmable logic devices (PLDs), or other similar device or combination of such devices.
[0033] The memory 111 provides temporary and/or long-term storage of instructions and/or data for use by the processor, the controller and/or the detector. The memory can be or include one or more volatile and/or non-volatile storage devices, such as random access memory (RAM), read-only memory (ROM), flash memory, solid-state drives, or the like.
[0034] The detector 108 may comprise any suitable means of detecting light reflected by a touch event. The choice of detector is therefore dependent on the particular implementation of the embodiment. The detector may include a filter to prevent wavelengths other than the wavelength of operation from being detected or from saturating the detector. Examples of suitable means include, but are not limited to, cameras and photo-sensors. The detector may be orientated in any suitable manner relative to the touchable surface so that it can capture light reflected inside the area of interest.
[0035] During system boot and setup, the computing unit 107 configures the controller through the controller interface 112, defining the timing of the emission of light sources 102 and 103 of emitter 101. The controller 106 continuously activates the light sources of emitter 101 one at a time, each activation lasting for the duration configured for the arrangement.
[0036] The computing unit 107 configures the detector 108 through the detector interface 113. This step enables a synchronization signal output from the detector 108 to the controller 106. The synchronization guarantees that each signal is acquired during the emission of only one light source.
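For illustration only, the synchronized acquisition of paragraphs [0035] and [0036] can be sketched as a small Python loop. The controller and detector objects and their methods (set_light, capture_frame) are hypothetical placeholders standing in for the hardware behind interfaces 112 and 113; the disclosure does not define a software API.

```python
import numpy as np

def acquire_frame_pair(controller, detector):
    """Acquire one raw frame per light source within one iteration of the loop.

    controller.set_light() and detector.capture_frame() are hypothetical
    placeholders; only one light source emits while each frame is captured.
    """
    frames = []
    for source in (0, 1):                    # light sources 102 and 103
        controller.set_light(source, True)   # activate a single source
        frames.append(detector.capture_frame().astype(np.float32))
        controller.set_light(source, False)
    return frames                            # [S1, S2] for moments M1 and M2
```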
[0037] Figures 2A and 2B present the light propagation patterns of the light sources 202 and 203 at two different moments. At moment M1, only light source 202 of emitter 201 emits light, which is oriented by beam shaper 204 to create an illuminated area 206 over the touchable surface 209. At this moment, the detector 208 acquires the raw signal S1. At moment M2, light source 203 of emitter 201 emits light, which is oriented by beam shaper 205 to create another illuminated area 207 over the touchable surface 209. At this moment, detector 208 acquires another raw signal S2. The beam propagation of each light source is controlled by its beam shaper 204 or 205, which allows horizontal propagation to cover the touchable surface 209 while limiting vertical propagation, reducing the hovering effect and optimizing the touch detection algorithm.
[0038] The computing unit processes the signals S1 and S2 acquired by the detector during these two moments to determine a touch event. The resulting signal S is obtained by an arithmetic point-by-point subtraction of the two raw signals S1 and S2, as described below:
[0039] S(x,y) = max[S1(x,y) - S2(x,y); Smin]
[0040] where S(x,y) stands for the resulting signal level at coordinates x, y, and Smin is the minimum value accepted as the result of the subtraction. In this case, Smin is taken as zero.
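As a minimal sketch (not part of the disclosure), the point-by-point subtraction with a floor of Smin = 0 can be written with NumPy:

```python
import numpy as np

def subtract_signals(s1: np.ndarray, s2: np.ndarray, s_min: float = 0.0) -> np.ndarray:
    """Compute S(x,y) = max[S1(x,y) - S2(x,y); Smin] point by point.

    Reflections present in both frames (ambient light, a hovering finger lit
    only by the second source) cancel out or are clipped to Smin, so only
    fingertips lit by the first source close to the surface remain.
    """
    return np.maximum(s1.astype(np.float32) - s2.astype(np.float32), s_min)
```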
[0041] Figures 3A and 3B disclose a touch event sequence. In figure 3A the user's finger 310 is close to the touchable surface 309, but this is not considered a touch event because the fingertip reflects light only from light source 303. In this situation the signal S2, due to the light reflected by the finger, is greater than S1, because there is no reflection caused by light source 302, and the result of the subtraction is zero.
[0042] In figure 3B the user is effectively touching the touchable surface 309, inside the area illuminated by both light sources 302 and 303. In this situation signal S2 is subtracted from signal S1, which is due to the light reflected by the user's finger 310, yielding a positive value. This resulting signal is recognized as a touch event.
[0043] Figures 4A to 4D present an example output of image processing using filtering and thresholding, in which two white regions appear where two fingertips were in contact with the touchable surface - touch event 401. Fig. 4A represents signal S1, the finger reflection 402 detected at moment M1. Fig. 4B represents signal S2, the finger reflection 403 captured at moment M2. In Fig. 4C the area indicated on the fingertip 404 is the result of the subtraction of the signals, and Fig. 4D represents the binary image 405 obtained after binary conversion is applied to signal 404.
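As a purely illustrative numeric example (the values are assumed, not taken from the disclosure): if a hovering fingertip yields S1(x,y) = 5 and S2(x,y) = 40, then S(x,y) = max[5 - 40; 0] = 0 and no touch is reported; if a touching fingertip yields S1(x,y) = 60 and S2(x,y) = 35, then S(x,y) = max[60 - 35; 0] = 25, which survives the threshold conversion and is reported as a touch event.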
[0044] This binary conversion uses a threshold light level to convert the detected signal into a binary image 405.
[0045] Figure 4D presents an example binary output image after threshold conversion. The two white regions are the signals that remain after threshold conversion; the rest of the working area, apart from the two fingertips, is black.
[0046] The embodiment uses a locally adaptive threshold, illustrated in figures 5A and 5B, to correct the effects of non-uniform illumination. This approach uses a different threshold level 602 for different regions of the signal. The straightforward technique, the global threshold, is a non-contextual approach that uses a single selected threshold value 601.
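A minimal sketch of a locally adaptive threshold of this kind, comparing each pixel against the mean of its neighborhood; the window size and offset are illustrative assumptions rather than values given in the disclosure:

```python
import numpy as np
from scipy.ndimage import uniform_filter

def adaptive_threshold(s: np.ndarray, window: int = 31, offset: float = 8.0) -> np.ndarray:
    """Convert the subtracted signal S into a binary image.

    Each pixel is compared against the mean of its local window rather than
    against one global value, compensating for non-uniform illumination
    across the touchable surface.
    """
    local_mean = uniform_filter(s.astype(np.float32), size=window)
    return (s > local_mean + offset).astype(np.uint8)   # 1 = candidate touch
```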
[0047] The proposed invention solves two common issues of touch detection using light emitters and detectors: the removal of background noise 406 and the hovering effect of the finger over the touchable surface.
[0048] The background noise 406 is generated by sunlight and by artificial ambient light, and it impairs touch detection performance. The choice of a specific wavelength in conjunction with a filter, together with the technique of signal subtraction combined with threshold conversion, removes the background noise.
[0049] The hovering effect may degrade the user experience. It is important that the touch event is triggered only close to the touchable surface, so that the user perceives the experience as "real touch". That is achieved by treating as intended touches only those events where the finger is placed within a rather thin virtual touchable space delimited by the light source 302 in conjunction with the beam shaper 304. This thin touchable space is created just above the touchable surface to emulate true-touch interaction, allowing a high-quality multi-touch user experience.
[0050] In an embodiment constructed with a single non-coherent light source, touch detection performance would be compromised: touch events would be generated far from the surface because of the light propagation pattern.
[0051] Unlike other touch detection systems, the user does not have to apply pressure to create a touch event. In one embodiment, image-processing techniques may be applied to estimate the pressure applied by the user. In addition, physical objects may be used to trigger the touch event; the object has to reflect the light emitted by the light source and be calibrated in order to be correctly detected. An example of a detectable object is a white pen.
[0052] Figure 6 is a simplified flow diagram of the sequential operation steps carried out by the main components of the system.
[0053] In an embodiment, considering the elements of the system already described, the present invention discloses a method for multi-touch event detection that comprises the steps of:
generating at least two partially overlapping light beams, by at least two light sources synchronized with a detector by a controller and at least two beam shapers, so that a touchable area is defined overlying a touchable surface;
activating each light source momentarily by the controller in sequence and in a repeating loop;
detecting, by the detector, light reflected as a result of a touch event associated with the light sources;
recording the images in a memory bank of the computing unit for any two sequential moments within a loop, each corresponding to the activation of one light source;
performing image analysis in the computing unit, by image processing algorithms, on the signals acquired from the detector to identify touch events, computing the difference between the images so that only touch events that take place near the touchable surface are considered, and converting the signal into a binary image using an adaptive threshold to signal touch events;
performing communication between the controller and the computing unit through a communication interface of said computing unit; and
performing communication between the detector and the computing unit through a communication interface of said computing unit.
[0054] The step of performing image analysis in the computing unit, by image processing algorithms, on a signal acquired from the detector to identify touch events is shown in detail in figure 7.
[0055] Signal masking 701 involves masking the detected image to remove detections outside the working area. This eliminates stray signals that may have been detected outside the area of interest.
[0056] Signal subtraction 702 removes background interference and reduces the hovering effect.
[0057] An adaptive threshold 703 is applied in order to correct effects due to non-uniform illumination; the result then goes through feature detection 704.
[0058] By calculating the center of mass of each object 705, each touch event is converted into a single coordinate x, y on the touchable surface.
[0059] Applying homography 706, the touch event coordinates x, y are mapped to onscreen coordinates when the touch system is used as an overlay to a display system, as described below.
[0060] The techniques described may be applied in an order different from the sequence presented in figure 7. In a basic implementation, only the signal subtraction 702 and the threshold conversion 703 have to be applied, in either order.
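Putting the steps of figure 7 together, a compact sketch of the processing chain (masking 701, subtraction 702, adaptive threshold 703, feature detection and center of mass 704/705, homography 706) might look as follows. The working-area mask, the threshold parameters and the 3x3 homography matrix H are assumed to come from a prior calibration step and are illustrative only:

```python
import numpy as np
from scipy import ndimage

def detect_touches(s1, s2, mask, H, window=31, offset=8.0):
    """Sketch of the figure-7 chain: returns touch points in screen coordinates."""
    # 701: signal masking - keep only the working area
    s1 = np.where(mask, s1.astype(np.float32), 0.0)
    s2 = np.where(mask, s2.astype(np.float32), 0.0)

    # 702: signal subtraction - removes background and hovering reflections
    s = np.maximum(s1 - s2, 0.0)

    # 703: locally adaptive threshold - compensates for non-uniform illumination
    binary = s > ndimage.uniform_filter(s, size=window) + offset

    # 704/705: feature detection and center of mass of each connected region
    labels, count = ndimage.label(binary)
    centers = ndimage.center_of_mass(binary, labels, list(range(1, count + 1)))

    # 706: homography - map detector (col, row) coordinates to screen coordinates
    points = []
    for row, col in centers:
        v = H @ np.array([col, row, 1.0])    # H is a 3x3 calibration matrix
        points.append((v[0] / v[2], v[1] / v[2]))
    return points
```

In practice H would be estimated during calibration, for example from touches at known on-screen reference points.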
[0061] In another embodiment, the touchable surface 804 may be integrated with a display surface, as shown in figures 8A and 8B. The display presents images such that the user can interact with them through touch events detected by the system. The display can be a projector 803, where the touchable surface is a projection screen, a pad or any reflective surface. Alternatively, the display may be an LCD screen 805, where the LCD becomes the touchable surface.

Claims

1. A system for multi-touch event detection comprising:
an emitter comprising at least two light sources synchronized with a detector by a controller and at least two beam shapers arranged to generate at least two partially overlapping illuminated areas over a touchable surface, wherein
the controller synchronizes the at least two light sources with the detector, activates each of the at least two light sources in sequence and in a repeating loop, and wherein
the detector detects light reflected responsive to a touch event associated with the at least two light sources; and
a computing unit to process signals received from the detector using an image processing algorithm comprising:
a processor that executes instructions stored in a memory bank to perform image analysis on the signals received from the detector to identify the touch event, compute a difference of the signals received by the detector from the at least two light sources, and convert the signals into a binary image using an adaptive threshold;
a memory bank to store instructions and record the binary image corresponding to activation of each light source;
a first communication interface configured to communicate with the controller; and
a second communication interface configured to communicate with the detector.
2. The system of claim 1, wherein the emitter comprises infrared light sources and the detector comprises an infrared-sensitive detector and an optical infrared filter.
3. The system of claim 1, further comprising inputs to modify the instructions stored in the memory bank that perform image analysis on the signals received from the detector.
4. The system of claim 1, further comprising an output display at least partially integrated with the touchable surface configured to present a result associated with the binary image.
5. The system of claim 4, wherein the output display includes a projection screen configured to present a graphical user interface projected via a projector, the graphical user interface including the touchable surface.
6. The system of claim 4, wherein the output display includes an LCD screen configured to display a graphical user interface.
7. The system of claim 1, wherein the computing unit is further configured to:
compute an interest area inside a working area of the touchable surface by filtering the signals to remove data outside of the interest area.
8. The system of claim 1, wherein the computing unit is further configured to: filter the signals to remove background noise.
9. The system of claim 1, wherein the computing unit is further configured to control a touch event based application.
10. A method of detecting a multi-touch event comprising:
generating, by at least two light sources, at least two partially overlapping light beams synchronized with a detector by a controller and at least two beam shapers to define a touchable area over a touchable surface;
activating, by the controller, each of the at least two light sources in sequence and in a repeating loop;
detecting, by a detector, light reflected responsive to a touch event associated with the at least two light sources;
recording, by the computing unit in a memory bank, signals received from the detector for two sequential moments within an iteration of the repeating loop, each of the two sequential moments corresponding to the activation of one of the at least two light sources;
performing, by the computing unit, image analysis using image processing algorithms on the signals received from the detector to identify a touch event;
determining, by the computing unit, a difference between the signals so that touch events that take place in an interest area of the touchable surface are considered;
converting, by the computing unit, the signals within the interest area of the touchable surface into a binary image using an adaptive threshold to signal touch events;
performing communication between the controller and the computing unit through a communication interface of said computing unit; and
performing communication between the detector and the computing unit through a communication interface of said computing unit.
11. The method as recited in claim 10, wherein performing image analysis in a computing unit by image processing algorithms on the signals acquired from the detector to identify touch events further comprises:
masking the detected image by signal masking to remove detections outside a working area in order to remove stray signals detected outside the area of interest;
removing, by signal subtraction, at least some background interference and at least some hovering effects;
applying an adaptive threshold in order to correct effects due to nonuniform illumination, through feature detection;
calculating the center of mass of each object where each touch event is converted into a single coordinate x,y on the touchable surface; and
applying homography to map the touch event coordinates x, y to onscreen coordinates where the touch system is used as an overlay to a display system.
PCT/IB2014/062041 2014-06-06 2014-06-06 System and method for multi-touch event detection WO2015185964A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
BR112016028470A BR112016028470A2 (en) 2014-06-06 2014-06-06 multi-touch event detection system and method
PCT/IB2014/062041 WO2015185964A1 (en) 2014-06-06 2014-06-06 System and method for multi-touch event detection

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/IB2014/062041 WO2015185964A1 (en) 2014-06-06 2014-06-06 System and method for multi-touch event detection

Publications (1)

Publication Number Publication Date
WO2015185964A1 (en) 2015-12-10

Family

ID=54766211

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2014/062041 WO2015185964A1 (en) 2014-06-06 2014-06-06 System and method for multi-touch event detection

Country Status (2)

Country Link
BR (1) BR112016028470A2 (en)
WO (1) WO2015185964A1 (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040108990A1 (en) * 2001-01-08 2004-06-10 Klony Lieberman Data input device
US20100231532A1 (en) * 2009-03-12 2010-09-16 Samsung Electronics Co., Ltd. Touch sensing system and display apparatus employing the same
US20100302191A1 (en) * 2009-06-01 2010-12-02 Beijing Irtouch Systems Co., Ltd. Touch Detection Apparatus
US20110050639A1 (en) * 2009-09-02 2011-03-03 Lenovo (Singapore) Pte, Ltd. Apparatus, method, and system for touch and gesture detection
US20140204061A1 (en) * 2013-01-18 2014-07-24 Wistron Corporation Optical touch system, method of touch detection and computer program product


Also Published As

Publication number Publication date
BR112016028470A2 (en) 2017-08-22

Similar Documents

Publication Publication Date Title
US10324566B2 (en) Enhanced interaction touch system
US10001845B2 (en) 3D silhouette sensing system
US8837780B2 (en) Gesture based human interfaces
JP5693972B2 (en) Interactive surface computer with switchable diffuser
US8446376B2 (en) Visual response to touch inputs
US20090231281A1 (en) Multi-touch virtual keyboard
US20140237408A1 (en) Interpretation of pressure based gesture
EP2107446A1 (en) System and a method for tracking input devices on LC-displays
US10019115B2 (en) Method and apparatus for contactlessly detecting indicated position on reproduced image
US20140237401A1 (en) Interpretation of a gesture on a touch sensing device
US20120223909A1 (en) 3d interactive input system and method
JP3201426U (en) Virtual two-dimensional positioning module of input device and virtual input device
WO2016119906A1 (en) Multi-modal gesture based interactive system and method using one single sensing system
JP2017509955A (en) Dynamic allocation of possible channels in touch sensors
KR20090060283A (en) Multi touch sensing display through frustrated total internal reflection
US20230057020A1 (en) Meeting interaction system
Izadi et al. ThinSight: integrated optical multi-touch sensing through thin form-factor displays
RU2014150517A (en) INPUT SYSTEM
US20150227261A1 (en) Optical imaging system and imaging processing method for optical imaging system
KR101071864B1 (en) Touch and Touch Gesture Recognition System
US20180059806A1 (en) Information processing device, input control method for controlling input to information processing device, and computer-readable storage medium storing program for causing information processing device to perform input control method
TWI582671B (en) Optical touch sensitive device and touch sensing method thereof
US20140267193A1 (en) Interactive input system and method
WO2015185964A1 (en) System and method for multi-touch event detection
KR101481082B1 (en) Apparatus and method for infrared ray touch by using penetration screen

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 14893733; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
REG Reference to national code (Ref country code: BR; Ref legal event code: B01A; Ref document number: 112016028470; Country of ref document: BR)
122 Ep: pct application non-entry in european phase (Ref document number: 14893733; Country of ref document: EP; Kind code of ref document: A1)
ENP Entry into the national phase (Ref document number: 112016028470; Country of ref document: BR; Kind code of ref document: A2; Effective date: 20161205)