WO2006095320A2 - System and method for detecting the location, size and shape of multiple objects that interact with a touch screen display - Google Patents
System and method for detecting the location, size and shape of multiple objects that interact with a touch screen display
- Publication number
- WO2006095320A2 (PCT/IB2006/050728)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- light
- touch screen
- calibration data
- act
- sensors
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/0416—Control or interface arrangements specially adapted for digitisers
- G06F3/0418—Control or interface arrangements specially adapted for digitisers for error correction or compensation, e.g. based on parallax, calibration or alignment
- G06F3/04184—Synchronisation with the driving of the display or the backlighting unit to avoid interferences generated internally
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0354—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
- G06F3/03547—Touch pads, in which fingers can move on a surface
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/0416—Control or interface arrangements specially adapted for digitisers
- G06F3/04166—Details of scanning methods, e.g. sampling time, grouping of sub areas or time sharing with display driving
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/0416—Control or interface arrangements specially adapted for digitisers
- G06F3/0418—Control or interface arrangements specially adapted for digitisers for error correction or compensation, e.g. based on parallax, calibration or alignment
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/042—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
- G06F3/0421—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means by interrupting or reflecting a light beam, e.g. optical touch-screen
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/041—Indexing scheme relating to G06F3/041 - G06F3/045
- G06F2203/04104—Multi-touch detection in digitiser, i.e. details about the simultaneous detection of a plurality of touching locations, e.g. multiple fingers or pen and finger
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/048—Indexing scheme relating to G06F3/048
- G06F2203/04808—Several contacts: gestures triggering a specific function, e.g. scrolling, zooming, right-click, when the user establishes several contacts with the surface simultaneously; e.g. using several fingers or a combination of fingers and pen
Definitions
- the present invention relates generally to touch screen displays, and more particularly, to methods and apparatus for detecting the location, size and shape of multiple objects that interact with a touch screen display.
- Touch screens are commonly used as pointing sensors to provide a man-machine interface for computer driven systems.
- a number of infrared optical emitters (i.e., transmitters) and detectors (i.e., receivers) are arranged around the periphery of the display screen to create a plurality of intersecting light paths.
- the user's finger blocks the optical transmission of certain ones of the perpendicularly arranged transmitter/receiver pairs. Based on the identity of the blocked pairs, the touch screen system can determine the location of the intercept (single point interaction).
- a particular choice, such as a menu option or a button, can be selected by a user by touching the area of the screen where that choice is displayed.
- This use of perpendicular light beams, while widespread, is unable to effectively detect the shape and size of an object. Neither can the use of perpendicular light beams detect multiple objects or multiple touch points.
- it would therefore be desirable for touch screen applications to be able to determine the shape and size of an object, in addition to being able to detect multiple touch points. These applications would also benefit from the ability to determine the transparency and reflectivity of the one or more objects.
- the present invention provides methods and apparatus for detecting the location, size and shape of one or more objects placed on a plane within the touch sensor boundaries of a touch screen display. Methods are also provided for detecting an object's, or multiple objects', reflectivity and transparency.
- an apparatus for detecting the location, size and shape of an object, or multiple objects, placed on a plane within the touch sensor boundaries of a touch screen includes a plurality of light transmitters (N) and sensors (M) arranged in an alternating pattern on the periphery of the touch screen.
- a method for detecting an object's, or multiple objects', location, size and shape comprises the acts of: (a) acquiring calibration data for each of (N) light transmitters Li arranged around the periphery of a touch screen display; (b) acquiring non-calibration data for each of the (N) light transmitters Li; (c) computing N minimum area estimates of at least one object positioned in the plane of the touch screen display using the calibration data and the non-calibration data acquired in acts (a) and (b); (d) combining the N minimum area estimates to derive a total minimum object area of the at least one object; (e) computing N maximum area estimates of the at least one object using the calibration data and the non-calibration data acquired in acts (a) and (b); (f) combining the N maximum area estimates to derive a total maximum object area of the at least one object; and (g) combining the total minimum and maximum object areas to derive the boundary area of the at least one object.
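A minimal sketch of acts (d), (f) and the start of act (g) follows. Every name here is illustrative rather than taken from the patent, and each area estimate is represented as a set of grid cells so that the combining steps reduce to set operations; it assumes at least one estimate of each kind is available.

```python
from typing import List, Set, Tuple

Cell = Tuple[int, int]   # one grid cell (x, y) in the plane of the touch screen
Area = Set[Cell]         # an area estimate, approximated as a set of grid cells

def combine_estimates(min_estimates: List[Area],
                      max_estimates: List[Area]) -> Tuple[Area, Area]:
    """Acts (d) and (f): the object covers every per-transmitter minimum
    estimate, so the total minimum area is their union; it cannot extend
    beyond any maximum estimate, so the total maximum area is their
    intersection."""
    total_min: Area = set().union(*min_estimates)
    total_max: Area = set.intersection(*max_estimates)
    # act (g) starts from a minimum area clipped to lie inside the maximum:
    return total_min & total_max, total_max
```

The true object boundary then lies somewhere between the two returned areas; the midpoint construction described later in the text (FIGS. 6-9) is one way to pick it.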
- the light transmitters and receivers can be located in separate parallel planes in close proximity.
- the density of light transmitters and receivers is substantially increased thus providing for increased resolution and precision in defining the location, shape and size of the at least one object.
- specific types of photo-sensors may be employed to provide a capability for detecting the reflectivity or, conversely, the transmissivity of certain objects, thus providing additional information regarding the optical properties of the material constituting the object. For example, based on the detected differences in light transmission, reflection and absorption, the touch screen can distinguish between a person's hand, a stylus and a pawn used in an electronic board game.
- FIGS. 1 & 2 illustrate a snapshot of the touch screen display during a point in time at which the first and second light sources are switched on during a calibration mode
- FIGS. 3 & 4 illustrate a snapshot of the touch screen display during a point in time at which the first and second light sources are switched on during an operational mode
- FIG. 5 illustrates a snapshot that shows how minimum and maximum area estimates are being made using the calibration and non-calibration data
- FIGS. 6 - 9 illustrate how the minimum and maximum area estimates are combined to determine the total boundary area of an object
- FIG. 10 illustrates a snapshot of the touch screen display 10 in the operational mode during the turn-on time of a first corner light source L0 in the presence of two circular objects;
- FIG. 11 illustrates a snapshot of the touch screen display in the operational mode during the turn-on time of a second corner light source L4 in the presence of two circular objects;
- FIG. 12 illustrates how the minimum and maximum area estimates are calculated for the "optimized" approach
- FIGS. 13-15 illustrate snapshots of the touch screen display which illustrate the measurement of light reflection, absorption and transmission of one object
- FIG. 16 illustrates a touch screen having an oval shape, according to an embodiment of the invention
- FIGS. 17-21 illustrate how the difference in the object location on the touch screen can impact the precision of object location, shape and size detection; and FIGS. 22-25 illustrate an embodiment where different angular positions are selected for the light transmitters.
- the invention is described and illustrated herein in conjunction with a touch screen (i.e., a display with embedded touch sensing technology), the invention does not require the use of a display screen. Rather, the invention may be used in a standalone configuration without including a display screen.
- the use of the word 'touch screen' throughout this specification is intended to imply all other such XY implementations, applications, or modes of operation with or without a display screen.
- the invention is not restricted to using infrared light transmitters only. Any kind of light source, visible or invisible, can be used in combination with appropriate detectors. Using light transmitters that emit visible light can give an extra advantage in some cases since it provides visual feedback on the object placed within the touch screen. The visual feedback in such case is the light from the transmitters terminated by the object itself. As will be described in detail below, the switching order of the light transmitters may be different in different embodiments depending upon the intended application.
- Advantages of the detection method of the invention include, but are not limited to, simultaneous detection of multiple objects including, for example, a hand or hands, a finger or fingers belonging to a single and/or multiple users, thereby making the invention applicable to conventional touch screen applications in addition to the creation of new touch screen applications.
- the ability to detect hands and/or objects allows users to enter information such as size, shape and distance in a single user action, not achievable in the prior art.
- the ability to simultaneously detect multiple objects, hands and/or fingers on the touch screen allows multiple users to interact with the touch screen display at the same time, or a single user to interact with the touch screen display using two hands.
- the description includes an illustrative example of how calibration is performed and the calculation of an object boundary area in a non-calibration mode including the acts of computing minimum and maximum boundary area estimates.
- FIG. 1 illustrates an infrared optical touch screen display 10, according to one embodiment.
- the light transmitters and sensors are arranged in an alternating pattern (e.g., L0, S1, L1, S2, ..., L15, S11). It should be appreciated that the number and configuration of light transmitters and sensors may vary in different embodiments.
- the method to be described is generally comprised of two stages, a calibration stage and an operational stage.
- Calibration is performed to collect calibration data.
- Calibration data is comprised of sensor identification information corresponding to those sensors which detect a light beam transmitted from each of the respective light transmitters located on the periphery of the touch screen display 10 during a turn-on time of each light transmitter.
- the turn-on time is defined herein as the time during which light emanates from a respective light transmitter in a switched-on state. It should be appreciated that in order to obtain meaningful calibration data, it is required that no objects (e.g., fingers, stylus, etc.) interact with the transmission of the light beams during their respective turn-on times in the calibration mode.
- the light beam that is cast may be detected by certain of the sensors S0 - S11 located on the periphery of the touch screen display 10 and may not be detected by certain other sensors.
- the identification of the sensors S0 - S11 that detect the respective light transmitter's light beam is recorded as calibration data.
- the calibration data shown is recorded as a plurality of sequential record entries.
- Each record entry is comprised of three columns: a first column which identifies one of the light transmitters Li located on the periphery of the touch screen, a second column listing the sensors that are illuminated by the corresponding light transmitter (i.e., detect the light beam) during its respective turn-on time, and a third column listing the sensors that are not illuminated by the corresponding light source during its respective turn-on time.
- the data of the third column may be derived from the data of the second column as a corollary to the data in the second column.
- the non-illuminated sensors (column 3) may be derived as the difference between the original sensor set (S0, S1, ... S11) and the illuminated sensors (column 2).
- each of the respective light transmitters L0 - L15 located on the periphery of the touch screen display 10 is first switched to an off state. Thereafter, each of the light transmitters L0 - L15 is switched on and off for a pre-determined turn-on time. For example, light transmitter L0 is switched on first for a pre-determined turn-on time during which calibration data is collected. Light transmitter L0 is then turned off. Next, light transmitter L1 is switched on for a pre-determined time and calibration data is collected. Light transmitter L1 is then turned off. This process continues in a similar manner for each of the remaining light transmitters on the periphery of the touch screen, e.g., L2 - L15, the end of which constitutes the completion of calibration.
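An illustrative calibration loop, with names of my own choosing (not the patent's); read_lit_sensors() stands in for a real hardware driver that pulses one transmitter for its turn-on time and samples the sensors:

```python
ALL_SENSORS = {f"S{i}" for i in range(12)}       # S0 .. S11
TRANSMITTERS = [f"L{i}" for i in range(16)]      # L0 .. L15

def read_lit_sensors(tx: str) -> set:
    """Hardware stub: switch transmitter `tx` on for its turn-on time and
    return the set of sensors that detected its light."""
    raise NotImplementedError("replace with a real driver call")

def calibrate() -> dict:
    """Build Table I: for each transmitter, the illuminated sensors
    (column 2) and, as a corollary, the non-illuminated ones (column 3)."""
    table = {}
    for tx in TRANSMITTERS:                      # L0 first, then L1, ... L15
        lit = read_lit_sensors(tx)
        table[tx] = {"illuminated": lit,
                     "not_illuminated": ALL_SENSORS - lit}  # set difference
    return table
```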
- when each light transmitter L0 - L15 in the calibration sequence is turned on, a beam of light is transmitted having a characteristic two-dimensional spatial distribution in a plane of the touch screen display 10. It is well known that, depending upon the particular transmitter source selected for use, the spatial distribution of the emitted light beam will have a different angular width. Selecting a light transmitter having a light beam of a particular angular width may be determined, at least in part, by the intended application. That is, if the objects to be detected in a particular application are expected to be particularly large, having significant width, then light transmitters having a spatial distribution wider than the object itself are more appropriate for that application.
- FIGS. 1 and 2 correspond, respectively, to snapshots of light beams that are transmitted by the first and second light transmitters, L0 and L1, during their respective turn-on times during calibration.
- Fig. 1 corresponds to a snapshot of a light beam transmitted from light transmitter L0 during its respective turn-on time
- Fig. 2 corresponds to a snapshot of a light beam transmitted from light transmitter L1 during its respective turn-on time.
- FIG. 1 illustrates a snapshot of the touch screen display 10 during the turn-on time of the light transmitter L0.
- the light transmitter L0 shines a distinctive beam of light having a two-dimensional spatial distribution that defines a lit area in a plane of the touch screen.
- the area illuminated by the light transmitter L0 is considered to be comprised of three constituent regions, labeled as illuminated regions (IR-1), (IR-2) and (IR-3), respectively.
- IR-2: this region is defined as being bounded in the plane of the touch screen by the outermost sensors (S5 and S11) capable of detecting the light beam from the light transmitter L0.
- illuminated regions IR-1 and IR-3 also fall within the illuminated region of the plane of the touch screen, but are separately labeled because they both fall outside the region of detection of the outermost sensors (S5 and S11) capable of detecting the light beam from light source L0.
- the outermost sensor detection information, e.g., the sensor range (S5 - S11), is recorded as part of the calibration data (see the first row entry of Table I above, "outermost illuminated sensors").
- the calibration data may additionally include the identification of those sensors that do not detect the light from the light source L0, which in the instant example are defined by the sensor range S0 - S4, as a corollary to the detection information.
- FIG. 2 is an illustration of a snapshot of the touch screen display 10 during a point in time at which the next light source L1 in the sequence is switched on during calibration.
- the light source L1 shines a distinctive beam of light having a distinctive coverage pattern in the plane of interest based on its position on the periphery of the touch screen display 10.
- the area lit by the light source L1 may be considered to be comprised of three spatial regions, regions IR-1, IR-2 and IR-3, similar to that discussed above for light source L0.
- this region is bounded by the outermost sensors that detect the light beam from the light source L1, i.e., outermost sensors S4 and S11.
- Regions IR-1 and IR-3 fall within the lit area of the plane of the touch screen but fall outside the region of detection of the outermost sensors (S4 and S11) capable of detecting the light beam from L1.
- This sensor detection information is recorded as part of the calibration data (as shown in the second row entry of Table I above).
- the calibration data may additionally include the identification of those sensors that do not detect the light transmitted from the light transmitter L1, namely, sensor range S0 - S3.
- the calibration process continues in a similar manner for each of the remaining light transmitters located on the periphery of the touch screen, namely, the light transmitters L2 - L15.
- the calibration data is used together with non-calibration data acquired during an operational stage to detect the position, shape and size of one or more objects interacting with the touch screen display 10.
- the touch screen display 10 is ready for use to detect the position, shape and size of one or more objects interacting with the touch screen display 10.
- detection of the position, shape and size of one or more objects interacting with the touch screen display 10 is performed continuously over multiple cycles of operation.
- each of the light transmitters L0 - L15 illuminates in a pre-determined sequence constituting a single cycle of operation, which is repeated over multiple cycles of operation.
- a single cycle of operation in the operational stage starts with the light source L0 being turned on for a pre-determined turn-on time. After L0 turns off, light source L1 is turned on for a pre-determined turn-on time. This process continues in a similar manner for each light transmitter and ends with light transmitter L15, the last light transmitter in the sequence.
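Continuing the calibration sketch above (same hypothetical names), one operational cycle can record the Table II data as, per transmitter, the sensors that were illuminated during calibration but now detect an absence of light:

```python
def operational_cycle(table: dict) -> dict:
    """One cycle of operation: pulse each transmitter in sequence and record
    the shadowed sensors, i.e. those that saw this transmitter's light during
    calibration but detect an absence of light now."""
    shadowed = {}
    for tx in TRANSMITTERS:
        lit_now = read_lit_sensors(tx)
        shadowed[tx] = table[tx]["illuminated"] - lit_now
    return shadowed
```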
- FIGS. 3 and 4 illustrate two steps of a single cycle of operation in the operational mode, for the presently described exemplary embodiment.
- Figs. 3 and 4 illustrate a snapshot of light beams transmitted from light transmitters L0 and L1, respectively, in the presence of a single circular object 16.
- a single circular object 16 is selected for simplicity to illustrate the operational stage.
- Fig. 3 illustrates a snapshot of the touch screen display 10 in the operational mode during the turn-on time of the light transmitter L0 in the presence of the circular object 16.
- the light transmitter shines a distinctive beam of light having a two-dimensional coverage pattern in a plane of the touch screen display 10.
- the light distribution pattern of the light transmitter L0 is considered to be comprised of two regions, a first illuminated region labeled Y1 and a second non-illuminated (shadow) region labeled X1.
- the illuminated region Y1 defines an area that is not subjected to the shadow cast by the circular object 16 when illuminated by the light transmitter L0.
- the non-illuminated (shadow) region X1 identifies an area that is subjected to the shadow cast by the circular object 16 when illuminated by the light transmitter L0.
- the non-illuminated (shadow) region X1 includes sensors S6 and S7 on the touch screen display 10, which detect an absence of light during the turn-on time of the light source L0. This sensor information is recorded as part of the non-calibration data for the current cycle of operation for the present position of the circular object 16, as shown in Fig. 3.
- the next light source in the sequence, L1, is turned on for its pre-determined turn-on time.
- in Fig. 4 it is shown that light transmitter L1 shines a distinctive beam of light having a two-dimensional coverage pattern on the touch screen display 10.
- the light distribution pattern of the light transmitter L1 is considered to be comprised of two regions, an illuminated region labeled Y2 and a non-illuminated (shadow) region labeled X2.
- the illuminated region Y2 defines an area that is not subjected to the shadow cast by the circular object 16 when illuminated by the light transmitter L1.
- the non-illuminated (shadow) region X2 identifies an area that is subjected to the shadow cast by the circular object 16 when illuminated by the light transmitter L1.
- the illuminated region Y2 includes all sensors except sensor S10.
- the non-illuminated (shadow) region X2 includes only sensor S10 on the touch screen display 10, which detects an absence of light during the turn-on time of the light transmitter L1. This sensor information is recorded as part of the non-calibration data for the current cycle of operation for the present position of the circular object 16, as shown in Fig. 4.
- Table II illustrates, by way of example, for the present illustrative embodiment, the non-calibration data that is recorded over a single cycle of operation in the presence of the circular object 16 for light sources L0 - L2.
- Table II only shows non-calibration data for three of the sixteen light transmitters, for a single cycle of operation.
- the operational mode is comprised of multiple cycles of operation. Multiple cycles are required not only to detect changes in the location, size and shape of objects on the screen from one point in time to the next, but also to detect the addition of new objects or the removal of objects already present.
- minimum and maximum area estimates are made for the detected objects.
- the estimates are stored in a data repository for later recall in detecting an object boundary area.
- the minimum and maximum area estimates are retrieved from the data repository and combined in a manner to be described below to determine an object boundary area for each detected object in the plane of the touch screen.
- a derivation of the minimum and maximum area estimates for light transmitter L0 is illustrated.
- the previously collected calibration data and non-calibration data is used to assist in the computation.
- the calibration data for light transmitter L0 was found to be the range of illuminated sensors (S5 - S11).
- This sensor range constitutes those sensors capable of detecting a presence of light from the light transmitter L0 during calibration (as shown in the first row of Table I).
- the non-calibration data for light transmitter L0 in the presence of the circular object 16 was found to be the sensor ranges (S0 - S4) and (S6 - S7) detecting an absence of light (as shown in Table II above and illustrated in Fig. 3).
- Fig. 5 illustrates that the circular object 16 blocks the light path between the light source L0 and sensor S6 (see dashed line P5) and is also shown to be blocking the light path between the light transmitter L0 and sensor S7 (see dashed line P6).
- FIG. 5 further illustrates that the object 16 does not block the light paths between the light transmitter L0 and the sensors S5 (line P1) and S8 (line P2).
- This information, derived from the calibration and non-calibration data, is summarized in Table III and used to determine the minimum and maximum area estimates for the object 16.
- a minimum area estimate can be determined as follows.
- the circular object 16 blocks the light path between the light source L0 and sensors S6 (see line P5) and S7 (see line P6). Therefore, the minimum area estimate of object 16, labeled MIN, during the turn-on time of light source L0 is defined by the triangle shown in FIG. 5 defined by points (L0, S7, S6), having two sides defined by the lines P5 and P6.
- a maximum area estimate of object 16, labeled MAX, for light transmitter L0 may be defined in a similar manner.
- the maximum area estimate is defined by points (L0, S5, C2, S8). This area is derived by including the sensors S5 and S8 adjacent to the shadow area detected with the sensors S6 - S7. It should be noted here that the area includes corner C2 because the line between S5 and S8 must follow the boundary of the screen.
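If the transmitter, sensor and corner positions are known as plane coordinates, the MIN triangle and MAX polygon can be evaluated with the shoelace formula. The coordinates below are purely illustrative (the patent gives none): L0 on the left edge, shadowed sensors S6 and S7 on the right edge, their neighbours S5 (right edge) and S8 (bottom edge), and C2 the lower-right screen corner.

```python
def polygon_area(vertices):
    """Area of a simple polygon via the shoelace formula."""
    area = 0.0
    for i, (x0, y0) in enumerate(vertices):
        x1, y1 = vertices[(i + 1) % len(vertices)]
        area += x0 * y1 - x1 * y0
    return abs(area) / 2.0

L0, S5, S6, S7, S8, C2 = (0, 3), (10, 6), (10, 4), (10, 2), (8, 0), (10, 0)
min_estimate = polygon_area([L0, S7, S6])      # triangle (L0, S7, S6)
max_estimate = polygon_area([L0, S5, C2, S8])  # polygon (L0, S5, C2, S8)
```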
- the minimum and maximum area estimates are stored in a data repository for each light transmitter for the current cycle of operation.
- the process of determining a minimum and maximum area continues in a similar manner for each of the remaining light transmitters L2 - L15.
- the minimum and maximum area results are preferably stored in the data repository as geometrical coordinates, such as, for example, the coordinates of the min and max area vertices or the coordinates of the lines corresponding to area facets.
- the stored minimum and maximum area estimates are retrieved from the data repository and combined to determine the object boundary area of object 16, as described below.
- the method by which the minimum and maximum area estimate results are combined to determine an object boundary area may be performed in accordance with one embodiment, as follows.
- the total minimum area result $A_{Total,min}$ is intersected with the maximum area result $A_{Total,max}$ to ensure that the minimum area is completely inside the maximum area. In other words, any portion of the minimum area that falls outside the boundary of the computed maximum area will be ignored. Because not all snapshots result in sufficient input for minimum and maximum area calculations, it is possible that part of the minimum area will fall outside the maximum area. For example, in a situation where the maximum area estimate for a particular light transmitter results in a snapshot that is bounded by only two sensors, the minimum area will be empty; that light transmitter will then only produce input for the maximum area calculation.
- $A_{Total,min}$ and $A_{Total,max}$ can each contain several sub-areas that fall under the definition of a closed set, indicating that there are several objects present. Closed sets are described in greater detail in Eric W. Weisstein, "Closed Set," from MathWorld - A Wolfram Web Resource, http://mathworld.wolfram.com/ClosedSet.html.
- the area $A_{Total,min}$ can be divided into several sub-areas $A_{Total,min,j}$, and likewise $A_{Total,max}$ can be divided into several sub-areas $A_{Total,max,j}$, in such a way that every $A_{Total,min,j}$ and $A_{Total,max,j}$ is a closed set that corresponds to a particular object $j$.
- the boundary area of object $j$ can be defined as $A_{Total,j} = F(A_{Total,min,j}, A_{Total,max,j})$, where $F$ is the function or method of finding $A_{Total,j}$.
- FIG. 6 illustrates a method for combining the minimum, $A_{Total,min,j}$, and maximum, $A_{Total,max,j}$, areas to approximate the actual boundary of an object, using lines drawn outward from the object's center of gravity.
- each line will intersect the border of the maximum area (I) and the border of the minimum area (II).
- line L1 intersects the border of the minimum area (II) at point P2 and further intersects the border of the maximum area (I) at point P1.
- points P1 and P2 are shown connected by a line segment 45, bifurcated at its midpoint 62 into two equal-length line segments S1 and S2. This process is repeated for each line. Line segments 55 are then drawn that connect all the middle points of adjacent line segments.
- Fig. 9 illustrates a boundary area, defined by a boundary border 105, that is formed as a result of connecting all of the midpoints of the adjacent line segments. This boundary area essentially forms the approximated boundary of the object.
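A sketch of this midpoint construction, under the assumption that the caller supplies two functions giving the distance from the center of gravity to the minimum-area border and to the maximum-area border along a ray at angle theta (all names are illustrative):

```python
import math

def approximate_boundary(center, min_border_dist, max_border_dist, n_rays=36):
    """Cast `n_rays` rays from the object's centre of gravity; for each ray,
    take the midpoint between its crossing of the minimum-area border (P2)
    and its crossing of the maximum-area border (P1). Connecting the returned
    midpoints in order yields the boundary border."""
    cx, cy = center
    pts = []
    for k in range(n_rays):
        theta = 2.0 * math.pi * k / n_rays
        r = 0.5 * (min_border_dist(theta) + max_border_dist(theta))
        pts.append((cx + r * math.cos(theta), cy + r * math.sin(theta)))
    return pts
```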
- Reference points other than the center of gravity of an object may also be derived, such as, for example, the top left corner of an object or a bounding box.
- the shape of the detected object is given by the shape of $A_{Total,j}$.
- the shape being detected is the convex hull shape of the object on the screen that excludes internal cavities of an object if those are present.
- the object's size can be calculated in different ways for different geometrical figures. However, for any geometrical figure, the maximum size of the figure along the two axes, x and y, i.e., Max_x and Max_y, may be determined. In most cases, the detected geometrical figure is a polygon, in which case Max_x can be defined as the maximum cross section of the resulting polygon taken along the x-axis and Max_y as the maximum cross section of the same polygon along the y-axis.
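For the convex shapes this method produces (the convex hull mentioned above), the maximum cross sections along the axes equal the bounding-box extents, so a minimal sketch is:

```python
def max_cross_sections(polygon):
    """Max_x and Max_y of a detected polygon: for convex shapes these equal
    the bounding-box extents along the x- and y-axes."""
    xs = [x for x, _ in polygon]
    ys = [y for _, y in polygon]
    return max(xs) - min(xs), max(ys) - min(ys)

# e.g. a square of side ~1.4 placed diagonally reports its bounding box:
print(max_cross_sections([(0, 1), (1, 2), (2, 1), (1, 0)]))  # (2, 2)
```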
- Another method for determining the size of an object is by providing a unique definition of size for a number of common geometrical shapes. For example, defining the size of a circle as its diameter, defining the size of a square as the length of one of its sides and defining the size of a rectangle as its length and width.
- the present invention provides techniques for the detection of one or more objects based on the object's size and/or shape. Accordingly, for those applications that utilize objects of different sizes and/or shapes, the invention provides an additional capability of performing object recognition based on the object's detected size and/or shape.
- Techniques for performing object recognition include utilizing a learning mode.
- in a learning mode, a user places objects on the surface of the touch screen, one at a time.
- the shape of the object placed on the surface of the touch screen is detected in the learning mode and object parameters including shape and size are recorded.
- in the operational mode, whenever an object is detected, its shape and size are analyzed to determine whether they match the shape and size of one of the learned objects, given an admissible deviation delta defined by the application. If the determination results in a match, the object can be successfully identified.
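A minimal matching sketch, assuming the learned parameters are reduced to a (width, height) pair and the admissible deviation delta is relative; names and values are illustrative only:

```python
def match_object(detected_size, learned, delta=0.10):
    """Compare a detected object's size against each learned object's
    recorded size, accepting a relative deviation of at most `delta`."""
    w, h = detected_size
    for name, (lw, lh) in learned.items():
        if abs(w - lw) <= delta * lw and abs(h - lh) <= delta * lh:
            return name
    return None          # no learned object within the admissible deviation

learned = {"pawn": (1.0, 1.0), "hand": (8.0, 15.0)}  # from the learning mode
print(match_object((1.05, 0.97), learned))           # -> pawn
```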
- Examples of object recognition include recognition of pawns of a board game, each with a different shape, or recognition of a user's hand when placed on the touch screen.
- the standard shape parameters may be provided to the control software, so that when a similar object form is detected it can be recognized as such by the system.
- switching schemes are contemplated for switching the light transmitters on and off.
- a few exemplary switching schemes are described below. It is noted, however, that the described schemes are merely illustrative. The astute reader will recognize that there are many variants to the schemes described below.
- in a 'plain' switching scheme, each light transmitter (e.g., L0 - L15) is switched on and off in turn.
- the sequence can be initiated with any light transmitter. Further, once initiated, the sequence can proceed in either a clockwise or counterclockwise direction.
- Another switching scheme which produces, in most cases, the most information about objects present on the screen early in the operational stage is referred to herein as an 'optimized' switching scheme.
- certain of the light transmitters are uniquely positioned in the corners of the touch screen and are directed towards the middle of the touch screen. This is a desirable positioning and orientation because a corner light transmitter lights up the entire touch screen and thus provides maximum information.
- the non-corner light sources, by comparison, only illuminate a part of the touch screen, thereby providing information over only a portion of the touch screen.
- the inventors have recognized that if the light sources which are most likely to produce the most information (i.e., the corner light sources) are used first, more information would be available at an earlier stage of the detection process.
- FIG. 10 illustrates a snapshot of the touch screen display 10 in the operational mode during the turn-on time of a first corner light source L0 in the presence of two circular objects 20 and 21.
- the light transmitters L0, L4, L7 and L11 in each of the respective corners of the touch screen 10 are oriented towards the center of the touch screen 10.
- the light source L0, by virtue of its strategic orientation and its being a corner light transmitter, is capable of detecting both objects 20, 21.
- the optimized scheme can be started by switching on any of the corner light transmitters (e.g., L0, L4, L7, L11), since they would each produce an equal amount of information.
- the light emanating from the transmitter L0 positioned in a 'normal' orientation along the frame edge only covers a portion of the touch screen labeled IR1, IR2 and IR3 and does not cover the remaining portion of the touch screen 10 shown in white.
- the light emanating from the transmitter L0 oriented towards the center of the touch screen 10 and positioned in the corner advantageously covers the entire screen by virtue of its orientation and position, including the white areas not covered in Fig. 1.
- Fig. 11 illustrates the result of turning on the light transmitter L4 in the sequence after switching off L0.
- L4 is located in the upper right corner of the touch screen 10 and emits light over the whole area of touch screen 10. As such, it is capable of detecting both objects 20, 21.
- light transmitters L11 and L7 may be employed in addition to light transmitters L0 and L4.
- minimum and maximum area estimates are calculated after light transmitter L4 is switched off, the result of which is illustrated in FIG. 12. Two areas are shown, the boundaries of which are roughly known, as indicated by the darkly shaded gray regions with four vertices around both objects 20 and 21.
- certain of the remaining light transmitters may be strategically selected to produce maximum information to further refine the area boundaries.
- the particular light transmitters selected can differ in different embodiments.
- the next light transmitters that can be turned on are light transmitters L1 and L13 for the area on the left of the touch screen 10 and light transmitters L5 and L8 for the area on the right of the touch screen 10.
- the 'optimized' approach allows fewer transmitters to be switched on/off in each cycle as compared to the 'plain' scheme.
- One possible advantage of the present scheme is that results can be produced earlier and more efficiently than in the previously described schemes, resulting in a faster response and thus possible energy saving in comparison to the 'Plain' scheme.
- the interactive scheme utilizes a strategy for switching on light transmitters based on previous detection results. Specifically, knowing the position (x, y) of an object in a previous detection cycle (or sample time) allows the light switching scheme to be adapted to target that same area in subsequent detection cycles. To account for the rest of the screen area, a simple check can be performed to ensure that no other new objects are present. This scheme is based on the assumption that an object does not substantially change its position in a fraction of a second, from one detection cycle to the next, partly due to slow human reaction times compared to the sample times of the hardware.
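One way such an interactive schedule could be built, as a sketch: `transmitters_near(pos)` is an assumed helper mapping a previously detected position to the transmitters whose beams cover it, and the corner transmitters serve as the simple check for new objects.

```python
def interactive_schedule(last_positions, transmitters_near, corner_txs):
    """Adapt the switching order to the previous cycle: pulse the
    transmitters covering last cycle's object positions first, then a
    small check set (here the corners) to catch newly placed objects."""
    order = []
    for pos in last_positions:
        for tx in transmitters_near(pos):
            if tx not in order:
                order.append(tx)
    order += [tx for tx in corner_txs if tx not in order]
    return order
```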
- One possible advantage of the interactive switching scheme is that results can be produced earlier and more efficiently than in the previously described schemes, resulting in a faster response and thus possible energy saving in comparison to the 'Plain' scheme.
- the various switching schemes can be chosen to satisfy the specific requirements for a particular intended application.
- two applications are listed in table IV, (i.e., interactive cafe table and chess game) each requiring a different switching scheme to account for the specific requirements of the particular application.
- the 'optimized' switching scheme may also be applicable to both applications in that they both require fast response times (see characteristic 5).
- the touch screen 10 can switch into an energy saving mode thereby reducing processing power requirements and saving on total power consumption.
- the number of light transmitters and sensors used in each cycle are reduced while maintaining or reducing the cycle frequency (number of cycles per second). This results in a lower total 'on time' of the light transmitters per cycle, which results in a lower power consumption.
- because the number of lights being switched on and off per second is reduced, the required processing power of the system will be reduced as well.
- the touch frame can switch back to a normal switching scheme.
- FIGS. 13- 15 illustrate another aspect of the invention, which considers object identification based on an object's optical properties (i.e., light absorption, reflection and transmission).
- the measurement of the light absorption of an object as well as the light reflection and transmission of the object is taken into account.
- the object being detected is assumed to absorb 100% of the impinging light from a light transmitter.
- depending on the optical properties of the material that an object is made of, the light that reaches the surface of the object is partly reflected, partly absorbed and partly transmitted by the object.
- the amount of light reflected, transmitted (i.e., pass through) and absorbed depends on the optical properties of the material of the object and is different for different materials. As a consequence, due to these physical phenomena, two objects of identical shape but made of different materials (e.g. glass and wood) can be distinguished if differences can be detected in the amount of light reflected, absorbed and transmitted by the objects.
- FIG. 13 illustrates a case where less than 100% of the light that reaches the object's surface gets absorbed by the object 33. That is, the light generated by the light transmitter L0 is partly absorbed and partly reflected by the object 33.
- this results in sensors S0 - S4 on the touch screen 10 detecting some light that they would not detect otherwise (i.e., when there is no object present).
- the distribution of signal detected by sensors S0 - S4 is not necessarily uniform, meaning that some sensors can detect slightly more light than others.
- the level of light detected by the sensors will depend on a number of factors, such as the distance between the object and a sensor, the shape of the object, reflections caused by other objects, etc.
- sensors S6 and S7, by virtue of their being subjected to the shadow of the object, do not detect any signal.
- FIG. 14 illustrates a case where 100% of the light that reaches the object's surface gets absorbed by the object 33.
- sensors S6 and S7 do not detect any signal by virtue of their being subjected to the shadow of the object.
- this case differs from the partial absorption case in that sensors S0 - S4 also do not detect any signal, due to the total absorption of light by object 33.
- sensors (S0 - S4) and (S6 - S7) may detect some external noise generated by external light sources that would normally be negligible.
- FIG. 15 illustrates a case where the light generated by the light transmitter L 0 is partly absorbed and partly transmitted by the object 33. This leads to sensors S6 and S7 detecting some light.
- objects of identical shape and size can still differ with regard to their optical characteristics. These differences will cause objects to absorb, reflect and transmit (i.e., pass through) different amounts of light emitted from a light transmitter. It should be appreciated that according to an advantageous aspect, because the amount of light reflected and transmitted can be detected, as was shown in the examples above, objects of identical size and shape can be distinguished if they are made of materials with different optical properties.
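A crude sketch of how these three cases might be discriminated. The two inputs are normalised readings I am assuming for illustration: the fraction of the calibration-level light still reaching the shadowed sensors (transmission, Fig. 15) and the extra light reaching sensors near the transmitter (reflection, Fig. 13); the thresholds are arbitrary examples, not values from the patent.

```python
def classify_material(transmission_ratio, reflection_ratio):
    """Distinguish the three cases of FIGS. 13-15 from normalised readings."""
    if transmission_ratio < 0.05 and reflection_ratio < 0.05:
        return "fully absorbing (Fig. 14)"
    if transmission_ratio >= 0.05:
        return "partly transparent, e.g. glass (Fig. 15)"
    return "opaque but reflective (Fig. 13)"

print(classify_material(0.40, 0.10))  # -> partly transparent, e.g. glass (Fig. 15)
```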
- D. Detection of optical properties for multiple objects: here, the simultaneous detection of the optical properties of two or more objects is considered.
- two or more objects can have different shapes and sizes which would make the light distribution pattern detected by the sensors rather complex if it is desired to take into account the optical properties of the objects.
- pattern recognition techniques could be applied to classify objects with respect to the optical properties such as reflectivity, absorption and transmissivity of the material they are made of.
- FIG. 16 illustrates one embodiment where the touch screen 10 has an oval shape. Shapes other than a rectangular shape (e.g., circular) can be used as long as there are enough intersecting areas between the light transmitters and the sensors to meet the desired accuracy in location, shape and size detection. This is in contrast with prior art touch screen detection techniques which in most cases require a rectangular frame.
- the accuracy in determining the position, shape and size of an object is subject to uncertainty.
- the uncertainty may be partially minimized by increasing the number of sensors used in the touch screen display 10. By increasing the number (density) of sensors, the relative spacing between the sensors decreases accordingly which leads to a more accurate calculation of the position, shape and size of an object.
- the number of transmitters may be increased which also leads to a more accurate calculation of the position, shape and size of an object. It is noted that increasing the number of transmitters will highlight the object from additional angles thus providing additional information leading to more accurate results.
- the overall measurement accuracy may be increased by increasing the density of transmitters and/or receivers in certain areas of the screen where detection proves to be less accurate than other areas. This non-even configuration of transmitters and/or receivers can compensate for the less accurate detection. Overall measurement accuracy may suffer in certain situations dependent upon the position of the object on the touch screen. As such, differences in resolution and precision in detecting the location, shape and size of the object may occur.
- FIG. 17 illustrates the first situation, where a circular object 24 having diameter d is positioned in the center of the screen 10 and transmitter L10 is switched on. This results in a shadow having a length close to 2d on the opposite side of the screen 10. The shadow will be detected by the two sensors S1 and S2, provided that the distance between those two sensors is smaller than the length of the shadow.
- FIG. 18 illustrates the second situation, where the same object 24 is placed close to the upper edge of the touch screen 10 and LED L10 is switched on. As shown, the shadow cast by the object on the opposite side of the screen is only slightly longer than d, meaning that neither of the two sensors S1 and S2 will be able to detect any shadow. Comparing this situation with the first situation, where the object 24 is in the center of the screen, in the current scenario the other transmitters L0, L1, L3 and L4 will not provide any information, whereas in the first case (i.e., "object situated in the center") the transmitters L0, L1, L3 and L4 would provide substantial information.
- FIG. 19 illustrates that for the second situation, the only light transmitters that are capable of detecting the object are light transmitters L6 and L14.
- FIG. 20 illustrates that in the second situation (i.e., 'close to the edge') information is only provided by the light transmitters L6, L14 and L2.
- FIG. 21 illustrates an even more extreme situation (i.e., the third situation) where the same object 24 is now placed in the upper left corner of the touch screen 10.
- when the light transmitter L10 is switched on during its turn-on time, it results in shadows along the two edges of the corner, both having a length of approximately d. This shadow cannot be detected by any of the touch screen sensors. Considering what can be detected in this situation by sequentially switching one LED on and off after another, it becomes clear that only the blocking of the L0 and L15 transmitters can be detected, as shown in FIG. 21.
- the calculation of the maximum area in this case gives an even less precise estimation of the position, size and shape of the object compared to the two previous cases, 'in the middle' and 'close to the edge'.
- FIGS. 22-25 illustrate another embodiment where different angular positions are selected for the light transmitters.
- the light transmitters in certain embodiments can be oriented in a non-perpendicular orientation to the edge of the touch screen display 10.
- the angle α indicates an angular measure between an edge of the screen and the axis of one of the light transmitters (e.g., L0), and the angle β indicates the angular width of the light beam emitted from the light transmitter L0.
- in Fig. 23, certain of the light transmitters are positioned in the corner areas of the touch screen display 10 and are rotated (angularly directed) towards the middle of the touch screen display, so that the light beam lights up the total screen area. It should be appreciated that by rotating the light transmitters in the corner areas, the efficiency of the rotated light transmitters is increased. It should also be noted that the angular rotations are fixed in the touch screen display 10 and cannot be re-oriented thereafter. In a further embodiment of the present invention, a combination of different light transmitters may be used in the same application.
- transmitters having light beams of different angular widths may be combined. For example, transmitters used in the corners of a rectangular screen would optimally have a 90-degree light beam, since light emitted outside this angle would not be used. Other transmitters of the same touch screen, however, can emit a wider light beam.
- a large flat area, e.g., a table or a wall surface, with a touch screen as input device could be used to display a game for one or more users.
- the user can use more than one interaction point, (e.g. both hands) or the user can place tangible objects (e.g. pawns) on the surface. In such case the location of multiple touch points and multiple tangible objects can be detected and if necessary identified.
- This type of application can use the input of single or multiple users to make a drawing.
- One type of drawing application is a finger-painting application for children, where they can draw with their fingers or other objects, such as brushes, on a large touch screen. Multiple children can draw at the same time, together or each using their own private part of the screen.
- the method of the invention offers an alternative solution to this problem because it provides a capability for distinguishing between a hand and a stylus based on the shape and multiple touch points detected.
- Gestures can be a powerful way of interacting with systems.
- most gestures come from a screen, tablet or other input device with a single input point. This enables only a limited set of gestures that are built up from a sequential set of single lines or curves.
- the present invention also allows for gestures that consist of multiple lines and curves that are drawn simultaneously, or even enabling symbolic gestures by detecting the hand shape. This allows for more freedom in interaction styles, because more information can be conveyed to the system in a single user action.
- An example gesture consisting of multiple input points is, e.g. two fingers closely placed together on a screen and moving them apart in two different directions.
- the example gesture can, for instance, be interpreted as 'enlarge the window on screen to this new size relative to the starting point (of the gesture)' in a desktop environment, or 'zoom in on this picture at the position of the starting point (of the gesture), with the zoom factor relative to the distance both fingers have traveled across the screen' in a picture viewer application.
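A minimal sketch of the picture-viewer interpretation, assuming the touch screen reports the two finger positions at the start and end of the gesture (names and the linear zoom mapping are my own, not from the patent):

```python
import math

def zoom_factor(start_pts, end_pts):
    """Two-finger 'move apart' gesture: the zoom factor is taken relative
    to how far apart the two fingers have travelled across the screen."""
    (a0, b0), (a1, b1) = start_pts, end_pts
    d_start = math.hypot(b0[0] - a0[0], b0[1] - a0[1])
    d_end = math.hypot(b1[0] - a1[0], b1[1] - a1[1])
    return d_end / d_start if d_start else 1.0

print(zoom_factor(((0, 0), (1, 0)), ((0, 0), (3, 0))))  # -> 3.0
```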
- the user interaction styles (techniques) enabled by the described touch screen include: • Input of a single touch point like in traditional touch screens
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Position Input By Displaying (AREA)
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2008500329A JP2008533581A (en) | 2005-03-10 | 2006-03-08 | System and method for detecting position, size and shape of multiple objects interacting with a touch screen display |
EP06711053A EP1859339A2 (en) | 2005-03-10 | 2006-03-08 | System and method for detecting the location, size and shape of multiple objects that interact with a touch screen display |
US11/908,032 US20090135162A1 (en) | 2005-03-10 | 2006-03-08 | System and Method For Detecting the Location, Size and Shape of Multiple Objects That Interact With a Touch Screen Display |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US66036605P | 2005-03-10 | 2005-03-10 | |
US60/660,366 | 2005-03-10 |
Publications (2)
Publication Number | Publication Date |
---|---|
WO2006095320A2 true WO2006095320A2 (en) | 2006-09-14 |
WO2006095320A3 WO2006095320A3 (en) | 2007-03-01 |
Family
ID=36607433
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/IB2006/050728 WO2006095320A2 (en) | 2005-03-10 | 2006-03-08 | System and method for detecting the location, size and shape of multiple objects that interact with a touch screen display |
Country Status (6)
Country | Link |
---|---|
US (1) | US20090135162A1 (en) |
EP (1) | EP1859339A2 (en) |
JP (1) | JP2008533581A (en) |
KR (1) | KR20070116870A (en) |
CN (1) | CN101137956A (en) |
WO (1) | WO2006095320A2 (en) |
Families Citing this family (129)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9213443B2 (en) | 2009-02-15 | 2015-12-15 | Neonode Inc. | Optical touch screen systems using reflected light |
US9052771B2 (en) | 2002-11-04 | 2015-06-09 | Neonode Inc. | Touch screen calibration and update methods |
US8674966B2 (en) * | 2001-11-02 | 2014-03-18 | Neonode Inc. | ASIC controller for light-based touch screen |
US9471170B2 (en) | 2002-11-04 | 2016-10-18 | Neonode Inc. | Light-based touch screen with shift-aligned emitter and receiver lenses |
US9778794B2 (en) | 2001-11-02 | 2017-10-03 | Neonode Inc. | Light-based touch screen |
US8902196B2 (en) * | 2002-12-10 | 2014-12-02 | Neonode Inc. | Methods for determining a touch location on a touch screen |
US7629967B2 (en) | 2003-02-14 | 2009-12-08 | Next Holdings Limited | Touch screen signal processing |
US8508508B2 (en) | 2003-02-14 | 2013-08-13 | Next Holdings Limited | Touch screen signal processing with single-point calibration |
US8456447B2 (en) | 2003-02-14 | 2013-06-04 | Next Holdings Limited | Touch screen signal processing |
US7538759B2 (en) | 2004-05-07 | 2009-05-26 | Next Holdings Limited | Touch panel display system with illumination and detection provided from a single edge |
US7995039B2 (en) * | 2005-07-05 | 2011-08-09 | Flatfrog Laboratories Ab | Touch pad system |
US8013845B2 (en) * | 2005-12-30 | 2011-09-06 | Flatfrog Laboratories Ab | Optical touch pad with multilayer waveguide |
US8094136B2 (en) * | 2006-07-06 | 2012-01-10 | Flatfrog Laboratories Ab | Optical touchpad with three-dimensional position determination |
US8031186B2 (en) * | 2006-07-06 | 2011-10-04 | Flatfrog Laboratories Ab | Optical touchpad system and waveguide for use therein |
US9317124B2 (en) * | 2006-09-28 | 2016-04-19 | Nokia Technologies Oy | Command input by hand gestures captured from camera |
KR100782431B1 (en) * | 2006-09-29 | 2007-12-05 | 주식회사 넥시오 | Multi position detecting method and area detecting method in infrared rays type touch screen |
US9063617B2 (en) * | 2006-10-16 | 2015-06-23 | Flatfrog Laboratories Ab | Interactive display system, tool for use with the system, and tool management apparatus |
US20080189046A1 (en) * | 2007-02-02 | 2008-08-07 | O-Pen A/S | Optical tool with dynamic electromagnetic radiation and a system and method for determining the position and/or motion of an optical tool |
US8115753B2 (en) | 2007-04-11 | 2012-02-14 | Next Holdings Limited | Touch screen system with hover and click input methods |
FR2915591A1 (en) * | 2007-04-27 | 2008-10-31 | Thomson Licensing Sas | METHOD FOR DETECTING A FLEXION EXERCISED ON A FLEXIBLE SCREEN, AND APPARATUS PROVIDED WITH SUCH A SCREEN FOR CARRYING OUT THE METHOD |
US8065624B2 (en) * | 2007-06-28 | 2011-11-22 | Panasonic Corporation | Virtual keypad systems and methods |
US7911453B2 (en) * | 2007-06-29 | 2011-03-22 | Microsoft Corporation | Creating virtual replicas of physical objects |
US8432377B2 (en) | 2007-08-30 | 2013-04-30 | Next Holdings Limited | Optical touchscreen with improved illumination |
KR20100075460A (en) | 2007-08-30 | 2010-07-02 | 넥스트 홀딩스 인코포레이티드 | Low profile touch panel systems |
US8139110B2 (en) * | 2007-11-01 | 2012-03-20 | Northrop Grumman Systems Corporation | Calibration of a gesture recognition interface system |
US20130217491A1 (en) * | 2007-11-02 | 2013-08-22 | Bally Gaming, Inc. | Virtual button deck with sensory feedback |
US8405636B2 (en) | 2008-01-07 | 2013-03-26 | Next Holdings Limited | Optical position sensing system and optical position sensor assembly |
US20090213093A1 (en) * | 2008-01-07 | 2009-08-27 | Next Holdings Limited | Optical position sensor using retroreflection |
AU2009205567B2 (en) * | 2008-01-14 | 2014-12-11 | Avery Dennison Corporation | Retroreflector for use in touch screen applications and position sensing systems |
US20090256811A1 (en) * | 2008-04-15 | 2009-10-15 | Sony Ericsson Mobile Communications Ab | Optical touch screen |
US20090278794A1 (en) * | 2008-05-09 | 2009-11-12 | Smart Technologies Ulc | Interactive Input System With Controlled Lighting |
JP5448370B2 (en) * | 2008-05-20 | 2014-03-19 | キヤノン株式会社 | Information processing apparatus, control method therefor, program, and storage medium |
US8248691B2 (en) * | 2008-05-30 | 2012-08-21 | Avery Dennison Corporation | Infrared light transmission film |
US8676007B2 (en) * | 2008-06-19 | 2014-03-18 | Neonode Inc. | Light-based touch surface with curved borders and sloping bezel |
US9063615B2 (en) * | 2008-08-07 | 2015-06-23 | Rapt Ip Limited | Detecting multitouch events in an optical touch-sensitive device using line images |
CN102177493B (en) | 2008-08-07 | 2014-08-06 | 拉普特知识产权公司 | Optical control systems with modulated emitters |
US8540569B2 (en) * | 2008-09-05 | 2013-09-24 | Eric Gustav Orlinsky | Method and system for multiplayer multifunctional electronic surface gaming apparatus |
KR20100031204A (en) * | 2008-09-12 | 2010-03-22 | 삼성전자주식회사 | Input device based on a proximity sensor and operation method using the same |
TWI402793B (en) * | 2008-10-01 | 2013-07-21 | Quanta Comp Inc | Calibrating apparatus and method for image processing apparatus |
US8289288B2 (en) | 2009-01-15 | 2012-10-16 | Microsoft Corporation | Virtual object adjustment via physical object detection |
US9063614B2 (en) | 2009-02-15 | 2015-06-23 | Neonode Inc. | Optical touch screens |
JP4683135B2 (en) * | 2009-03-04 | 2011-05-11 | エプソンイメージングデバイス株式会社 | Display device with position detection function and electronic device |
US8502803B2 (en) * | 2009-04-07 | 2013-08-06 | Lumio Inc | Drift compensated optical touch screen |
EP2443481B1 (en) * | 2009-06-18 | 2021-05-19 | Baanto International Ltd. | Systems and methods for sensing and tracking radiation blocking objects on a surface |
CN101957690B (en) * | 2009-07-16 | 2012-07-04 | 瑞鼎科技股份有限公司 | Optical touch device and operation method thereof |
TWI490751B (en) * | 2009-08-04 | 2015-07-01 | 瑞鼎科技股份有限公司 | Optical touch apparatus |
US8179376B2 (en) * | 2009-08-27 | 2012-05-15 | Research In Motion Limited | Touch-sensitive display with capacitive and resistive touch sensors and method of control |
US7932899B2 (en) * | 2009-09-01 | 2011-04-26 | Next Holdings Limited | Determining the location of touch points in a position detection system |
JP2011064936A (en) * | 2009-09-17 | 2011-03-31 | Seiko Epson Corp | Screen device with light receiving element, and display device with position detection function |
JP2013508851A (en) * | 2009-10-19 | 2013-03-07 | フラットフロッグ ラボラトリーズ アーベー | Touch surface with two-dimensional compensation |
US20120182268A1 (en) * | 2009-10-26 | 2012-07-19 | Sharp Kabushiki Kaisha | Position detection system, display panel, and display device |
CN102053757B (en) * | 2009-11-05 | 2012-12-19 | 上海精研电子科技有限公司 | Infrared touch screen device and multipoint positioning method thereof |
US8390600B2 (en) * | 2009-11-13 | 2013-03-05 | Microsoft Corporation | Interactive display system with contact geometry interface |
TWI494823B (en) * | 2009-11-16 | 2015-08-01 | Pixart Imaging Inc | Locating method of optical touch device and optical touch device |
KR101627715B1 (en) * | 2009-11-18 | 2016-06-14 | 엘지전자 주식회사 | Touch Panel, Driving Method for Touch Panel, and Display Apparatus having a Touch Panel |
ES2605595T3 (en) | 2009-12-11 | 2017-03-15 | Avery Dennison Corporation | Position detection systems for use in touch screens and prismatic film used therein |
US9052778B2 (en) * | 2009-12-16 | 2015-06-09 | Beijing Irtouch Systems Co., Ltd | Infrared touch screen |
EP2517090A1 (en) * | 2009-12-21 | 2012-10-31 | FlatFrog Laboratories AB | Touch surface with identification of reduced performance |
CN102129328A (en) * | 2010-01-16 | 2011-07-20 | 鸿富锦精密工业(深圳)有限公司 | Infrared touch screen |
CN102129327A (en) * | 2010-01-20 | 2011-07-20 | 鸿友科技股份有限公司 | High-efficiency infrared touch panel device |
TWM393739U (en) * | 2010-02-12 | 2010-12-01 | Pixart Imaging Inc | Optical touch control apparatus |
MX2012010864A (en) | 2010-03-22 | 2013-04-03 | Mattel Inc | Electronic device and the input and output of data. |
SG183856A1 (en) * | 2010-03-24 | 2012-10-30 | Neonode Inc | Lens arrangement for light-based touch screen |
US11429272B2 (en) * | 2010-03-26 | 2022-08-30 | Microsoft Technology Licensing, Llc | Multi-factor probabilistic model for evaluating user input |
CN101930322B (en) * | 2010-03-26 | 2012-05-23 | 深圳市天时通科技有限公司 | Identification method capable of simultaneously identifying a plurality of contacts of touch screen |
TW201137704A (en) * | 2010-04-23 | 2011-11-01 | Sunplus Innovation Technology Inc | Optical touch-control screen system and method for recognizing relative distance of objects |
CN102236473B (en) * | 2010-04-23 | 2013-07-17 | 太瀚科技股份有限公司 | Input device and position scanning method |
JP5740104B2 (en) * | 2010-05-13 | 2015-06-24 | セイコーエプソン株式会社 | Optical position detection device and device with position detection function |
CN102270069B (en) * | 2010-06-03 | 2015-01-28 | 乐金显示有限公司 | Touch panel integrated display device |
US20120005632A1 (en) * | 2010-06-30 | 2012-01-05 | Broyles Iii Paul J | Execute a command |
JP5533408B2 (en) * | 2010-08-04 | 2014-06-25 | セイコーエプソン株式会社 | Optical position detection device and device with position detection function |
US20120054588A1 (en) * | 2010-08-24 | 2012-03-01 | Anbumani Subramanian | Outputting media content |
WO2012027829A1 (en) * | 2010-09-02 | 2012-03-08 | Baanto International Ltd. | Systems and methods for sensing and tracking radiation blocking objects on a surface |
KR20120023867A (en) * | 2010-09-02 | 2012-03-14 | 삼성전자주식회사 | Mobile terminal having touch screen and method for displaying contents thereof |
JP5725774B2 (en) * | 2010-09-13 | 2015-05-27 | キヤノン株式会社 | Coordinate input device and coordinate input method |
KR101323196B1 (en) * | 2010-10-05 | 2013-10-30 | 주식회사 알엔디플러스 | Multi-touch on touch screen apparatus |
TWI428804B (en) * | 2010-10-20 | 2014-03-01 | Pixart Imaging Inc | Optical screen touch system and method thereof |
US8605046B2 (en) * | 2010-10-22 | 2013-12-10 | Pq Labs, Inc. | System and method for providing multi-dimensional touch input vector |
US20120105378A1 (en) * | 2010-11-03 | 2012-05-03 | Toshiba Tec Kabushiki Kaisha | Input apparatus and method of controlling the same |
TWI450155B (en) * | 2011-02-15 | 2014-08-21 | Wistron Corp | Method and system for calculating calibration information for an optical touch apparatus |
CN102419661B (en) * | 2011-03-09 | 2014-09-03 | 北京汇冠新技术股份有限公司 | Touch positioning method, touch positioning device and infrared touch screen |
KR101361209B1 (en) * | 2011-05-12 | 2014-02-10 | 유병석 | Touch Screen using synchronized light pulse transfer |
KR20130007230A (en) * | 2011-06-30 | 2013-01-18 | 삼성전자주식회사 | Apparatus and method for executing application in portable terminal with touch screen |
KR20130031563A (en) * | 2011-09-21 | 2013-03-29 | 삼성전자주식회사 | Display apparatus, touch sensing apparatus and method for sensing of touch |
TWI563437B (en) * | 2011-09-26 | 2016-12-21 | Egalax Empia Technology Inc | Apparatus for detecting position by infrared rays and touch panel using the same |
CN103019459A (en) * | 2011-09-28 | 2013-04-03 | 程抒一 | Non-rectangular staggered infrared touch screen |
CN102331890A (en) * | 2011-10-24 | 2012-01-25 | 苏州佳世达电通有限公司 | Optical touch screen and optical sensing correction method thereof |
WO2013089623A2 (en) * | 2011-12-16 | 2013-06-20 | Flatfrog Laboratories Ab | Tracking objects on a touch surface |
CN103206967B (en) * | 2012-01-16 | 2016-09-28 | 联想(北京)有限公司 | A kind of method and device determining that sensor arranges position |
US9058168B2 (en) * | 2012-01-23 | 2015-06-16 | Blackberry Limited | Electronic device and method of controlling a display |
EP2817696A4 (en) | 2012-02-21 | 2015-09-30 | Flatfrog Lab Ab | Touch determination with improved detection of weak interactions |
WO2013126905A2 (en) * | 2012-02-24 | 2013-08-29 | Moscarillo Thomas J | Gesture recognition devices and methods |
TWI475446B (en) * | 2012-04-24 | 2015-03-01 | Wistron Corp | Optical touch control system and capture signal adjusting method thereof |
WO2013176614A2 (en) * | 2012-05-23 | 2013-11-28 | Flatfrog Laboratories Ab | Touch-sensitive apparatus with improved spatial resolution |
TWI498771B (en) * | 2012-07-06 | 2015-09-01 | Pixart Imaging Inc | Gesture recognition system and glasses with gesture recognition function |
US9223406B2 (en) * | 2012-08-27 | 2015-12-29 | Samsung Electronics Co., Ltd. | Screen display control method of electronic device and apparatus therefor |
CN102902422A (en) * | 2012-08-30 | 2013-01-30 | 深圳市印天印象科技有限公司 | Multi-point touch system and method |
US10585530B2 (en) | 2014-09-23 | 2020-03-10 | Neonode Inc. | Optical proximity sensor |
US10282034B2 (en) | 2012-10-14 | 2019-05-07 | Neonode Inc. | Touch sensitive curved and flexible displays |
US9921661B2 (en) | 2012-10-14 | 2018-03-20 | Neonode Inc. | Optical proximity sensor and associated user interface |
US10324565B2 (en) | 2013-05-30 | 2019-06-18 | Neonode Inc. | Optical proximity sensor |
US9207800B1 (en) | 2014-09-23 | 2015-12-08 | Neonode Inc. | Integrated light guide and touch screen frame and multi-touch determination method |
US9741184B2 (en) | 2012-10-14 | 2017-08-22 | Neonode Inc. | Door handle with optical proximity sensors |
US9164625B2 (en) | 2012-10-14 | 2015-10-20 | Neonode Inc. | Proximity sensor for determining two-dimensional coordinates of a proximal object |
JP6119518B2 (en) | 2013-02-12 | 2017-04-26 | ソニー株式会社 | Sensor device, input device and electronic apparatus |
CN103123555B (en) * | 2013-02-19 | 2016-12-28 | 创维光电科技(深圳)有限公司 | A kind of pattern recognition method based on infrared touch panel, device and infrared touch panel |
US9183755B2 (en) * | 2013-03-12 | 2015-11-10 | Zheng Shi | System and method for learning, composing, and playing music with physical objects |
WO2014147943A1 (en) * | 2013-03-18 | 2014-09-25 | ソニー株式会社 | Sensor device, input device, and electronic device |
CN104216549B (en) * | 2013-06-04 | 2018-10-12 | 联想(北京)有限公司 | A kind of information processing method and electronic equipment |
CN105283744B (en) * | 2013-06-05 | 2018-05-18 | Ev 集团 E·索尔纳有限责任公司 | To determine the measuring device and method of pressure map |
CN104281330A (en) * | 2013-07-02 | 2015-01-14 | 北京汇冠新技术股份有限公司 | Infrared touch screen and infrared element non-equidistant arranging method thereof |
JP6142745B2 (en) | 2013-09-10 | 2017-06-07 | ソニー株式会社 | Sensor device, input device and electronic apparatus |
US9367174B2 (en) * | 2014-03-28 | 2016-06-14 | Intel Corporation | Wireless peripheral data transmission for touchscreen displays |
CN104978078B (en) * | 2014-04-10 | 2018-03-02 | 上海品奇数码科技有限公司 | A kind of touch point recognition methods based on infrared touch screen |
US10402017B2 (en) * | 2014-09-02 | 2019-09-03 | Rapt Ip Limited | Instrument detection with an optical touch sensitive device |
JP6390277B2 (en) * | 2014-09-02 | 2018-09-19 | ソニー株式会社 | Information processing apparatus, control method, and program |
US9823750B2 (en) * | 2015-03-23 | 2017-11-21 | Visteon Global Technologies, Inc. | Capturing gesture-based inputs |
CN105302381B (en) * | 2015-12-07 | 2019-07-02 | 广州华欣电子科技有限公司 | Infrared touch panel precision method of adjustment and device |
US9898102B2 (en) | 2016-03-11 | 2018-02-20 | Microsoft Technology Licensing, Llc | Broadcast packet based stylus pairing |
US10073617B2 (en) | 2016-05-19 | 2018-09-11 | Onshape Inc. | Touchscreen precise pointing gesture |
CN106325737B (en) * | 2016-08-03 | 2021-06-18 | 海信视像科技股份有限公司 | Writing path erasing method and device |
CN106775135B (en) * | 2016-11-14 | 2020-06-09 | 海信视像科技股份有限公司 | Method and device for positioning touch point on infrared touch device and terminal equipment |
CN107783695B (en) * | 2017-09-27 | 2021-01-12 | 深圳市天英联合教育股份有限公司 | Infrared touch screen arrangement method and device and display equipment |
KR101969528B1 (en) * | 2017-09-29 | 2019-04-16 | 에스케이텔레콤 주식회사 | Method and apparatus for controlling touch display and touch display system |
EP3731068A4 (en) * | 2017-12-19 | 2021-05-12 | Sony Corporation | Information processing system, information processing method, and program |
CN115039060A (en) | 2019-12-31 | 2022-09-09 | 内奥诺德公司 | Non-contact touch input system |
IL275807B (en) | 2020-07-01 | 2022-02-01 | Elbit Systems Ltd | A touchscreen |
US11669210B2 (en) | 2020-09-30 | 2023-06-06 | Neonode Inc. | Optical touch sensor |
WO2023106983A1 (en) * | 2021-12-09 | 2023-06-15 | Flatfrog Laboratories Ab | Improved touch-sensing apparatus |
Family Cites Families (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4703316A (en) * | 1984-10-18 | 1987-10-27 | Tektronix, Inc. | Touch panel input apparatus |
US4746770A (en) * | 1987-02-17 | 1988-05-24 | Sensor Frame Incorporated | Method and apparatus for isolating and manipulating graphic objects on computer video monitor |
JPH01314324A (en) * | 1988-06-14 | 1989-12-19 | Sony Corp | Touch panel device |
US5605406A (en) * | 1992-08-24 | 1997-02-25 | Bowen; James H. | Computer input devices with light activated switches and light emitter protection |
US6864882B2 (en) * | 2000-05-24 | 2005-03-08 | Next Holdings Limited | Protected touch panel display system |
US6836367B2 (en) * | 2001-02-28 | 2004-12-28 | Japan Aviation Electronics Industry, Limited | Optical touch panel |
US7148913B2 (en) * | 2001-10-12 | 2006-12-12 | Hrl Laboratories, Llc | Vision-based pointer tracking and object classification method and apparatus |
US7042444B2 (en) * | 2003-01-17 | 2006-05-09 | Eastman Kodak Company | OLED display and touch screen |
US7576725B2 (en) * | 2004-10-19 | 2009-08-18 | Microsoft Corporation | Using clear-coded, see-through objects to manipulate virtual objects |
US7705835B2 (en) * | 2005-03-28 | 2010-04-27 | Adam Eikman | Photonic touch screen apparatus and method of use |
- 2006
- 2006-03-08 CN CNA200680007818XA patent/CN101137956A/en active Pending
- 2006-03-08 WO PCT/IB2006/050728 patent/WO2006095320A2/en not_active Application Discontinuation
- 2006-03-08 US US11/908,032 patent/US20090135162A1/en not_active Abandoned
- 2006-03-08 EP EP06711053A patent/EP1859339A2/en not_active Withdrawn
- 2006-03-08 JP JP2008500329A patent/JP2008533581A/en active Pending
- 2006-03-08 KR KR1020077023149A patent/KR20070116870A/en not_active Application Discontinuation
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
GB2133537A (en) | 1982-12-16 | 1984-07-25 | Glyben Automation Limited | Position detector system |
GB2156514A (en) | 1984-03-29 | 1985-10-09 | Univ London | Shape sensors |
US20020075243A1 (en) | 2000-06-19 | 2002-06-20 | John Newton | Touch panel display system |
Cited By (139)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8203535B2 (en) | 2000-07-05 | 2012-06-19 | Smart Technologies Ulc | Passive touch system and method of detecting user input |
US8055022B2 (en) | 2000-07-05 | 2011-11-08 | Smart Technologies Ulc | Passive touch system and method of detecting user input |
US8378986B2 (en) | 2000-07-05 | 2013-02-19 | Smart Technologies Ulc | Passive touch system and method of detecting user input |
US8228304B2 (en) | 2002-11-15 | 2012-07-24 | Smart Technologies Ulc | Size/scale orientation determination of a pointer in a camera-based touch system |
US8456451B2 (en) | 2003-03-11 | 2013-06-04 | Smart Technologies Ulc | System and method for differentiating between pointers used to contact touch surface |
US8325134B2 (en) | 2003-09-16 | 2012-12-04 | Smart Technologies Ulc | Gesture recognition method and touch system incorporating the same |
US8456418B2 (en) | 2003-10-09 | 2013-06-04 | Smart Technologies Ulc | Apparatus for determining the location of a pointer within a region of interest |
US8576172B2 (en) | 2004-01-02 | 2013-11-05 | Smart Technologies Ulc | Pointer tracking across multiple overlapping coordinate input sub-regions defining a generally contiguous input region |
US8089462B2 (en) | 2004-01-02 | 2012-01-03 | Smart Technologies Ulc | Pointer tracking across multiple overlapping coordinate input sub-regions defining a generally contiguous input region |
US8274496B2 (en) | 2004-04-29 | 2012-09-25 | Smart Technologies Ulc | Dual mode touch systems |
US8120596B2 (en) | 2004-05-21 | 2012-02-21 | Smart Technologies Ulc | Tiled touch system |
US8237685B2 (en) | 2006-06-28 | 2012-08-07 | Koninklijke Philips Electronics N.V. | Method and apparatus for object learning and recognition based on optical parameters |
US8167698B2 (en) | 2006-09-13 | 2012-05-01 | Koninklijke Philips Electronics N.V. | Determining the orientation of an object placed on a surface |
US9442607B2 (en) | 2006-12-04 | 2016-09-13 | Smart Technologies Inc. | Interactive input system and method |
JP2010517157A (en) * | 2007-01-29 | 2010-05-20 | コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ | Method and system for positioning an object on a surface |
WO2008093258A1 (en) | 2007-01-29 | 2008-08-07 | Koninklijke Philips Electronics N.V. | Method and system for locating an object on a surface |
US8199055B2 (en) | 2007-01-29 | 2012-06-12 | Koninklijke Philips Electronics N.V. | Method and system for locating an object on a surface |
WO2008148307A1 (en) * | 2007-06-04 | 2008-12-11 | Beijing Irtouch Systems Co., Ltd. | Method for identifying multiple touch points on an infrared touch screen |
WO2008154792A1 (en) * | 2007-06-15 | 2008-12-24 | Vtron Technologies Ltd. | Infrared touch screen and multi-point touch positioning method |
EP2174204A4 (en) * | 2007-07-23 | 2011-11-16 | Smart Technologies Ulc | Touchscreen based on frustrated total internal reflection |
EP2174204A1 (en) * | 2007-07-23 | 2010-04-14 | Smart Technologies ULC | Touchscreen based on frustrated total internal reflection |
US8094137B2 (en) | 2007-07-23 | 2012-01-10 | Smart Technologies Ulc | System and method of detecting contact on a display |
US8803848B2 (en) * | 2007-12-17 | 2014-08-12 | Victor Manuel SUAREZ ROVERE | Method and apparatus for tomographic touch imaging and interactive system using same |
US20090153519A1 (en) * | 2007-12-17 | 2009-06-18 | Suarez Rovere Victor Manuel | Method and apparatus for tomographic touch imaging and interactive system using same |
US9836149B2 (en) | 2007-12-17 | 2017-12-05 | Victor Manuel SUAREZ ROVERE | Method and apparatus for tomographic tough imaging and interactive system using same |
US20130201142A1 (en) * | 2007-12-17 | 2013-08-08 | Victor Manuel SUAREZ ROVERE | Method and apparatus for tomographic tough imaging and interactive system using same |
WO2009135320A1 (en) * | 2008-05-09 | 2009-11-12 | Smart Technologies Ulc | Interactive input system and illumination assembly therefor |
US8902193B2 (en) | 2008-05-09 | 2014-12-02 | Smart Technologies Ulc | Interactive input system and bezel therefor |
EP2288980A4 (en) * | 2008-05-09 | 2012-12-05 | Smart Technologies Ulc | Interactive input system and illumination assembly therefor |
EP2288980A1 (en) * | 2008-05-09 | 2011-03-02 | SMART Technologies ULC | Interactive input system and illumination assembly therefor |
US9411430B2 (en) * | 2008-06-19 | 2016-08-09 | Neonode Inc. | Optical touch screen using total internal reflection |
US8542217B2 (en) | 2008-06-23 | 2013-09-24 | Flatfrog Laboratories Ab | Optical touch detection using input and output beam scanners |
US8482547B2 (en) | 2008-06-23 | 2013-07-09 | Flatfrog Laboratories Ab | Determining the location of one or more objects on a touch surface |
WO2010006886A2 (en) * | 2008-06-23 | 2010-01-21 | Flatfrog Laboratories Ab | Determining the location of one or more objects on a touch surface |
CN102150117A (en) * | 2008-06-23 | 2011-08-10 | 平蛙实验室股份公司 | Determining the location of one or more objects on a touch surface |
US8890843B2 (en) | 2008-06-23 | 2014-11-18 | Flatfrog Laboratories Ab | Detecting the location of an object on a touch surface |
US9134854B2 (en) | 2008-06-23 | 2015-09-15 | Flatfrog Laboratories Ab | Detecting the locations of a plurality of objects on a touch surface |
WO2010006886A3 (en) * | 2008-06-23 | 2011-05-26 | Flatfrog Laboratories Ab | Determining the location of one or more objects on a touch surface |
EP2318905B1 (en) * | 2008-06-23 | 2017-08-16 | FlatFrog Laboratories AB | Determining the location of one or more objects on a touch surface |
US9552104B2 (en) | 2008-08-07 | 2017-01-24 | Rapt Ip Limited | Detecting multitouch events in an optical touch-sensitive device using touch event templates |
WO2010015408A1 (en) * | 2008-08-07 | 2010-02-11 | Owen Drumm | Method and apparatus for detecting a multitouch event in an optical touch-sensitive device |
US8350831B2 (en) | 2008-08-07 | 2013-01-08 | Rapt Ip Limited | Method and apparatus for detecting a multitouch event in an optical touch-sensitive device |
US9092092B2 (en) | 2008-08-07 | 2015-07-28 | Rapt Ip Limited | Detecting multitouch events in an optical touch-sensitive device using touch event templates |
US10067609B2 (en) | 2008-08-07 | 2018-09-04 | Rapt Ip Limited | Detecting multitouch events in an optical touch-sensitive device using touch event templates |
US20100079407A1 (en) * | 2008-09-26 | 2010-04-01 | Suggs Bradley N | Identifying actual touch points using spatial dimension information obtained from light transceivers |
US9317159B2 (en) * | 2008-09-26 | 2016-04-19 | Hewlett-Packard Development Company, L.P. | Identifying actual touch points using spatial dimension information obtained from light transceivers |
US8810522B2 (en) | 2008-09-29 | 2014-08-19 | Smart Technologies Ulc | Method for selecting and manipulating a graphical object in an interactive input system, and interactive input system executing the method |
WO2010038926A1 (en) | 2008-10-02 | 2010-04-08 | Korea Institute Of Science And Technology | Optical recognition user input device and method of recognizing input from user |
EP2350780A4 (en) * | 2008-10-02 | 2013-06-12 | Korea Inst Sci & Tech | Optical recognition user input device and method of recognizing input from user |
EP2350780A1 (en) * | 2008-10-02 | 2011-08-03 | Korea Institute of Science and Technology | Optical recognition user input device and method of recognizing input from user |
US8339378B2 (en) | 2008-11-05 | 2012-12-25 | Smart Technologies Ulc | Interactive input system with multi-angle reflector |
US8581884B2 (en) | 2008-12-05 | 2013-11-12 | Flatfrog Laboratories Ab | Touch sensing apparatus and method of operating the same |
EP2370884A4 (en) * | 2008-12-05 | 2012-05-23 | Flatfrog Lab Ab | A touch sensing apparatus and method of operating the same |
US10048773B2 (en) | 2008-12-05 | 2018-08-14 | Flatfrog Laboratories Ab | Touch sensing apparatus and method of operating the same |
US9442574B2 (en) | 2008-12-05 | 2016-09-13 | Flatfrog Laboratories Ab | Touch sensing apparatus and method of operating the same |
EP2983070A1 (en) | 2008-12-05 | 2016-02-10 | FlatFrog Laboratories AB | A touch sensing apparatus and method of operating the same |
US10474249B2 (en) | 2008-12-05 | 2019-11-12 | Flatfrog Laboratories Ab | Touch sensing apparatus and method of operating the same |
WO2010064983A3 (en) * | 2008-12-05 | 2010-08-05 | Flatfrog Laboratories Ab | A touch sensing apparatus and method of operating the same |
WO2010081702A2 (en) | 2009-01-14 | 2010-07-22 | Citron Gmbh | Multitouch control panel |
US9811163B2 (en) | 2009-02-15 | 2017-11-07 | Neonode Inc. | Elastic touch input surface |
US9158416B2 (en) | 2009-02-15 | 2015-10-13 | Neonode Inc. | Resilient light-based touch surface |
US20100245293A1 (en) * | 2009-03-27 | 2010-09-30 | Epson Imaging Devices Corporation | Position detecting device and electro-optical device |
US8654101B2 (en) * | 2009-03-27 | 2014-02-18 | Epson Imaging Devices Corporation | Position detecting device and electro-optical device |
WO2010112404A1 (en) * | 2009-03-31 | 2010-10-07 | International Business Machines Corporation | Multi-touch optical touch panel |
US8878818B2 (en) | 2009-03-31 | 2014-11-04 | International Business Machines Corporation | Multi-touch optical touch panel |
EP2433204A4 (en) * | 2009-05-18 | 2014-07-23 | Flatfrog Lab Ab | Determining the location of an object on a touch surface |
EP2433204A1 (en) * | 2009-05-18 | 2012-03-28 | FlatFrog Laboratories AB | Determining the location of an object on a touch surface |
US8416206B2 (en) | 2009-07-08 | 2013-04-09 | Smart Technologies Ulc | Method for manipulating a graphic widget in a three-dimensional environment displayed on a touch panel of an interactive input system |
WO2011003205A1 (en) * | 2009-07-10 | 2011-01-13 | Smart Technologies Ulc | Disambiguating pointers by imaging multiple touch-input zones |
US8902195B2 (en) | 2009-09-01 | 2014-12-02 | Smart Technologies Ulc | Interactive input system with improved signal-to-noise ratio (SNR) and image capture method |
EP2473906A1 (en) * | 2009-09-02 | 2012-07-11 | FlatFrog Laboratories AB | Touch-sensitive system and method for controlling the operation thereof |
EP2473906A4 (en) * | 2009-09-02 | 2014-01-15 | Flatfrog Lab Ab | Touch-sensitive system and method for controlling the operation thereof |
US8686974B2 (en) | 2009-09-02 | 2014-04-01 | Flatfrog Laboratories Ab | Touch-sensitive system and method for controlling the operation thereof |
EP3196739A1 (en) | 2009-09-02 | 2017-07-26 | FlatFrog Laboratories AB | Touch-sensitive system and method for controlling the operation thereof |
WO2011049513A1 (en) * | 2009-10-19 | 2011-04-28 | Flatfrog Laboratories Ab | Determining touch data for one or more objects on a touch surface |
US9024916B2 (en) | 2009-10-19 | 2015-05-05 | Flatfrog Laboratories Ab | Extracting touch data that represents one or more objects on a touch surface |
US9430079B2 (en) | 2009-10-19 | 2016-08-30 | Flatfrog Laboratories Ab | Determining touch data for one or more objects on a touch surface |
WO2011049511A1 (en) * | 2009-10-19 | 2011-04-28 | Flatfrog Laboratories Ab | Extracting touch data that represents one or more objects on a touch surface |
US20110095989A1 (en) * | 2009-10-23 | 2011-04-28 | Smart Technologies Ulc | Interactive input system and bezel therefor |
US8502789B2 (en) | 2010-01-11 | 2013-08-06 | Smart Technologies Ulc | Method for handling user input in an interactive input system, and interactive input system executing the method |
US9996196B2 (en) | 2010-05-03 | 2018-06-12 | Flatfrog Laboratories Ab | Touch determination by tomographic reconstruction |
US8780066B2 (en) | 2010-05-03 | 2014-07-15 | Flatfrog Laboratories Ab | Touch determination by tomographic reconstruction |
US9547393B2 (en) | 2010-05-03 | 2017-01-17 | Flatfrog Laboratories Ab | Touch determination by tomographic reconstruction |
US8760430B2 (en) | 2010-05-21 | 2014-06-24 | Kabushiki Kaisha Toshiba | Electronic apparatus, input control program, and input control method |
US8519977B2 (en) | 2010-05-21 | 2013-08-27 | Kabushiki Kaisha Toshiba | Electronic apparatus, input control program, and input control method |
US9274611B2 (en) | 2010-05-21 | 2016-03-01 | Kabushiki Kaisha Toshiba | Electronic apparatus, input control program, and input control method |
US9158401B2 (en) | 2010-07-01 | 2015-10-13 | Flatfrog Laboratories Ab | Data processing in relation to a multi-touch sensing apparatus |
US9710101B2 (en) | 2010-07-01 | 2017-07-18 | Flatfrog Laboratories Ab | Data processing in relation to a multi-touch sensing apparatus |
US10013107B2 (en) | 2010-07-01 | 2018-07-03 | Flatfrog Laboratories Ab | Data processing in relation to a multi-touch sensing apparatus |
US8898517B2 (en) | 2010-12-30 | 2014-11-25 | International Business Machines Corporation | Handling a failed processor of a multiprocessor information handling system |
US8892944B2 (en) | 2010-12-30 | 2014-11-18 | International Business Machines Corporation | Handling a failed processor of multiprocessor information handling system |
JP2014510341A (en) * | 2011-02-28 | 2014-04-24 | バーント インターナショナル リミテッド | System and method for detecting and tracking radiation shielding on a surface |
WO2012116429A1 (en) * | 2011-02-28 | 2012-09-07 | Baanto International Ltd. | Systems and methods for sensing and tracking radiation blocking objects on a surface |
US9453726B2 (en) | 2011-02-28 | 2016-09-27 | Baanto International Ltd. | Systems and methods for sensing and tracking radiation blocking objects on a surface |
US9927920B2 (en) | 2011-12-16 | 2018-03-27 | Flatfrog Laboratories Ab | Tracking objects on a touch surface |
US9250794B2 (en) | 2012-01-23 | 2016-02-02 | Victor Manuel SUAREZ ROVERE | Method and apparatus for time-varying tomographic touch imaging and interactive system using same |
WO2014027241A3 (en) * | 2012-04-30 | 2014-09-12 | Rapt Ip Limited | Detecting multitouch events in an optical touch-sensitive device using touch event templates |
US10168835B2 (en) | 2012-05-23 | 2019-01-01 | Flatfrog Laboratories Ab | Spatial resolution in touch displays |
US9916041B2 (en) | 2012-07-13 | 2018-03-13 | Rapt Ip Limited | Low power operation of an optical touch-sensitive device for detecting multitouch events |
US10019113B2 (en) | 2013-04-11 | 2018-07-10 | Flatfrog Laboratories Ab | Tomographic processing for touch detection |
US9874978B2 (en) | 2013-07-12 | 2018-01-23 | Flatfrog Laboratories Ab | Partial detect mode |
US10126882B2 (en) | 2014-01-16 | 2018-11-13 | Flatfrog Laboratories Ab | TIR-based optical touch systems of projection-type |
US10146376B2 (en) | 2014-01-16 | 2018-12-04 | Flatfrog Laboratories Ab | Light coupling in TIR-based optical touch systems |
US10161886B2 (en) | 2014-06-27 | 2018-12-25 | Flatfrog Laboratories Ab | Detection of surface contamination |
US11182023B2 (en) | 2015-01-28 | 2021-11-23 | Flatfrog Laboratories Ab | Dynamic touch quarantine frames |
US10318074B2 (en) | 2015-01-30 | 2019-06-11 | Flatfrog Laboratories Ab | Touch-sensing OLED display with tilted emitters |
EP3537269A1 (en) | 2015-02-09 | 2019-09-11 | FlatFrog Laboratories AB | Optical touch system |
US11029783B2 (en) | 2015-02-09 | 2021-06-08 | Flatfrog Laboratories Ab | Optical touch system comprising means for projecting and detecting light beams above and inside a transmissive panel |
US10496227B2 (en) | 2015-02-09 | 2019-12-03 | Flatfrog Laboratories Ab | Optical touch system comprising means for projecting and detecting light beams above and inside a transmissive panel |
US10401546B2 (en) | 2015-03-02 | 2019-09-03 | Flatfrog Laboratories Ab | Optical component for light coupling |
US11301089B2 (en) | 2015-12-09 | 2022-04-12 | Flatfrog Laboratories Ab | Stylus identification |
US10761657B2 (en) | 2016-11-24 | 2020-09-01 | Flatfrog Laboratories Ab | Automatic optimisation of touch signal |
US11579731B2 (en) | 2016-12-07 | 2023-02-14 | Flatfrog Laboratories Ab | Touch device |
US11281335B2 (en) | 2016-12-07 | 2022-03-22 | Flatfrog Laboratories Ab | Touch device |
EP3667475A1 (en) | 2016-12-07 | 2020-06-17 | FlatFrog Laboratories AB | A curved touch device |
EP4152132A1 (en) | 2016-12-07 | 2023-03-22 | FlatFrog Laboratories AB | An improved touch device |
US10775935B2 (en) | 2016-12-07 | 2020-09-15 | Flatfrog Laboratories Ab | Touch device |
US10282035B2 (en) | 2016-12-07 | 2019-05-07 | Flatfrog Laboratories Ab | Touch device |
US11740741B2 (en) | 2017-02-06 | 2023-08-29 | Flatfrog Laboratories Ab | Optical coupling in touch-sensing systems |
US11474644B2 (en) | 2017-02-06 | 2022-10-18 | Flatfrog Laboratories Ab | Optical coupling in touch-sensing systems |
US11016605B2 (en) | 2017-03-22 | 2021-05-25 | Flatfrog Laboratories Ab | Pen differentiation for touch displays |
US10481737B2 (en) | 2017-03-22 | 2019-11-19 | Flatfrog Laboratories Ab | Pen differentiation for touch display |
WO2018174787A1 (en) * | 2017-03-22 | 2018-09-27 | Flatfrog Laboratories | Eraser for touch displays |
US11099688B2 (en) | 2017-03-22 | 2021-08-24 | Flatfrog Laboratories Ab | Eraser for touch displays |
US10606414B2 (en) | 2017-03-22 | 2020-03-31 | Flatfrog Laboratories Ab | Eraser for touch displays |
US10739916B2 (en) | 2017-03-28 | 2020-08-11 | Flatfrog Laboratories Ab | Touch sensing apparatus and method for assembly |
US11269460B2 (en) | 2017-03-28 | 2022-03-08 | Flatfrog Laboratories Ab | Touch sensing apparatus and method for assembly |
US10845923B2 (en) | 2017-03-28 | 2020-11-24 | Flatfrog Laboratories Ab | Touch sensing apparatus and method for assembly |
US11281338B2 (en) | 2017-03-28 | 2022-03-22 | Flatfrog Laboratories Ab | Touch sensing apparatus and method for assembly |
US10437389B2 (en) | 2017-03-28 | 2019-10-08 | Flatfrog Laboratories Ab | Touch sensing apparatus and method for assembly |
US10606416B2 (en) | 2017-03-28 | 2020-03-31 | Flatfrog Laboratories Ab | Touch sensing apparatus and method for assembly |
US12086362B2 (en) | 2017-09-01 | 2024-09-10 | Flatfrog Laboratories Ab | Optical component |
US11650699B2 (en) | 2017-09-01 | 2023-05-16 | Flatfrog Laboratories Ab | Optical component |
US11256371B2 (en) | 2017-09-01 | 2022-02-22 | Flatfrog Laboratories Ab | Optical component |
US11567610B2 (en) | 2018-03-05 | 2023-01-31 | Flatfrog Laboratories Ab | Detection line broadening |
US12055969B2 (en) | 2018-10-20 | 2024-08-06 | Flatfrog Laboratories Ab | Frame for a touch-sensitive device and tool therefor |
US11943563B2 (en) | 2019-01-25 | 2024-03-26 | FlatFrog Laboratories, AB | Videoconferencing terminal and method of operating the same |
US12056316B2 (en) | 2019-11-25 | 2024-08-06 | Flatfrog Laboratories Ab | Touch-sensing apparatus |
US11893189B2 (en) | 2020-02-10 | 2024-02-06 | Flatfrog Laboratories Ab | Touch-sensing apparatus |
Also Published As
Publication number | Publication date |
---|---|
JP2008533581A (en) | 2008-08-21 |
KR20070116870A (en) | 2007-12-11 |
WO2006095320A3 (en) | 2007-03-01 |
US20090135162A1 (en) | 2009-05-28 |
EP1859339A2 (en) | 2007-11-28 |
CN101137956A (en) | 2008-03-05 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
EP1859339A2 (en) | System and method for detecting the location, size and shape of multiple objects that interact with a touch screen display | |
US8167698B2 (en) | Determining the orientation of an object placed on a surface | |
US9857892B2 (en) | Optical sensing mechanisms for input devices | |
US8988396B2 (en) | Piezo-based acoustic and capacitive detection | |
US8799803B2 (en) | Configurable input device | |
US9264037B2 (en) | Keyboard including movement activated optical keys and related methods | |
US20160246443A1 (en) | Method and apparatus for detecting lift off on a touchscreen | |
US8959013B2 (en) | Virtual keyboard for a non-tactile three dimensional user interface | |
TWI396123B (en) | Optical touch system and operating method thereof | |
US20080075368A1 (en) | Stroke-Based Data Entry Device, System, And Method | |
US20100295821A1 (en) | Optical touch panel | |
KR20050098234A (en) | Compact optical pointing apparatus and method | |
US20100225588A1 (en) | Methods And Systems For Optical Detection Of Gestures | |
JP2006509269A (en) | Apparatus and method for inputting data | |
US11392214B2 (en) | Touch control system and method | |
US20170170826A1 (en) | Optical sensor based mechanical keyboard input system and method | |
JP2020170311A (en) | Input device | |
JP2022007868A (en) | Aerial image display input device and aerial image display input method | |
US20160092032A1 (en) | Optical touch screen system and computing method thereof | |
US9389702B2 (en) | Input association | |
US9703410B2 (en) | Remote sensing touchscreen | |
WO2021260989A1 (en) | Aerial image display input device and aerial mage display input method | |
US11580772B2 (en) | Method and device for monitoring a mobile input device | |
US9213418B2 (en) | Computer input device | |
US20140153790A1 (en) | Biometrics Touchscreen |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | |
| WWE | Wipo information: entry into national phase | Ref document number: 2006711053; Country of ref document: EP |
| WWE | Wipo information: entry into national phase | Ref document number: 2008500329; Country of ref document: JP. Ref document number: 11908032; Country of ref document: US |
| WWE | Wipo information: entry into national phase | Ref document number: 200680007818.X; Country of ref document: CN. Ref document number: 3950/CHENP/2007; Country of ref document: IN |
| NENP | Non-entry into the national phase | Ref country code: DE |
| NENP | Non-entry into the national phase | Ref country code: RU |
| WWE | Wipo information: entry into national phase | Ref document number: 1020077023149; Country of ref document: KR |
| WWW | Wipo information: withdrawn in national office | Ref document number: RU |
| WWP | Wipo information: published in national office | Ref document number: 2006711053; Country of ref document: EP |