WO2005116801A2 - User interface - Google Patents

User interface

Info

Publication number
WO2005116801A2
WO2005116801A2 (PCT/IB2005/051616)
Authority
WO
WIPO (PCT)
Prior art keywords
means
system
user input
user
display
Prior art date
Application number
PCT/IB2005/051616
Other languages
French (fr)
Other versions
WO2005116801A3 (en)
Inventor
Anthonie H. Bergman
Hubertus M. R. Cortenraad
Rogier Winters
Original Assignee
Koninklijke Philips Electronics N.V.
Priority date
Filing date
Publication date
Priority to EP04102307.8
Application filed by Koninklijke Philips Electronics N.V. filed Critical Koninklijke Philips Electronics N.V.
Publication of WO2005116801A2 publication Critical patent/WO2005116801A2/en
Publication of WO2005116801A3 publication Critical patent/WO2005116801A3/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/016 Input arrangements with force or tactile feedback as computer generated output to the user
    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/044 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by capacitive means
    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 2203/00 Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F 2203/041 Indexing scheme relating to G06F3/041 - G06F3/045
    • G06F 2203/04108 Touchless 2D digitiser, i.e. digitiser detecting the X/Y position of the input means, finger or stylus, also when it does not touch, but is proximate to the digitiser's interaction surface, without distance measurement in the Z direction

Abstract

An interactive system (100) is disclosed. The system (100) comprises: display means (102) for displaying a target (116); user input means (104-110) for accepting a touch-less user input of a user (101) who is pointing to the target (116); and feedback means (117) for providing a tactile feedback to the user (101) in response to the touch-less user input.

Description

User interface

The invention relates to a system comprising: display means for displaying a target; and user input means for accepting a touch-less user input of a user who is pointing to the target.

An embodiment of an interactive system which is arranged to accept user input of a user who is pointing to a target is well known. Typically, the target corresponds to a button of a graphical user interface being displayed by a display device. Various technologies exist to perform touch sensing, i.e. accepting the user input by sensing the touch of the user:
- Capacitive: a tiny current is fed into a conductive coating within a glass sandwich from bus bars at the edge of the glass. The current leaks to ground through a finger, and electronics determine an x-y position in relation to each of the bus bars;
- Resistive: a flexible layer with a conductive coating on one side is brought into contact with a rigid layer to form a circuit with a varying resistance. An x-y position is measured by the voltage drop from each of four bus bars at the edge; and
- Surface-Acoustic Wave: exciters around the edge of a layer of glass set up vibrations on its surface. Touching the glass causes a shadow which is detected by receivers, also around the edge. A system based on this technology is typically also arranged to detect pressure, giving a z-axis reading.
The above-mentioned technologies need the finger or other object actually to touch the screen. Technologies that do not need actual contact are also well known. Interactive systems based on these technologies are arranged to accept a touch-less user input. Examples of these technologies are:
- Projected Capacitive: fine wires embedded in a glass substrate project an electrical field on the surface of the glass layer which is disturbed by the presence of a finger. The x-y position is determined by electronics built into the sensor;
- Infrared: infrared (IR) technology relies on the interruption of an IR light grid in front of the screen of the display device. The touch frame or opto-matrix frame contains rows of IR light-emitting diodes (LEDs) and photo-transistors, each mounted on two opposite sides to create a grid of invisible infrared light.
The frame assembly comprises printed wiring boards on which the opto-electronics are mounted and is concealed behind an IR-transparent bezel. The bezel shields the opto-electronics from the operating environment while allowing the IR beams to pass through. The IR controller sequentially pulses the LEDs to create a grid of IR light beams. When a stylus, such as a finger, enters the grid, it obstructs the beams. One or more photo-transistors detect the absence of light and transmit a signal that identifies the x and y coordinates;
- Charge Transfer Sensing: by charging a sense electrode, which can be anything electrically conductive, to a fixed potential, and then transferring that charge to a charge detector comprising another known capacitor, the capacitance of the sense electrode can be readily ascertained. The charge and transfer operations are conducted by switches, ideally controlled by digital logic; in fact, the sensor is almost ideally suited to digital control and processing from start to finish. The only analog signal is a typically slow signal requiring no special precautions; conversion to digital can be performed by any of a number of commercially available ADC chips; and
- Cross Capacitance Sensing: see, for instance, the article "3D Touchless Display Interaction" by C. van Berkel, in SID Proceedings International Symposium, vol. 33, no. 2, pp. 1410-1413, 2002. Capacitive sensors are arranged around the edge of a display to detect hand and finger positions. Implemented with either transistors on glass or with conventional CMOS, this offers better front-of-screen performance and lower cost than conventional touch screens.
An advantage of touch-less user input means is that, in theory, contamination can be prevented. Contamination can be prevented in both directions: the display means displaying the "button" cannot get dirty, and the finger that pressed the "button" cannot get contaminated.
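The charge-transfer principle described above can be illustrated with a small numeric sketch. The component values, threshold and cycle-counting scheme below are illustrative assumptions, not taken from the patent: each cycle charges the sense electrode to a reference voltage and shares its charge with a larger, known detector capacitor, and the number of cycles needed to reach a threshold reveals the sense capacitance.

```python
# Illustrative simulation of charge-transfer capacitance sensing.
# All values (V_REF, C_DETECTOR, V_THRESHOLD) are assumptions for the sketch.
V_REF = 3.3          # charging potential of the sense electrode (volts)
C_DETECTOR = 10e-9   # known detector capacitor (10 nF)
V_THRESHOLD = 1.0    # comparator threshold on the detector (volts)

def cycles_to_threshold(c_sense):
    """Count charge-transfer cycles until the detector crosses V_THRESHOLD."""
    v_det = 0.0
    cycles = 0
    while v_det < V_THRESHOLD:
        # Charge sharing: the sense electrode (charged to V_REF) is switched
        # onto the detector capacitor; both settle to a common voltage.
        q_total = c_sense * V_REF + C_DETECTOR * v_det
        v_det = q_total / (c_sense + C_DETECTOR)
        cycles += 1
    return cycles

# A finger near the electrode raises its capacitance, so fewer cycles are
# needed to charge the detector -- the cycle count is the touch signal.
baseline = cycles_to_threshold(10e-12)  # ~10 pF free electrode (assumed)
touched = cycles_to_threshold(25e-12)   # ~25 pF with a finger nearby (assumed)
assert touched < baseline
```

The cycle count is inherently digital, which is why the passage notes that the sensor is almost ideally suited to digital control from start to finish.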
By "button" is meant a visualization of a target to be pointed at. Applications likely to take advantage of the first situation are, for instance, kitchen appliances, e.g. a display in the kitchen (turning a page while kneading dough), displays in the bathroom, and displays in a car garage (present-day cars need a PC for maintenance, and garages are full of grease). For the second situation an operating room is a good example: surgeons can keep their hands sterile while controlling the medical equipment. In general these buttons are very hygienic, antiseptic and stay germ-free. However, although touch-less user input means do exist, it appears that very often contamination is not prevented. In other words, although there is no actual need to touch e.g. a screen in order to provide user input, many users do touch the screen, and hence the screen or the finger gets dirty.

It is an object of the invention to provide a system of the kind described in the opening paragraph which is arranged to minimize the probability that a user touches the system. This object of the invention is achieved in that the system further comprises feedback means for providing a tactile feedback to the user in response to the touch-less user input. The above-mentioned problem arises from the natural tendency to touch a "button" in order to activate it. So, even if the system reacts visibly, for instance because a finger is detected by the user input means in front of the display means, users often keep moving until they touch something. By generating something which can be sensed by the user, in particular by the finger or hand with which the user is providing the user input, the user receives an appropriate feedback signal indicating that the system has accepted the user input. This tactile feedback prevents the user from continuing the movement towards the display means and touching the display means. Although the display means might be based on a print, i.e. showing a static picture, it is preferred that the display means according to the invention comprises a display device for displaying a state-dependent graphical user interface comprising the target. By a display device is meant a device having a structure of light-generating elements which are independently controllable so as to generate a succession of different images, for example a CRT (Cathode Ray Tube), an LCD (Liquid Crystal Display) or a PDP (Plasma Display Panel). In an embodiment of the system according to the invention, the feedback means is arranged to eject a fluid in a predetermined direction which is related to the touch-less user input. Preferably, the predetermined direction is such that the fluid touches the finger or hand directly after the user input has been accepted.
Experiments have shown that by ejecting a fluid with a velocity which exceeds a predetermined threshold in such a predetermined direction, an appropriate feedback can be provided to the user. The fluid can be any type of material which does not result in an effect lasting longer than e.g. one second. A fluid can be a liquid or a gas. Optionally, the fluid has a temperature which is some degrees higher or lower than room temperature. The temperature difference enhances the touch sensation which is caused by the speed of the ejected fluid, i.e. the pressure of the ejected fluid as perceived by the user. Preferably, the fluid comprises air. An advantage of applying air is that no separate container is required which has to be refilled with a special fluid. In another embodiment of the system according to the invention, the feedback means comprises a heating device for generating heat to be noticed by the user. By locally generating heat the user is provided with a heat sensation. Preferably, the heat is provided in a short pulse, e.g. of less than 100 milliseconds. It will be clear that other types of feedback which can be sensed by e.g. the fingers or hand, i.e. the part of the body with which the user input is provided to the system, can be applied. Even a spark or electrical shock with an intensity below a predetermined threshold, i.e. below the pain level, can be applied to provide the feedback. In an embodiment of the system according to the invention, the user input means comprises means for sensing capacitance related to the user. An advantage of such a technology is that it is relatively robust. In another embodiment of the system according to the invention, the user input means comprises a light source which is arranged to cooperate with a light sensor. An advantage of such a technology for accepting user input is that it is relatively cheap.
An embodiment of the system according to the invention further comprises audio generating means for generating audio feedback. Besides generating tactile feedback, it is advantageous to provide additional feedback by means of an audio signal which is generated synchronously with the tactile feedback. In an embodiment of the system according to the invention, the display means is arranged to display the target a predetermined distance in front of a screen of the display means. By "in front of the screen" is meant that the target is closer to the user than the screen. In standard displays, i.e. displays not according to the invention, a target is projected at the screen of the display device or even a predetermined distance behind the screen. In the latter case the screen is e.g. a plate of glass which is disposed in front of the luminance and/or color modulation and/or generation unit; for instance, in the case of an LCD display, a plate of glass is disposed in front of the polarization layer. Typically, a user wants to touch the target. By displaying the target in a surface which is located at a predetermined distance in front of the screen, the user perceives visual feedback when he reaches with e.g. his finger to that surface, i.e. when he crosses the surface in which the target is projected. So, by displaying the target a predetermined distance in front of the screen, the risk of contamination is further reduced. In an embodiment of the system according to the invention, the display means is arranged to display the target with a first appearance corresponding to a first state and to display the target with a second appearance in response to the touch-less user input. In other words, visible feedback is generated too, by means of updating the appearance of the graphical user interface. It is well known in the art of graphical user interfaces to indicate that e.g. a button is "pushed", or that a scroll bar is selected for movement.

These and other aspects of the system according to the invention will become apparent from and will be elucidated with respect to the implementations and embodiments described hereinafter and with reference to the accompanying drawings, wherein:
Fig. 1 schematically shows an embodiment of the system according to the invention;
Fig. 2 schematically shows another embodiment of the system according to the invention, comprising display means for displaying the target a predetermined distance in front of the screen;
Fig. 3 schematically shows another embodiment of the system according to the invention, comprising heat generating means;
Fig. 4A schematically shows another embodiment of the system according to the invention, whereby the display means comprises a mirror;
Fig. 4B schematically shows the inside of a car, with an embodiment of the system according to the invention being located in the dashboard of the car; and
Figs. 5A and 5B schematically show the concept of cross capacitance sensing.
The same reference numerals are used to denote similar parts throughout the figures.

Fig. 1 schematically shows a first embodiment of the system 100 according to the invention. The system 100 comprises: a display device 102 for displaying targets, e.g. 112, 116; user input means 104-110, 114 for accepting a touch-less user input of a user 101 who is pointing to one of the targets 116; and feedback means 117-122 for providing a tactile feedback to the user 101 in response to the touch-less user input. The targets correspond to user interface gadgets of a graphical user interface. Fig. 1 depicts twenty of these gadgets, showing the respective characters "0"-"9" and "a"-"j". These user interface gadgets correspond to respective functions of a particular application which runs on the system 100. The display device 102 is arranged to display the graphical components of the user interface of the system 100. The display device 102 comprises e.g. a CRT, LCD or PDP. The user input means comprises a number of sensing means 104-110, 114 disposed adjacent to the display device 102. The sensing of the sensing means 104-110, 114 might be based on acoustic, infrared or radio-frequency waves. The sensing means are arranged to detect the presence of an object in their vicinity. If e.g. one of the sensors 104 which is disposed at the left side of the display device 102 detects the presence of an object, e.g. a finger of a user, then the application which runs on the system 100 interprets that as user input; for instance, a function which is related to target 112 is activated. Sensing means based on light are commercially available, e.g. from IVO GmbH & Co. Typically these optical sensing means are arranged to measure a distance; such sensing means are so-called proximity detectors. The application of these sensing means in the system according to the invention is as follows. If e.g.
a hand is put in the neighborhood of the display device 102 in order to provide the system with user input, then a particular one of the sensors detects that there is an object close to it. So, each of the sensors measures the distance of objects in its respective neighborhood. If one of the measured distances is below a predetermined threshold, then a user input is accepted. Sensing means based on ultrasound are also commercially available. Alternative user input means might be applied in the system according to the invention. Instead of proximity sensors, pairs of light-generating and light-sensing elements disposed in a regular structure can be applied. The presence of an object like a finger of a user results in an obstruction of light going from a light-generating element to the corresponding light-sensing element. A further alternative is based on capacitance sensing; this is explained in more detail in connection with Figs. 5A and 5B. A further alternative is based on imaging the user by means of cameras. Cameras in combination with a suitably programmed image processing unit are advantageous for accepting touch-less user input. The feedback means comprises a number of tubes, e.g. 122, with a number of equidistantly disposed nozzles 117, 120. The nozzles 117, 120 comprise respective valves 115, 118 with which the nozzles can be closed. The tubes are connected to a pump for providing fluid under a predetermined pressure; the pump is not depicted in Fig. 1. The location of the nozzles is related to the location of the respective targets. As can be seen in Fig. 1, there is a nozzle for each of the targets. The operation of the system 100 according to the invention is as follows. When the system 100 is turned on, a graphical user interface is displayed on the display device 102. Suppose that a user wants to control the application which runs on the system 100.
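The distance-threshold acceptance logic described above can be sketched in a few lines. The threshold value, target identifiers and tie-breaking rule below are illustrative assumptions; the patent only specifies that a reading below a predetermined threshold is accepted as user input.

```python
# Sketch of proximity-based input acceptance: each sensor reports a distance
# to the nearest object, and a reading below a fixed threshold is taken as a
# touch-less "press" of the target served by that sensor.
THRESHOLD_MM = 30.0  # assumed activation distance, not from the patent

def accept_input(distances_mm):
    """Map {target_id: measured distance in mm} to the activated target id.

    Returns None when no object is within the threshold. When several
    sensors fire at once, the closest object is taken as the intended
    target (an assumed tie-breaking rule).
    """
    candidates = {t: d for t, d in distances_mm.items() if d < THRESHOLD_MM}
    if not candidates:
        return None
    return min(candidates, key=candidates.get)

assert accept_input({"112": 120.0, "116": 85.0}) is None   # hand still far away
assert accept_input({"112": 22.0, "116": 85.0}) == "112"   # finger near target 112
```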
In particular, the user wants to start the function corresponding to the target indicated with reference number 116. In order to start that function he points at that particular target 116, i.e. he moves his finger in the direction of the location of the display device 102 where that particular target 116 is displayed. As soon as the distance between the finger and the display device 102 is below a predetermined distance, the presence of the finger is observed by two sets of sensing means, i.e. a first light-generating element 110 in cooperation with a first light-sensing element 108, and a second light-generating element 114 in cooperation with a second light-sensing element 113. The presence of the finger is interpreted as user input. It should be noted that the user does not have to touch the display device 102. The user input is processed by the system. As a result the pump is activated, the valve 115 is opened and consequently a puff of gas leaves the appropriate nozzle 117. The user perceives a little pressure and hence is notified that his user input has been accepted by the system 100. It will be clear that although Fig. 1 shows a nozzle for each of the targets, other configurations are possible; for instance, a single nozzle might serve multiple targets, or vice versa. Generating the tactile feedback by means of the ejection of a fluid, preferably a puff of air, can be realized by means of several types of pressure generating means. It might be based on a compressor in conjunction with a container in which the pressure is relatively high compared with the atmosphere. Alternatively, the pressure pulse is generated at the moment it is needed. A puff of air can be generated relatively simply by means of a loudspeaker that is blocked except for a small hole: a low-frequency signal provided at the terminals of the loudspeaker will generate a flow of air.
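The control flow of the feedback path just described (accepted input, pump on, valve opened, puff from the nozzle serving the target) can be sketched as follows. The `Valve` class, pulse duration and the target-to-nozzle mapping are hypothetical stand-ins for the hardware; only the association of target 116 with nozzle 117 follows Fig. 1.

```python
# Minimal control-flow sketch for the air-puff feedback. In hardware,
# pulse() would energize a solenoid valve for the given duration while
# the pump holds the tube under pressure; here we just record the action.
class Valve:
    def __init__(self, nozzle_id):
        self.nozzle_id = nozzle_id

    def pulse(self, duration_ms):
        # Hypothetical actuation: open for duration_ms, then close.
        return f"nozzle {self.nozzle_id}: puff for {duration_ms} ms"

def give_tactile_feedback(target_id, valves):
    """Fire the nozzle associated with the activated target."""
    return valves[target_id].pulse(50)  # short pulse; 50 ms is an assumption

valves = {"116": Valve("117")}  # target 116 is served by nozzle 117 (Fig. 1)
print(give_tactile_feedback("116", valves))  # prints "nozzle 117: puff for 50 ms"
```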
A small dedicated cylinder with a piston on a permanent magnet forms the basis of an alternative implementation. The permanent-magnet piston is surrounded by a solenoid electromagnet; a short current pulse will push the magnet out and thereby actuate the piston in the cylinder. The puff of air produced is clearly noticeable. Further improvements are possible. Fig. 2 schematically shows a second embodiment of the system 200 according to the invention, comprising display means 102 for displaying a target 116 in a surface 204 which is located at a predetermined distance in front of the screen 202. There are several techniques to achieve this, i.e. to create a so-called virtual display or target. A relatively simple construction is based on a single lens or on a set of parabolic mirrors. An alternative construction for displaying targets in such a surface 204 is disclosed in US patent 6,492,961. Displaying the targets in such a surface 204 is an important measure to prevent touching of the screen 202 of the display device 102. This is a feed-forward mechanism to stop the movement of the hand of the user at the position of the visible image of the target, so that it does not continue up to the screen 202 of the display device 102 itself. Besides displaying the target 116 at a predetermined distance from the display device 102, the system 200 as depicted in Fig. 2 comprises further measures to prevent touching of the screen. First of all, the system 200 comprises tactile feedback means as described in connection with Fig. 1, i.e. a nozzle 120 for ejecting a fluid. The system 200 further comprises a loudspeaker 206 for providing audio feedback in response to the user input. The application running on the system 200 is arranged to change the appearance of the target 116 as a result of the user input. The system 200 comprises user input means for accepting the user input; these user input means are not depicted in Fig. 2. Fig.
3 schematically shows a third embodiment of the system 300 according to the invention, comprising heat generating means. This system 300 according to the invention basically comprises the same components as the system 200 described in connection with Fig. 2. The difference between the two systems relates to the implementation of the tactile feedback means. The system 300 comprises a heat generating element 302 which is arranged to produce warmth at an energy level which can be well perceived by a user without hurting the user. Preferably, the heat generating element 302 is arranged to radiate the warmth 304 primarily in a direction towards the finger or hand of the user, and preferably only during a relatively short time interval; otherwise the system 300 according to the invention would dissipate too much energy without any use. Fig. 4A schematically shows the system 400 according to the invention whereby the display means comprises a mirror. The display means comprises a display device 102 in front of which a semi-transparent mirror is located. The behavior of the display means is as follows: if the display device 102 is turned off, then the user will see a reflection of his own face and body; however, if the display device 102 is turned on, then the user will primarily see the images generated by means of the display device 102. The system 400 as depicted in Fig. 4A comprises a second display device which is disposed at the border of the mirror and which is applied to display a graphical user interface comprising the targets corresponding to buttons showing the characters "V", "A", "S" and "P". The tactile feedback means, e.g. nozzles arranged to eject air, are integrated in the frame 404 of the mirror. It will be clear that a system 400 having a mirror as screen should be kept clean; hence, it is advantageous to apply the invention in such an interactive display system 400. Fig.
4B schematically shows the inside of a car, with the system 402 according to the invention located in the dashboard of the car. Fig. 4B shows the view 424 of the road in front of the car, the steering wheel 422 of the driver and a mirror 426. The mirror 426 is optional; alternatively, the display system 402 provides the driver of the car with a view which corresponds to images captured by a camera which is located such that the scene behind the car can be imaged. The system 402 comprises sensing means 104 as described in connection with Fig. 1 or Figs. 5A and 5B. Alternatively, the system 402 comprises cameras 106a-106b located in the ceiling and/or wall of the car. These cameras 106a-106b are designed to observe the environment in front of the display device 102. If one of the cameras 106a-106b images an arm, hand or fingers, its orientation is determined. The determined orientation is used to establish at which one of the targets the user is pointing. Figs. 5A and 5B schematically show the concept of cross capacitance sensing. Fig. 5A shows a simple combination of two electrodes 502-504 forming a capacitor, with a capacitive current flowing between them. Fig. 5B shows the same two electrodes 502-504 with a hand 101 placed near the electrodes 502-504. Some of the field lines 506-510 are terminated on the hand, and the current is decreased relative to the current which flows without the hand 101 near the electrodes, as in Fig. 5A. A measurable effect can be obtained whose spatial range is roughly equivalent to the separation between the electrodes 502-504. Arrays of multiple electrodes offer rich possibilities for object and gesture recognition. For instance, n transmitters and n receivers arranged around the edge of a display provide n² electrode pairs at many different separations. See for more details the article "3D Touchless Display Interaction" by C.
van Berkel, in SID Proceedings International Symposium, vol. 33, no. 2, pp. 1410-1413, 2002. It should be noted that the above-mentioned embodiments illustrate rather than limit the invention, and that those skilled in the art will be able to design alternative embodiments without departing from the scope of the appended claims. In the claims, any reference signs placed between parentheses shall not be construed as limiting the claim. The word 'comprising' does not exclude the presence of elements or steps not listed in a claim. The word "a" or "an" preceding an element does not exclude the presence of a plurality of such elements. The invention can be implemented by means of hardware comprising several distinct elements and by means of a suitably programmed computer. In the unit claims enumerating several means, several of these means can be embodied by one and the same item of hardware. The use of the words first, second, third, etcetera does not indicate any ordering; these words are to be interpreted as names.
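The n-transmitter / n-receiver arrangement discussed in connection with Figs. 5A and 5B yields n² transmit-receive pairs. A small sketch of how such a current matrix might be scanned for a hand-induced disturbance is given below; the scanning scheme and all numeric values are illustrative assumptions, not part of the cited article or the invention.

```python
# Sketch of cross-capacitance scanning: a hand near an electrode pair
# terminates field lines and lowers the measured capacitive current.
# We look for the transmitter/receiver pair with the largest relative
# current drop compared with a hand-free baseline.
def locate_disturbance(baseline, measured):
    """Return the (tx, rx) index pair with the largest relative current drop."""
    best, best_drop = None, 0.0
    for tx, (row_b, row_m) in enumerate(zip(baseline, measured)):
        for rx, (i_b, i_m) in enumerate(zip(row_b, row_m)):
            drop = (i_b - i_m) / i_b
            if drop > best_drop:
                best, best_drop = (tx, rx), drop
    return best

# Illustrative 2x2 current matrices (arbitrary units, assumed values).
baseline = [[1.0, 0.8], [0.8, 1.0]]   # currents with no hand present
measured = [[1.0, 0.5], [0.8, 0.95]]  # hand terminates field lines near (0, 1)
assert locate_disturbance(baseline, measured) == (0, 1)
```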

Claims

CLAIMS:
1. A system (100) comprising: display means (102) for displaying a target (116); user input means (104-110) for accepting a touch-less user input of a user (101) who is pointing to the target (116); and feedback means (120) for providing a tactile feedback to the user (101) in response to the touch-less user input.
2. A system (100) as claimed in Claim 1, wherein the display means (102) comprises a display device for displaying a graphical user interface comprising the target (116).
3. A system (100) as claimed in Claim 1 or 2, wherein the feedback means is arranged to eject a fluid in a predetermined direction which is related to the touch-less user input.
4. A system (100) as claimed in Claim 3, wherein the fluid comprises air.
5. A system (300) as claimed in Claim 1 or 2, wherein the feedback means comprises a heating device for generating heat to be noticed by the user (101).
6. A system (100) as claimed in Claim 1, wherein the user input means comprises means for sensing capacitance related to the user (101).
7. A system (100) as claimed in Claim 1, wherein the user input means (104-110) comprises a light source which is arranged to cooperate with a light sensor.
8. A system (200, 300) as claimed in Claim 1, further comprising audio generating means (206) for generating audio feedback.
9. A system (200, 300) as claimed in Claim 2, wherein the display means (102) is arranged to display the target (116) a predetermined distance in front of a screen (202) of the display means (102).
10. A system (100) as claimed in Claim 2, wherein the display means (102) is arranged to display the target (116) with a first appearance corresponding to a first state and to display the target (116) with a second appearance in response to the touch-less user input.
PCT/IB2005/051616 2004-05-26 2005-05-18 User interface WO2005116801A2 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
EP04102307.8 2004-05-26
EP04102307 2004-05-26

Publications (2)

Publication Number Publication Date
WO2005116801A2 true WO2005116801A2 (en) 2005-12-08
WO2005116801A3 WO2005116801A3 (en) 2006-03-30

Family

ID=34967383

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2005/051616 WO2005116801A2 (en) 2004-05-26 2005-05-18 User interface

Country Status (1)

Country Link
WO (1) WO2005116801A2 (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7415352B2 (en) 2005-05-20 2008-08-19 Bose Corporation Displaying vehicle information
WO2010070534A1 (en) 2008-12-19 2010-06-24 Koninklijke Philips Electronics N.V. Apparatus and method for providing a user interface to an information processing system
CN103176721A (en) * 2011-12-20 2013-06-26 深圳市蓝韵网络有限公司 Method and system used for realizing medical graphic digital input and based on touch control device
US8545322B2 (en) 2001-09-28 2013-10-01 Konami Gaming, Inc. Gaming machine with proximity sensing touchless display
WO2014191036A1 (en) * 2013-05-29 2014-12-04 Brainlab Ag Gesture feedback for non-sterile medical displays

Citations (12)

Publication number Priority date Publication date Assignee Title
JPH0266825A (en) * 1988-08-31 1990-03-06 Aisin Seiki Co Ltd Space switch
JPH0346724A (en) * 1989-07-14 1991-02-28 Aisin Seiki Co Ltd Switching device
US5583478A (en) * 1995-03-01 1996-12-10 Renzi; Ronald Virtual environment tactile system
US5631861A (en) * 1990-02-02 1997-05-20 Virtual Technologies, Inc. Force feedback and texture simulating interface device
US5709219A (en) * 1994-01-27 1998-01-20 Microsoft Corporation Method and apparatus to create a complex tactile sensation
WO1998039842A1 (en) * 1997-03-06 1998-09-11 Howard Robert B Wrist-pendant wireless optical keyboard
US6031519A (en) * 1997-12-30 2000-02-29 O'brien; Wayne P. Holographic direct manipulation interface
WO2001041636A1 (en) * 1999-12-12 2001-06-14 Ralph Lander Open loop tactile feedback
US6400359B1 (en) * 1998-08-27 2002-06-04 Pentel Kabushiki Kaisha Apparatus for detecting an approaching conductor, or an approach point of a conductor and an electrostatic capacity type touch panel apparatus
US6492961B1 (en) * 1994-12-05 2002-12-10 Tietech Corporation Image forming device and touchless display switch
US20030117371A1 (en) * 2001-12-13 2003-06-26 Roberts John W. Refreshable scanning tactile graphic display for localized sensory stimulation
WO2004042693A1 (en) * 2002-11-01 2004-05-21 Immersion Corporation Method and apparatus for providing haptic feedback

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
BERKEL, C. VAN (ED.: Society for Information Display): "3D touchless display interaction", 2002 SID International Symposium Digest of Technical Papers, Boston, MA, 21-23 May 2002, vol. 33, no. 2, May 2002 (2002-05), pages 1410-1413, XP001134343, cited in the application *
HIROTA, K. ET AL.: "Development of surface display", Virtual Reality Annual International Symposium 1993, IEEE, Seattle, WA, USA, 18-22 September 1993, pages 256-262, XP010130484, ISBN: 0-7803-1363-1 *
PATENT ABSTRACTS OF JAPAN, vol. 014, no. 241 (E-0931), 22 May 1990 (1990-05-22) & JP 02 066825 A (AISIN SEIKI CO LTD), 6 March 1990 (1990-03-06) *
PATENT ABSTRACTS OF JAPAN, vol. 015, no. 185 (E-1066), 13 May 1991 (1991-05-13) & JP 03 046724 A (AISIN SEIKI CO LTD), 28 February 1991 (1991-02-28) *

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9452351B2 (en) 2001-09-28 2016-09-27 Konami Gaming, Inc. Gaming machine with proximity sensing touchless display
US8545322B2 (en) 2001-09-28 2013-10-01 Konami Gaming, Inc. Gaming machine with proximity sensing touchless display
US7415352B2 (en) 2005-05-20 2008-08-19 Bose Corporation Displaying vehicle information
WO2010070534A1 (en) 2008-12-19 2010-06-24 Koninklijke Philips Electronics N.V. Apparatus and method for providing a user interface to an information processing system
US9752568B2 (en) 2008-12-19 2017-09-05 Koninklijke Philips N.V. Apparatus and method for providing a user interface to an information processing system
CN102257455A (en) * 2008-12-19 2011-11-23 皇家飞利浦电子股份有限公司 Apparatus and method for providing a user interface to an information processing system
CN103176721A (en) * 2011-12-20 2013-06-26 深圳市蓝韵网络有限公司 Method and system used for realizing medical graphic digital input and based on touch control device
WO2014191036A1 (en) * 2013-05-29 2014-12-04 Brainlab Ag Gesture feedback for non-sterile medical displays
WO2014191341A1 (en) * 2013-05-29 2014-12-04 Brainlab Ag Gesture feedback for non-sterile medical displays
US20160132122A1 (en) * 2013-05-29 2016-05-12 Brainlab Ag Gesture Feedback for Non-Sterile Medical Displays

Also Published As

Publication number Publication date
WO2005116801A3 (en) 2006-03-30

Similar Documents

Publication Publication Date Title
EP2702468B1 (en) Electro-vibrotactile display
CN105353877B (en) System and method for friction displays and additional haptic effects
CN101847068B (en) Method and apparatus for operating touch panel
US8259240B2 (en) Multi-touch sensing through frustrated total internal reflection
DE102009032637B4 (en) Image magnification system for a computer interface
US8587549B2 (en) Virtual object adjustment via physical object detection
JP5948021B2 (en) Electric vibration for touch surface
CN101385069B (en) User input apparatus, system, method and computer program for use with a screen having a translucent surface
Hodges et al. ThinSight: versatile multi-touch sensing for thin form-factor displays
US9910502B2 (en) Gesture-based user-interface with user-feedback
US10191547B2 (en) Tactile sensation providing apparatus and control method for tactile sensation providing apparatus
JP4654211B2 (en) Force / position sensing display
CA2481396C (en) Gesture recognition method and touch system incorporating the same
CN102016713B (en) Projection of images onto tangible user interfaces
US20190253672A1 (en) Vehicular vision system with split display
US9092129B2 (en) System and method for capturing hand annotations
US20100097323A1 (en) Hydrogel-based tactile-feedback touch screen
US20120113051A1 (en) Touch- or proximity -sensitive interface
JP2011503756A (en) Touchpad with combined display and proximity and touch detection capabilities
US7176885B2 (en) Retaskable switch-indicator unit
JP5411265B2 (en) Multi-touch touch screen with pen tracking
US20120223880A1 (en) Method and apparatus for producing a dynamic haptic effect
US20080278450A1 (en) Method and Device for Preventing Staining of a Display Device
CN1996202B (en) Embedded camera with privacy filter
US9037354B2 (en) Controlling vehicle entertainment systems responsive to sensed passenger gestures

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A2

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KM KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NG NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SM SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW

AL Designated countries for regional patents

Kind code of ref document: A2

Designated state(s): BW GH GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LT LU MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

121 Ep: the EPO has been informed by WIPO that EP was designated in this application
NENP Non-entry into the national phase in:

Ref country code: DE

WWW Wipo information: withdrawn in national office

Country of ref document: DE

122 Ep: PCT application non-entry in European phase