WO2001031424A2 - Interactive system - Google Patents
- Publication number
- WO2001031424A2 (PCT/DE2000/003716)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- arrangement according
- interface
- surface unit
- wireless interface
- module box
- Prior art date
Links
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/042—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
- G06F3/0425—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means using a single imaging device like a video camera for tracking the absolute position of a single or a plurality of objects with respect to an imaged reference surface, e.g. video camera imaging a display or a projection screen, a table or a wall surface, on which a computer generated image is displayed or projected
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
Definitions
- the invention relates to an arrangement for interaction
- the object to be recognized is a pointer unit, e.g. a hand or a pointing stick.
- a user can interact with a graphical interface by triggering the appropriate actions via gesture control.
- Such an arrangement is also referred to as a gesture computer or a "virtual touch screen".
- the virtual touch screen works by projecting the interface onto a predetermined area; the projection preferably contains virtual buttons whose actuation triggers predetermined actions.
- one trigger mechanism is the user pointing, with the hand as pointer unit, at one of the buttons. After a predetermined period of time, the action associated with the button is triggered.
- the position of the user's hand is recorded by a camera, the picture is digitized, and a computer determines the position of the hand relative to the projection. In this way it can be determined whether a virtual button has been operated.
- This release mechanism can be compared in its effect with the operation of a computer mouse as an input medium for a graphic interface.
- the mouse pointer is moved by hand on the virtual touch screen; instead of double-clicking, the hand is held over a selected button for a predetermined minimum period.
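The dwell trigger described above can be sketched in a few lines. This is an illustrative sketch, not the patented implementation: the button geometry, the dwell period, and all names are assumptions; in practice the pointer coordinates would come from the camera-based hand detection.

```python
# Illustrative sketch of the dwell trigger (names and the dwell period
# are assumptions, not taken from the patent).
DWELL_SECONDS = 1.5  # assumed minimum dwell period

class VirtualButton:
    def __init__(self, x, y, w, h, action):
        self.rect = (x, y, w, h)   # button area in projection coordinates
        self.action = action       # callback fired after dwelling
        self._entered_at = None    # time the pointer entered the button

    def contains(self, px, py):
        x, y, w, h = self.rect
        return x <= px < x + w and y <= py < y + h

    def update(self, px, py, now):
        """Feed one detected hand position; return True if the action fired."""
        if not self.contains(px, py):
            self._entered_at = None
            return False
        if self._entered_at is None:
            self._entered_at = now
        if now - self._entered_at >= DWELL_SECONDS:
            self._entered_at = None  # re-arm after firing
            self.action()
            return True
        return False
```

Each camera frame feeds one `update` call; the button fires only after the hand has stayed inside its area for the full dwell period, mirroring the "dwell instead of double-click" behavior described above.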
- the possibility of additional lighting of the user interface with waves in the invisible spectral range is presented.
- the projection can be appropriately differentiated from the pointer unit by illuminating the user interface with waves in the invisible spectral range.
- when illuminated with infrared light, the user interface reflects the infrared light more strongly than the pointer unit (hand); the hand therefore has a higher absorption than the user interface.
- this effect can be enhanced by additionally providing the control surface with a reflective layer, so that the absorption of the infrared light on the control surface is as low as possible. The hand then appears dark in front of the user interface.
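The brightness difference that this reflective layer creates can be exploited with a simple threshold. A minimal sketch, assuming an 8-bit grayscale IR camera image and an illustrative threshold value; the patent itself does not prescribe a particular segmentation method:

```python
import numpy as np

# Sketch only: under IR illumination the reflective surface appears bright
# and the absorbing hand appears dark, so thresholding separates them.
def segment_hand(ir_image, threshold=100):
    """Boolean mask that is True where the (dark) hand is assumed to be."""
    return ir_image < threshold

# Toy 3x3 "camera frame": bright surface pixels and a dark hand region.
frame = np.array([[200, 210,  40],
                  [205,  35,  30],
                  [210, 208, 202]], dtype=np.uint8)
mask = segment_hand(frame)  # True at the three dark (hand) pixels
```

The centroid of the `True` pixels would then give the hand position that the dwell trigger evaluates.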
- however, if input components are also to be designed on the user interface, then these, and in particular their lines to the actuating unit, are potentially exposed to vandalism.
- the object of the invention is to provide an arrangement for interaction which also has components on a surface unit, the electrical connections of which are protected against vandalism.
- the module box has the following components in particular:
- a projector with which an image can be displayed on the surface unit.
- a camera that captures the surface unit.
- a computer that controls the projector and evaluates the recording of the camera in such a way that a movement or a dwelling of an interaction component on or in front of the surface unit is interpretable as the functionality of an input pointer.
- This wireless interface is in particular a transmission interface for the transmission of data and / or energy.
- the interaction component is a hand or a finger of a user or a pointer unit.
- the user can make an entry on the surface unit, onto which a user interface (graphical user interface, GUI) is expediently projected by the projector.
- the user moves his hand over the user interface, a mouse pointer being controlled by the movement of the hand.
- a predetermined action can be triggered on the user interface, for example, by holding the hand for a predetermined period of time at a specific point on the projection. This lingering triggers the action associated with the corresponding element of the projected user interface. This corresponds to controlling a user interface with a computer mouse on a conventional personal computer.
- At least one wireless interface is a sound interface.
- the sound interface can preferably be an ultrasound interface.
- the ultrasound interface expediently has an ultrasound transmitter and an ultrasound receiver, which can be implemented on the surface unit and in the module box.
- One embodiment consists of the at least one wireless interface being an optical interface.
- the optical interface can have an infrared transmitter and an infrared receiver.
- the infrared transmitter is preferably an infrared light-emitting diode or an infrared laser diode.
- the at least one wireless interface is an electromagnetic interface.
- the transmitter and the receiver of the electromagnetic interface can each be designed as one of the following components: a) antenna coil, b) slot antenna, c) dipole antenna.
- a further embodiment consists in the respective transmitter of the at least one wireless interface being arranged in the module box and the respective receiver of the at least one wireless interface in the surface unit.
- correspondingly, the respective transmitter of the at least one wireless interface can be arranged in the surface unit and the respective receiver of the at least one wireless interface in the module box.
- an input device is provided in the surface unit.
- the input device can include the following: a) joystick, b) button, c) rocker arm, d) distance-sensitive sensor, e) touch-sensitive sensor, f) pressure-dependent resistance, g) spring plate, h) switch.
- the surface unit has a sensor for converting light energy into electrical energy, the sensor supplying the surface unit with current.
- the sensor for converting light energy into electrical energy can be a solar cell.
- Fig. 1 shows an arrangement for interaction;
- Fig. 2 shows a processor unit;
- Fig. 3 shows an arrangement for interaction with a surface unit;
- Fig. 4 shows a surface unit with possible input devices.
- Fig. 1 shows how a virtual touch screen works.
- the surface unit in the form of a passive user interface BOF is presented.
- Surface unit is understood in particular to mean the physical expression of the base surface onto which a projection is made.
- the content of the projection is e.g. the BOF user interface.
- the user interface BOF (interaction area, image of a graphical user interface GUI) is mapped to a predeterminable area (surface), here a projection display PD (interaction area).
- the projection display PD replaces a conventional screen. The input is made by pointing directly with the interaction component, here a hand H, at the projection display PD.
- in Fig. 1, the projection display PD is illuminated with infrared light from an infrared light source IRL.
- the infrared light source IRL can advantageously be designed using infrared light-emitting diodes.
- a camera K, preferably with a special infrared filter IRF so that it is particularly sensitive in an infrared spectral range, records the projection display PD including the hand H of the user.
- the operating surface BOF is imaged on the projection display PD.
- the user interface BOF can be configured as a graphical user interface (GUI) on a monitor of the computer R.
- a mouse pointer MZ is moved by the hand H of the user.
- a pointer unit can also be used as the interaction component.
- if the hand H is moved to the field F, the mouse pointer MZ follows the hand H.
- if the hand H remains above the field F for a predeterminable period of time, the function associated with field F is triggered on computer R. In conventional systems, this corresponds to a mouse double-click when the mouse pointer is over a virtual switch.
- a processor unit PRZE (computer) is shown in FIG.
- the processor unit PRZE comprises a processor CPU, a memory MEM and an input/output interface IOS, which is used in different ways via an interface IFC: output becomes visible on a monitor MON, in particular via a projector, and/or is printed on a printer PRT; input is made using a mouse MAS or a keyboard TAST.
- the processor unit PRZE also has a data bus BUS, which ensures the connection of a memory MEM, the processor CPU and the input / output interface IOS. Additional components can also be connected to the data bus BUS, for example additional memory, data memory (hard disk), camera, frame grabber, detection unit, input devices, network connections or scanners.
- Fig. 3 shows an arrangement for interaction with a surface unit that comprises additional functional units.
- a module box 111 comprises a computer (processor unit, evaluation unit) 112, a camera (detection unit for waves, preferably for the radiation emitted by the infrared light source) 113, a projector 127, and an infrared light source 128 (source of waves of non-visible light, preferably infrared light).
- the surface unit 121 comprises a multiplicity of components which, in addition to the contactless interaction described above, also enable a touch-sensitive interaction.
- a laser diode 114 which is provided in the module box, radiates specifically onto a photodiode 120, which is arranged in the surface unit 121.
- Optical energy is thus transmitted from laser diode 114 to the associated sensor (photodiode 120).
- energy or data/signals are transmitted from an optical transmitter 119, for example a light-emitting diode or a laser diode, both of which preferably emit light in the non-visible spectral range, to a photodiode 115.
- the energy transmitted, in particular from the module box 111 to the surface unit 121, can be used, after conversion into electrical energy at the receiver, to operate an input device or the surface unit itself. Parts of the surface unit can be made interchangeable.
- a joystick 118 is shown as a signaling input device that controls the optical transmitter 119.
- the signals generated by interaction with the joystick 118 are transmitted to the evaluation unit 112 via the duplex connection formed by transmitters 119, 114 and receivers 120, 115.
- the transmission takes place without electrical lines and is thus protected against vandalism and invisible to the user.
- the surface unit 121 is supplied with energy via a solar cell 122.
- energy can be transmitted from the module box 111 to the surface unit 121 in a targeted manner by the laser diode 114 feeding the optical receiver 120, which converts the light energy into electrical energy for operating the surface unit.
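The optical energy link can be illustrated with a back-of-envelope calculation. All numbers and function names below are assumptions for illustration; the patent only states that the receiver (photodiode 120 / solar cell 122) converts light energy into electrical energy:

```python
# Illustrative power-budget sketch; values are assumptions, not from the patent.
def received_electrical_power(optical_power_mw, conversion_efficiency):
    """Electrical power (mW) available after photovoltaic conversion."""
    return optical_power_mw * conversion_efficiency

def can_power(load_mw, optical_power_mw, conversion_efficiency):
    """True if the converted power covers the surface unit's load."""
    return received_electrical_power(optical_power_mw, conversion_efficiency) >= load_mw
```

For example, an assumed 40 mW optical beam converted at 25 % efficiency yields 10 mW of electrical power, enough for a low-power input device but not an arbitrary load.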
- FIG. 3 additionally shows a full-duplex transmitter-receiver connection based on ultrasound.
- ultrasonic transmitters 123 and 125 or ultrasonic receivers 124 and 126 are provided.
- a wireless interface of an electromagnetic type is shown in the form of the receiving and transmitting coils 130 and 116.
- a keypad 117 emits signals which are transmitted via the transmission coil 116 from the surface unit 121 to the module box 111, where they are picked up by the reception coil 130.
- the computer 112 decodes the transmitted signals into predetermined actions.
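This decoding step can be sketched as a simple lookup from received codes to actions. The code values and action names are invented for illustration; the patent does not specify how the signals are coded:

```python
# The code values and action names below are illustrative assumptions;
# the patent does not specify the signal coding.
ACTION_TABLE = {
    0x01: "menu_up",
    0x02: "menu_down",
    0x03: "select",
}

def decode(received_codes):
    """Map a sequence of received codes to named actions,
    skipping codes that have no assigned action."""
    return [ACTION_TABLE[c] for c in received_codes if c in ACTION_TABLE]
```

Skipping unassigned codes makes the sketch tolerant of transmission noise on the wireless link; a real evaluation unit would likely add a checksum as well.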
- FIG. 4 shows a surface unit 216 with possible input devices.
- An area 211 is provided as an information area.
- two light-sensitive input devices (photodiodes) are suitable for detecting certain ambient lighting (optical buttons).
- further shown are a joystick 214, a pushbutton or push switch 217, a projection surface 218, a three-dimensional design of an interaction surface in the form of an elevation 219 and a depression 220, and a surface 221, which is recorded by the camera 113 in Fig. 3.
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Multimedia (AREA)
- Position Input By Displaying (AREA)
- Transforming Electric Information Into Light Information (AREA)
- Projection Apparatus (AREA)
Abstract
Description
Claims
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE1999151322 DE19951322A1 (en) | 1999-10-25 | 1999-10-25 | Interactive arrangement of a virtual touch screen type has a wireless connection between an operating module and the virtual touch screen operating surface to improve resistance to vandalism |
DE19951322.8 | 1999-10-25 |
Publications (2)
Publication Number | Publication Date |
---|---|
WO2001031424A2 true WO2001031424A2 (en) | 2001-05-03 |
WO2001031424A3 WO2001031424A3 (en) | 2001-11-29 |
Family
ID=7926770
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/DE2000/003716 WO2001031424A2 (en) | 1999-10-25 | 2000-10-20 | Interactive system |
Country Status (2)
Country | Link |
---|---|
DE (1) | DE19951322A1 (en) |
WO (1) | WO2001031424A2 (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN1661538B (en) * | 2004-02-27 | 2010-05-05 | 三星电子株式会社 | Pointing device for a terminal having a touch screen and method for using the same |
Families Citing this family (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE20216904U1 (en) | 2002-11-02 | 2003-01-02 | MAN Roland Druckmaschinen AG, 63075 Offenbach | Data entry for a printing press |
DE10260305A1 (en) * | 2002-12-20 | 2004-07-15 | Siemens Ag | HMI setup with an optical touch screen |
DE102005001417B4 (en) | 2004-01-29 | 2009-06-25 | Heidelberger Druckmaschinen Ag | Projection screen-dependent display / operating device |
DE102008046092A1 (en) * | 2007-09-06 | 2009-09-03 | Bernd Hopp | Endless navigator for e.g. controlling, e.g. data, in computer, has operating region with control unit, where borders of operating region are arranged such that functional unit is removed from and integrated to operating region |
DE102011119082A1 (en) * | 2011-11-21 | 2013-05-23 | Übi UG (haftungsbeschränkt) | Device arrangement for providing interactive screen of picture screen, has pointer which scans gestures in close correlation with screen, and control unit is configured to interpret scanned gestures related to data as user input |
DE102016224260A1 (en) * | 2016-12-06 | 2018-06-07 | Bayerische Motoren Werke Aktiengesellschaft | User interface, means of locomotion and method for entering information into a means of transportation |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE19708240A1 (en) * | 1997-02-28 | 1998-09-10 | Siemens Ag | Arrangement for the detection of an object in a region illuminated by waves in the invisible spectral range |
DE19734511A1 (en) * | 1997-08-08 | 1999-02-11 | Siemens Ag | Communication device |
DE19806021A1 (en) * | 1998-02-13 | 1999-08-19 | Siemens Nixdorf Inf Syst | Device with virtual input device |
WO2000055802A1 (en) * | 1999-03-17 | 2000-09-21 | Siemens Aktiengesellschaft | Interaction device |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5528263A (en) * | 1994-06-15 | 1996-06-18 | Daniel M. Platzker | Interactive projected video image display system |
DE29804165U1 (en) * | 1998-03-09 | 1998-05-07 | Scm Microsystems Gmbh | Peripheral data communication device |
- 1999-10-25 DE DE1999151322 patent/DE19951322A1/en not_active Withdrawn
- 2000-10-20 WO PCT/DE2000/003716 patent/WO2001031424A2/en active Application Filing
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE19708240A1 (en) * | 1997-02-28 | 1998-09-10 | Siemens Ag | Arrangement for the detection of an object in a region illuminated by waves in the invisible spectral range |
DE19734511A1 (en) * | 1997-08-08 | 1999-02-11 | Siemens Ag | Communication device |
DE19806021A1 (en) * | 1998-02-13 | 1999-08-19 | Siemens Nixdorf Inf Syst | Device with virtual input device |
WO2000055802A1 (en) * | 1999-03-17 | 2000-09-21 | Siemens Aktiengesellschaft | Interaction device |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN1661538B (en) * | 2004-02-27 | 2010-05-05 | 三星电子株式会社 | Pointing device for a terminal having a touch screen and method for using the same |
Also Published As
Publication number | Publication date |
---|---|
DE19951322A1 (en) | 2001-04-26 |
WO2001031424A3 (en) | 2001-11-29 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
EP0963563B1 (en) | Method and device for detecting an object in an area radiated by waves in the invisible spectral range | |
DE69130282T2 (en) | Position and function input system for large surface display | |
DE4423005C1 (en) | Computer data entry stylus with indistinguishable contact surfaces | |
DE69331614T2 (en) | Optical reader | |
EP1998996B1 (en) | Interactive operating device and method for operating the interactive operating device | |
DE69914659T2 (en) | Input device for computers in the form of a pen | |
DE20221921U1 (en) | Portable electronic device with mouse-like capabilities | |
EP2325727B1 (en) | Drawing, writing and pointing device for human-computer interaction | |
EP2130109B1 (en) | Mobile communication device and input device for the same | |
EP2016480B1 (en) | Optoelectronic device for the detection of the position and/or movement of an object, and associated method | |
EP2315103A2 (en) | Touchless pointing device | |
EP1184804B1 (en) | Image reproduction system | |
DE20111879U1 (en) | Universal display device | |
DE102004044999A1 (en) | Input control for devices | |
WO2013034294A1 (en) | Control device for a motor vehicle and method for operating the control device for a motor vehicle | |
WO2000023938A1 (en) | Input device for a computer | |
EP2331362A2 (en) | Operator control apparatus and method for operating an operator control apparatus with improved approach sensing | |
WO2001031424A2 (en) | Interactive system | |
EP1161740A1 (en) | Interaction device | |
CN105700715B (en) | A kind of Pen for turning page and its control method of controllable computer cursor | |
US6770864B2 (en) | Light beam operated personal interfaces to computers | |
WO1992005483A1 (en) | Data input device | |
EP1603011B1 (en) | Power saving in coordinate input device | |
DE4321825C2 (en) | Computer input or control device for rooms with high EM interference, explosive or contaminated rooms | |
DE10124834C2 (en) | Method for entering information for a human-machine interface, pen for carrying out the method and device for entering information by means of the method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AK | Designated states |
Kind code of ref document: A2 Designated state(s): JP US |
|
AL | Designated countries for regional patents |
Kind code of ref document: A2 Designated state(s): AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
DFPE | Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101) | ||
AK | Designated states |
Kind code of ref document: A3 Designated state(s): JP US |
|
AL | Designated countries for regional patents |
Kind code of ref document: A3 Designated state(s): AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2000987005 Country of ref document: EP |
|
NENP | Non-entry into the national phase |
Ref country code: JP |
|
WWW | Wipo information: withdrawn in national office |
Ref document number: 2000987005 Country of ref document: EP |
|
122 | Ep: pct application non-entry in european phase |