EP1161740A1 - Installation pour interaction - Google Patents

Installation pour interaction (Arrangement for interaction)

Info

Publication number
EP1161740A1
Authority
EP
European Patent Office
Prior art keywords
camera
arrangement according
user interface
projection surface
interaction
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP00920363A
Other languages
German (de)
English (en)
Inventor
Christoph Maggioni
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Siemens AG
Original Assignee
Siemens AG
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Siemens AG filed Critical Siemens AG
Publication of EP1161740A1 publication Critical patent/EP1161740A1/fr

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/042Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
    • G06F3/0425Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means using a single imaging device like a video camera for tracking the absolute position of a single or a plurality of objects with respect to an imaged reference surface, e.g. video camera imaging a display or a projection screen, a table or a wall surface, on which a computer generated image is displayed or projected

Definitions

  • the invention relates to an arrangement for interaction.
  • a so-called virtual touchscreen is known from [1].
  • using an interaction component, e.g. a hand or a pointer, on an interaction surface onto which a graphical user interface is preferably projected, it is possible to interact directly on the graphical user interface; the conventional division between the display of the user interface and the touchscreen is thereby omitted.
  • a flat speaker (surface speaker) is known from [2].
  • a miniature camera, also known as a "keyhole camera", with a very small lens diameter of, for example, 1 mm is likewise known and can be purchased from electronics retailers.
  • the object of the invention is to provide an arrangement for interaction, on the basis of which a user can be recorded when viewing a display area as if he were looking directly into a camera. This object is achieved according to the features of the independent claim. Further developments of the invention also result from the dependent claims.
  • an arrangement for interaction which has a projection surface which is arranged so that it can be seen by a user.
  • a camera is also provided, which is arranged in the projection surface.
  • the above-mentioned miniature camera with a small lens diameter is particularly suitable for this.
  • a hole on the order of magnitude of the lens diameter is advantageously provided in the projection surface.
  • the camera is located behind this hole. Given the small lens diameter, such a small hole in the projection surface is no longer distracting.
  • this allows the face of the user to be recorded from the front for every viewing direction onto the display area.
  • in a service such as video telephony, in which the face of an addressee is displayed within the projection surface, the addressee has the impression that the user is looking him in the eye. The annoying effect in video telephony of the participants appearing to look past each other is thereby avoided.
  • a further development consists in that a dark spot, at least comprising the lens of the camera, is projected onto the location of the camera. This ensures good quality in the recognition of the participant.
  • a processor unit is provided which is set up in such a way that a (graphical) user interface can be represented on the projection surface (also: interaction surface).
  • a further camera can be provided, by means of which the user interface can be recorded.
  • an interaction component, in particular a hand or a finger of the user, can thus be given the functionality of an input pointer on the projection surface.
  • a graphical user interface is projected onto the interaction surface.
  • the camera records the user interface. If the interaction component, e.g. the hand or a finger of the user, moves over the user interface, the interaction component is recorded, and depending on its position a function shown on the user interface is triggered by the processor unit. In other words, the interaction component takes on the functionality of an input pointer, in particular a (computer) mouse pointer, on the user interface.
  • a (triggering) event (in the analogous example with the computer mouse: a click or a double-click) can in particular be a dwelling of the interaction component for a predetermined period of time at the position associated with the function.
  • the interaction surface can be illuminated with infrared light.
  • the recording camera can be set up in such a way that it is sensitive (especially) to the spectral range of the infrared light. This results in increased insensitivity to the influence of extraneous light.
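The infrared-based robustness described above can be illustrated with a minimal sketch: in a frame from the infrared-sensitive camera, the hand reflects the IR illumination strongly and can be separated from the background by a simple brightness threshold. The NumPy implementation, threshold value and frame size below are illustrative assumptions, not details taken from the patent:

```python
import numpy as np

def segment_hand(ir_frame: np.ndarray, threshold: int = 180) -> np.ndarray:
    """Boolean mask of pixels bright enough in the infrared frame to
    belong to the hand: skin reflects the IR illumination strongly,
    while extraneous visible light is suppressed by the IR filter."""
    return ir_frame >= threshold

def hand_centroid(mask: np.ndarray):
    """Centroid (row, col) of the masked pixels, or None when no
    hand is present in the frame."""
    ys, xs = np.nonzero(mask)
    if len(xs) == 0:
        return None
    return (float(ys.mean()), float(xs.mean()))

# Synthetic 8-bit IR frame: dark background with a bright "hand" blob.
frame = np.zeros((120, 160), dtype=np.uint8)
frame[40:80, 60:100] = 220
mask = segment_hand(frame)
print(hand_centroid(mask))  # → (59.5, 79.5), the centre of the blob
```

A real gesture computer would of course track shape as well as position, but the centroid alone is already enough to drive the mouse-pointer behaviour described in the embodiment.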
  • the arrangement described is suitable for use in a virtual touchscreen or in a videophone; the videophone can also be a special application of the virtual touchscreen.
  • One embodiment consists in that the projection surface (interaction surface) is designed as a flat loudspeaker.
  • FIG. 1 shows an arrangement for interaction
  • FIG. 2 shows a processor unit
  • an arrangement of a virtual touchscreen is described with reference to FIG. 1.
  • an interaction surface (graphical user interface BOF) is mapped onto a predefinable area, here a projection display PD.
  • the PD projection display replaces a conventional screen.
  • the input is made by pointing directly with the interaction component, the hand H, on the user interface BOF.
  • the gestures are recognized and positioned within the user interface BOF by a video-based system (gesture computer) that is able, for example, to recognize and track the position and shape of the human hand in real time.
  • for this purpose, the projection display PD is illuminated with infrared light.
  • the infrared light source IRL can advantageously be designed using infrared light-emitting diodes.
  • a camera K, which is preferably fitted with a special infrared filter IRF and is thus sensitive in the infrared spectral range, records the user interface BOF that is mapped onto the projection display PD.
  • the user interface BOF can be designed as a menu system on a monitor of the computer R.
  • a mouse pointer MZ is moved by the hand H of the user. Instead of the hand H, a pointer can also be used as an interaction component.
  • if the hand H is moved to the field F, the mouse pointer MZ follows the hand H. If the hand H remains above the field F for a predefinable time duration, the function associated with the field F is triggered on the computer R.
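The dwell-based triggering just described (the hand resting over the field F for a predefinable time duration, in place of a mouse click) can be sketched as a small state machine. Class and parameter names are hypothetical; the patent does not prescribe an implementation:

```python
class DwellTrigger:
    """Triggers a field's function once the tracked hand position has
    stayed inside the field's rectangle for a given dwell time; this
    replaces the mouse click in the virtual-touchscreen interaction.
    Timestamps are passed in explicitly (e.g. from the camera)."""

    def __init__(self, field_rect, dwell_seconds=1.0):
        self.x0, self.y0, self.x1, self.y1 = field_rect
        self.dwell_seconds = dwell_seconds
        self._entered_at = None

    def _inside(self, x, y):
        return self.x0 <= x <= self.x1 and self.y0 <= y <= self.y1

    def update(self, x, y, t):
        """Feed one tracked position; returns True exactly once when
        the dwell time over the field has elapsed."""
        if not self._inside(x, y):
            self._entered_at = None          # left the field: reset
            return False
        if self._entered_at is None:
            self._entered_at = t             # just entered the field
        if t - self._entered_at >= self.dwell_seconds:
            self._entered_at = None          # re-arm after triggering
            return True
        return False

trigger = DwellTrigger(field_rect=(10, 10, 50, 30), dwell_seconds=1.0)
print(trigger.update(20, 20, t=0.0))  # False: just entered the field
print(trigger.update(22, 21, t=0.5))  # False: dwell time not yet elapsed
print(trigger.update(21, 20, t=1.1))  # True: the field's function fires
```

One such trigger per field on the user interface BOF is enough; the computer R would call `update` once per camera frame with the current hand position.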
  • the user interface BOF is preferably designed as a flat loudspeaker, so that the sound emanates from the surface of the user interface.
  • the flat speaker is controlled by means of a control line SL via the computer R.
  • a user KPF talks to the representation of his addressee GES.
  • the user KPF looks at the representation and makes virtual eye contact with the addressee GES (indicated by the line of sight SEHL).
  • the user KPF is recorded frontally by a view camera KAM, which is located in the projection surface, preferably within the displayed image of the face of the addressee GES, and this image is transmitted to the addressee GES.
  • the image of the user KPF is preferably fed via a camera line KAML into the computer R and transmitted from there, for example via a telephone line, to the addressee GES.
  • both participants, the user KPF and the addressee GES, thus have the impression during the "videotelephony" service that they are in direct eye contact with each other.
  • a dark field corresponding to the size of the lens diameter of the view camera KAM is projected at the location of the view camera KAM by means of the projector P. This enables an interference-reduced, high-quality recording of the user KPF.
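Projecting a dark field at the camera position amounts to blackening a small circular region of every frame before it reaches the projector P, so no projector light falls into the lens. A minimal sketch, assuming a grayscale frame buffer held as a NumPy array (the coordinates and radius are illustrative):

```python
import numpy as np

def mask_camera_spot(frame: np.ndarray, center, radius: int) -> np.ndarray:
    """Return a copy of the projected frame with a filled black circle
    at the camera position, covering at least the camera lens."""
    h, w = frame.shape[:2]
    cy, cx = center
    yy, xx = np.ogrid[:h, :w]                              # coordinate grids
    spot = (yy - cy) ** 2 + (xx - cx) ** 2 <= radius ** 2  # circular mask
    out = frame.copy()
    out[spot] = 0
    return out

# Uniformly lit 8-bit grayscale frame; the camera is assumed at (60, 80).
frame = np.full((120, 160), 255, dtype=np.uint8)
masked = mask_camera_spot(frame, center=(60, 80), radius=5)
```

Applying this to every frame keeps the hole in the projection surface dark regardless of what the user interface currently displays.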
  • the view camera is preferably designed as a miniature camera with a small diameter.
  • the processor unit PRZE comprises a processor CPU, a memory SPE and an input/output interface IOS, which is used in different ways via an interface IFC: output is displayed on a monitor MON and/or printed on a printer PRT via a graphics interface; input is made using a mouse MAS or a keyboard TAST.
  • the processor unit PRZE also has a data bus BUS, which connects a memory MEM, the processor CPU and the input/output interface IOS.
  • additional components can be connected to the data bus BUS, for example additional memory, data storage (hard disk) or scanner.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Position Input By Displaying (AREA)

Abstract

The invention concerns an arrangement for interaction that has a projection surface arranged so that it can be observed by a user. The arrangement also has a camera that is arranged in the projection surface.
EP00920363A 1999-03-17 2000-03-01 Installation pour interaction Withdrawn EP1161740A1 (fr)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
DE19911985 1999-03-17
DE19911985 1999-03-17
PCT/DE2000/000637 WO2000055802A1 (fr) 1999-03-17 2000-03-01 Installation pour interaction

Publications (1)

Publication Number Publication Date
EP1161740A1 (fr) 2001-12-12

Family

ID=7901363

Family Applications (1)

Application Number Title Priority Date Filing Date
EP00920363A Withdrawn EP1161740A1 (fr) 1999-03-17 2000-03-01 Installation pour interaction

Country Status (4)

Country Link
US (1) US20020041325A1 (fr)
EP (1) EP1161740A1 (fr)
JP (1) JP2002539742A (fr)
WO (1) WO2000055802A1 (fr)

Families Citing this family (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE19951322A1 (de) * 1999-10-25 2001-04-26 Siemens Ag Anordnung zur Interaktion
AU2002951208A0 (en) * 2002-09-05 2002-09-19 Digislide International Pty Ltd A portable image projection device
AU2007249116B2 (en) * 2001-09-14 2010-03-04 Accenture Global Services Limited Lab window collaboration
US7007236B2 (en) 2001-09-14 2006-02-28 Accenture Global Services Gmbh Lab window collaboration
US20040205256A1 (en) * 2002-11-27 2004-10-14 Richard Hoffman System and method for communicating between two or more locations
NO318883B1 (no) 2003-04-07 2005-05-18 Tandberg Telecom As Arrangement og fremgangsmate for forbedret kommunikasjon mellom deltakere i en videokonferanse
KR100539904B1 (ko) * 2004-02-27 2005-12-28 삼성전자주식회사 터치 스크린을 구비한 단말기에 사용되는 포인팅 디바이스및 그 사용 방법
US7949616B2 (en) * 2004-06-01 2011-05-24 George Samuel Levy Telepresence by human-assisted remote controlled devices and robots
JP2008009572A (ja) * 2006-06-27 2008-01-17 Fuji Xerox Co Ltd ドキュメント処理システム、ドキュメント処理方法及びプログラム
US8589824B2 (en) 2006-07-13 2013-11-19 Northrop Grumman Systems Corporation Gesture recognition interface system
US9696808B2 (en) 2006-07-13 2017-07-04 Northrop Grumman Systems Corporation Hand-gesture recognition method
US8972902B2 (en) 2008-08-22 2015-03-03 Northrop Grumman Systems Corporation Compound gesture recognition
US8432448B2 (en) 2006-08-10 2013-04-30 Northrop Grumman Systems Corporation Stereo camera intrusion detection system
JP5053655B2 (ja) * 2007-02-20 2012-10-17 キヤノン株式会社 映像装置および画像通信装置
JP2008227883A (ja) * 2007-03-13 2008-09-25 Brother Ind Ltd プロジェクタ
US8139110B2 (en) 2007-11-01 2012-03-20 Northrop Grumman Systems Corporation Calibration of a gesture recognition interface system
US9377874B2 (en) 2007-11-02 2016-06-28 Northrop Grumman Systems Corporation Gesture recognition light and video image projector
US8345920B2 (en) 2008-06-20 2013-01-01 Northrop Grumman Systems Corporation Gesture recognition interface system with a light-diffusive screen
CA2767714C (fr) * 2009-07-10 2017-09-26 Bio2 Technologies, Inc. Structure de tissu tridimensionnel resorbable fabrique a partir de fibres de verre bioactif liees par du verre bioactif
US20110206828A1 (en) * 2009-07-10 2011-08-25 Bio2 Technologies, Inc. Devices and Methods for Tissue Engineering

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5436639A (en) * 1993-03-16 1995-07-25 Hitachi, Ltd. Information processing system
DE69430967T2 (de) * 1993-04-30 2002-11-07 Xerox Corp Interaktives Kopiersystem
US5732227A (en) * 1994-07-05 1998-03-24 Hitachi, Ltd. Interactive information processing system responsive to user manipulation of physical objects and displayed images
DE19708240C2 (de) * 1997-02-28 1999-10-14 Siemens Ag Anordnung und Verfahren zur Detektion eines Objekts in einem von Wellen im nichtsichtbaren Spektralbereich angestrahlten Bereich
DE19734511A1 (de) * 1997-08-08 1999-02-11 Siemens Ag Kommunikationseinrichtung

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See references of WO0055802A1 *

Also Published As

Publication number Publication date
WO2000055802A1 (fr) 2000-09-21
JP2002539742A (ja) 2002-11-19
US20020041325A1 (en) 2002-04-11

Similar Documents

Publication Publication Date Title
EP1161740A1 (fr) Installation pour interaction
DE102009032637B4 (de) Bildvergrößerungssystem für eine Computerschnittstelle
EP0963563B1 (fr) Agencement et procede pour la detection d'un objet dans une zone irradiee par des ondes dans le domaine spectral invisible
DE102008000001B4 (de) Integrierte Hardware- und Softwarebenutzerschnittstelle
DE602004011676T2 (de) Tragbares Kommunikationsgerät mit einer dreidimensionalen Anzeigevorrichtung
DE4344050C1 (de) Eingabeverfahren und Eingabevorrichtung für Computerterminals zum Schutz vor Ausspähung und zur Gewährleistung der Intimität eines Nutzers
EP2060118B1 (fr) Procédé de génération d'une vue d'ensemble de l'environnement d'un véhicule à moteur
DE10007891A1 (de) Verfahren und Anordnung zur Interaktion mit einer in einem Schaufenster sichtbaren Darstellung
EP1184804A2 (fr) Système pour la reproduction d'images
EP1012698B1 (fr) Dispositif pour la representation et l'entree virtuelle de donnees
DE102010013843A1 (de) Bedienvorrichtung
DE19744941C2 (de) Verfahren zur Fernbedienung einer Präsentationseinrichtung
WO2001008409A1 (fr) Visiophone mobile
DE102012008986B4 (de) Kamerasystem mit angepasster ROI, Kraftfahrzeug und entsprechendes Verfahren
DE102008037060A1 (de) Anzeigesystem
DE10119648B4 (de) Anordnung zur Bedienung von fernsehtechnischen Geräten
WO2001084482A2 (fr) Dispositif pour entrer des coordonnees relatives
DE10054242A1 (de) Verfahren zum Eingeben von Daten in ein System und Eingabeeinrichtung
DE102019201766A1 (de) Projektionsvorrichtung für eine Datenbrille, Verfahren zum Darstellen von Bildinformationen mittels einer Projektionsvorrichtung und Steuergerät
EP1161720B1 (fr) Procede et dispositif pour creer une interaction avec un utilisateur
DE102004027289B4 (de) Verfahren und Anordnung zur berührungslosen Navigation in einem Dokument
WO2000054134A1 (fr) Dispositif d'entree/sortie pour terminal utilisateur
DE102019002884A1 (de) Bedienvorrichtung
WO2003056504A1 (fr) Systeme de projection et de saisie interactif
WO2003046821A1 (fr) Saisie au stylet virtuelle

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20010712

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AT BE CH CY DE DK ES FI FR GB GR IE IT LI LU MC NL PT SE

GRAG Despatch of communication of intention to grant

Free format text: ORIGINAL CODE: EPIDOS AGRA

17Q First examination report despatched

Effective date: 20020131

GRAH Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOS IGRA

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20020716