EP1419483A2 - Presentation of user information - Google Patents

Presentation of user information

Info

Publication number
EP1419483A2
EP1419483A2
Authority
EP
European Patent Office
Prior art keywords
camera
user information
information
image
image information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
EP02754527A
Other languages
German (de)
English (en)
Inventor
Peter Wiedenberg
Soeren Moritz
Thomas Jodoin
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Siemens AG
Original Assignee
Siemens AG
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Siemens AG
Publication of EP1419483A2 (fr)
Legal status: Ceased

Classifications

    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 - Television systems
    • H04N7/18 - Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N7/183 - Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast, for receiving images from a single remote source

Definitions

  • The invention relates to a system and a method for displaying user information.
  • The object of the invention is to improve the simultaneous display of user information and image information of an environment.
  • This object is achieved by a system for displaying user information comprising: a camera for recording image information of a section of an environment; a zoom device for changing the size of the recorded section according to a zoom factor and/or a device for three-dimensional alignment of the camera according to a space vector; a computer unit for calculating location coordinates of the image information on the basis of spatial coordinates of the camera and/or the control variables zoom factor and space vector, for assigning user information to the location coordinates, and for calculating positions of images of the image information on a display surface of a visualization device; and an image processing unit.
  • This object is also achieved by a method for displaying user information in which image information of a section of an environment is recorded with a camera, the size of the recorded section being changeable with a zoom device in accordance with a zoom factor and/or the camera being aligned three-dimensionally with a device in accordance with a space vector, and in which a computer unit calculates location coordinates of the image information on the basis of spatial coordinates of the camera and/or the control variables zoom factor and space vector, assigns user information to the location coordinates, and calculates positions of images of the image information on a display surface of a visualization device.
  • An image processing unit prepares the image information and the user information for reproduction with the visualization device and for the positionally correct display of the user information on the display surface at the positions of those images of the image information whose location coordinates have the respective user information assigned to them.
  • The system and method according to the invention enable the dynamic insertion of user information, e.g. process values or status information of a control program, into the displayed image of a section of an environment.
  • This image is recorded by a camera which is movable and/or which, by means of a zoom device, offers the possibility of changing the size of the image section.
  • The camera therefore does not have to have a fixed image section; rather, the image section (orientation and/or zoom factor) can be defined freely.
  • The user information to be faded in does not have to relate to a static image in the sense of a fixed camera orientation and zoom factor; instead, it is referenced to the real spatial coordinates of the image information in the area currently captured by the camera.
  • The user information for the currently visible section is automatically displayed at the correct position.
  • The dynamic overlays do not change their position relative to the images of the image information (for example, objects) visible on the display surface of the visualization device even at a changed viewing angle, i.e. when the camera moves (rotations or inclinations, zoom factor).
  • In one embodiment, the computer unit contains a control unit for controlling the camera, the zoom device and/or the device for three-dimensional alignment of the camera in accordance with the control variables zoom factor and space vector.
  • The control variables are then already known to the computer unit and can be used by it directly to calculate the location coordinates of the image information of the section of the environment.
  • Particular user friendliness can be achieved by having the image processing unit select and display the user information depending on the zoom factor. It is conceivable, for example, that in a wide-angle shot user information is shown only for individual objects on the display surface, e.g. object names. If the camera zooms in on these objects, detailed information could be displayed, e.g. fill level, temperature or the like. The detailed information would be read from an operating and monitoring system at runtime.
  • The user information in this embodiment is thus designed as a combination of static and dynamic information.
  • Any other data sources can also be connected, for example databases with static information.
  • The control unit for controlling the camera, the zoom device and the device for three-dimensional alignment of the camera can have means for operation by a user, so that the camera can also be moved independently of the computer unit, e.g. with a remote control.
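The calculation described above, from the camera's spatial coordinates and the control variables space vector and zoom factor to a position on the display surface, can be sketched as a standard pinhole projection. The function names, angle conventions and pixel values below are illustrative assumptions, not part of the patent:

```python
import math

def rotation(pan, tilt):
    """Rotation matrix for a camera panned (yaw) and tilted (pitch); angles in radians."""
    cp, sp = math.cos(pan), math.sin(pan)
    ct, st = math.cos(tilt), math.sin(tilt)
    # yaw about the vertical axis, then pitch about the horizontal axis
    yaw = [[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]]
    pitch = [[1, 0, 0], [0, ct, -st], [0, st, ct]]
    return [[sum(pitch[i][k] * yaw[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def project(point, cam_pos, pan, tilt, focal_px, center):
    """Map a world point to pixel coordinates on the display surface.

    focal_px plays the role of the zoom factor (a longer focal length means a
    narrower opening angle). Returns None if the point lies behind the camera,
    i.e. outside any possible viewing angle.
    """
    d = [point[i] - cam_pos[i] for i in range(3)]   # vector camera -> point
    R = rotation(pan, tilt)
    x, y, z = (sum(R[i][j] * d[j] for j in range(3)) for i in range(3))
    if z <= 0:
        return None
    # perspective division: farther points move toward the image centre
    return (center[0] + focal_px * x / z, center[1] - focal_px * y / z)
```

Doubling `focal_px` (zooming in) doubles a point's offset from the image centre, which is why the overlay positions must be recomputed whenever the zoom factor changes.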
  • FIG. 1 shows a schematic overview of a system for displaying user information
  • FIG 3 shows views of a display surface of a visualization device with different control variables, space vector and zoom factor.
  • FIG 1 shows an embodiment of a system for displaying user information.
  • A camera 1 captures image information 2 of a section of the surroundings of the camera 1.
  • The image information 2 is the view of a tank 21 with a valve 22.
  • The viewing angle 23 of the camera 1, which captures the section of the surroundings, is shown in stylized form.
  • The camera 1 is mounted on a device 4 for three-dimensional alignment of the camera and has a zoom device 3.
  • The camera 1 and the device 4 are connected to a computer unit 5.
  • The computer unit 5 contains a control unit 10 and is connected to a display surface 7.
  • The computer unit 5 receives user information 6, which in the example is provided via a process connection 20 from measuring points 17, 18.
  • The user information 6 is linked in an image processing unit 9 with location coordinates 12 and is shown as a display 16 on the display surface 7 together with an image 13 of the image information 2.
  • The computer unit also has various input means for a user: a computer mouse 14, a keyboard 15 and further means 11 for operation.
  • The camera 1 captures the objects 21, 22 lying within its viewing angle 23 as image information 2.
  • The viewing angle 23 is adjustable in its opening angle with a zoom device 3, e.g. a focal length adjustment, and in its orientation by rotating or tilting the camera 1.
  • The size of the opening angle of the camera 1 is described by the zoom factor, an important control variable of the system.
  • Depending on it, the camera 1 captures a larger or smaller section of its surroundings.
  • The camera 1 is fastened on a device 4 for three-dimensional alignment and is therefore rotatable about two of its axes of movement.
  • The device 4 for three-dimensional alignment is driven, e.g. by a motor.
  • The movement of the device 4, the adjustment of the zoom device 3 and the functions of the camera 1 are controlled by a control unit 10 of the computer unit 5.
  • The orientation of the camera 1 in space is described by the control variable space vector.
  • The camera 1 and the device 4 for three-dimensional alignment return actual values of the space vector and zoom factor to the computer unit.
  • The placement of the camera 1 in space is defined in the form of spatial coordinates of the camera 1.
  • The computer unit also has further information about the surroundings of the camera 1 available, e.g. in the form of a model which describes the essential points of the objects 21, 22 of the environment as spatial coordinates or vectors.
  • The computer unit 5 thus has enough information available to determine the location coordinates 12 of the image information 2 captured by the camera 1.
  • The location coordinates 12 are calculated from the control variables zoom factor and space vector and, in the case of linear movements, from the spatial coordinates of the camera 1.
  • The result of this calculation determines the viewing angle 23 of the camera 1 in terms of its size and its position in space.
  • The image information 2 is processed by an image processing unit 9 of the computer unit 5 such that it can be represented as a two-dimensional image 13 of the objects 21, 22 on the display surface 7 of the visualization device. Based on the calculation of the location coordinates 12, it is also known at which position on the display surface 7 the captured image information appears.
  • The actual values of the control variables space vector and zoom factor change continuously, and the captured section of the surroundings changes accordingly.
  • The resulting change in the position of the image 13 on the display surface 7 can likewise be calculated, and the user information 6 can continue to be displayed in the same position relative to the image 13, even if its position on the display surface shifts as a result.
  • The location coordinates 12 are thus assigned to the user information 6, and the current orientation (space vector) of the camera 1, the current zoom factor and, in the case of linear movement of the camera 1 in space, the spatial coordinates of the camera 1 (i.e. its placement in space) are known; the display and the placement of the user information 6 for the overlay can therefore be calculated at any time, so that the user information 6 for the currently visible area is shown in the correct place.
  • A temperature sensor 17 is attached to the tank 21, and a sensor 18 for the valve opening to the valve 22.
  • The recorded process values of temperature and valve opening are transmitted via the process connection 20 to the computer unit 5, where they are available as user information 6 and are displayed in the correct position in the image of the objects 21, 22.
  • The image of the objects 21, 22 displayed to the user is thus enriched with the additionally displayed process variables as user information 6.
  • The user can operate the computer unit 5 with the input means 14, 15 and also has the option of directly specifying the orientation and the zoom factor of the camera 1 via the means 11 for operation.
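The per-frame recalculation described above, taking the current space vector and zoom factor and repositioning each piece of user information over its object, can be sketched as follows. The measuring-point table, the angular coordinates and the sample process values are hypothetical illustrations; the patent does not specify this data layout:

```python
import math

# Hypothetical measuring points: azimuth/elevation (radians) of each object
# as seen from the fixed camera position, plus a process-value reader that
# stands in for the process connection to the measuring points.
MEASURING_POINTS = {
    "tank_21":  {"azimuth": 0.10, "elevation": -0.05, "read": lambda: "temperature 42 C"},
    "valve_22": {"azimuth": 0.30, "elevation": -0.20, "read": lambda: "opening 75 %"},
}

def overlay_positions(pan, tilt, zoom_focal_px, width=640, height=480):
    """For the current space vector (pan, tilt) and zoom factor, return the
    screen position and text of every annotation inside the viewing angle.

    A simplified angular projection; wrap-around beyond +/-90 degrees is
    ignored in this sketch.
    """
    labels = []
    for name, mp in MEASURING_POINTS.items():
        # angular offset of the object from the current optical axis
        dx = mp["azimuth"] - pan
        dy = mp["elevation"] - tilt
        x = width / 2 + zoom_focal_px * math.tan(dx)
        y = height / 2 - zoom_focal_px * math.tan(dy)
        if 0 <= x < width and 0 <= y < height:
            labels.append((name, (round(x), round(y)), mp["read"]()))
    return labels
```

Calling this once per video frame with the actual values reported back by the camera keeps each label locked to its object while the camera pans, tilts or zooms; objects outside the current viewing angle simply drop out of the result.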
  • In the embodiment of FIG 2, the camera 1 is designed as a video camera 27, the computer unit 5 as a personal computer 28 and the visualization device as a screen 29.
  • The device 4 for three-dimensional alignment, on which the video camera 27 is mounted, is in this embodiment a turning and tilting device 30 (pan, tilt).
  • The turning and tilting device 30 and the zoom device 3 of the video camera 27 are connected via an RS232 connection 24 to a serial interface 25 of the personal computer 28.
  • Via a corresponding protocol (VISCA), the video camera 27 can be moved by software and the resulting viewing angles can be read out.
  • The video camera 27 can also be moved independently of the personal computer using a remote control, which is not shown in FIG 2. Since the associated data for rotation, tilt and zoom factor are read out from the video camera 27 for each video frame to be displayed on the screen 29, user information can be faded in dynamically in the correct position, regardless of whether the video camera 27 was moved by software or by remote control.
  • Supporting texts, for example, can also be shown.
  • The particular advantage of the proposed system and method thus lies in the dynamic insertion of information into the video image, taking into account the area currently captured by the video camera 27.
  • The dynamic overlays do not change their position relative to the objects visible in the video image during movements (rotations or inclinations, zoom factor) of the video camera 27. Only within the limits of the lens distortion of the video camera 27 and of perspective distortion do the dynamic insertions shift slightly relative to the visible objects.
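The residual offset from lens distortion mentioned above is what camera calibration addresses. One common approach, sketched here under the assumption of a first-order radial (Brown) model with an illustrative coefficient `k1` that a real system would calibrate, is to apply the same distortion to the ideally projected overlay position so that it lines up with the distorted video image:

```python
def apply_radial_distortion(x, y, k1, center=(320.0, 240.0)):
    """Shift an ideally projected overlay position by a first-order radial
    lens distortion model so it tracks the (distorted) video image.

    k1 > 0 models barrel-like outward displacement growing with the squared
    distance from the image centre; k1 = 0 leaves the position unchanged.
    """
    cx, cy = center
    dx, dy = x - cx, y - cy
    r2 = dx * dx + dy * dy          # squared radius from the image centre
    factor = 1.0 + k1 * r2
    return (cx + dx * factor, cy + dy * factor)
```

Points at the image centre are unaffected, so the correction only matters for overlays near the edges of the display surface, which is exactly where the slight shift described above is visible.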
  • FIG 3 shows views of the display surface 7 of a visualization device 8 at different viewing angles of a camera 1 according to FIG 1 in a system according to the invention. The user information shown on the display surface 7 is united with the video image. This is possible because the current position and zoom settings of the camera 1 are read out for each video frame. In addition, depending on the zoom factor, more or less data can be shown in the image.
  • It is conceivable, for example, that in a wide-angle shot only individual objects are named (e.g. Tank1, Control cabinet 2). If the camera zooms in on these elements, detailed information could be displayed (e.g. Tank1: fill level 3 m). These data would be read out from an operating and monitoring system at runtime.
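The zoom-dependent selection described for FIG 3, names only in a wide-angle shot and live process values once zoomed in, amounts to a level-of-detail lookup keyed on the zoom factor. The thresholds, field names and the sample object below are illustrative assumptions, not values from the patent:

```python
# Hypothetical detail levels: which annotation fields to fade in at which zoom.
DETAIL_LEVELS = [
    (1.0, "name"),       # wide-angle shot: object names only
    (4.0, "process"),    # zoomed in: live process values as well
]

def visible_fields(zoom_factor):
    """Return the annotation fields to display at the given zoom factor."""
    return [field for threshold, field in DETAIL_LEVELS if zoom_factor >= threshold]

def render_label(obj, zoom_factor):
    """Compose the overlay text for one object at the current zoom factor."""
    parts = []
    if "name" in visible_fields(zoom_factor):
        parts.append(obj["name"])
    if "process" in visible_fields(zoom_factor):
        # in a real system this would be read from the operating and
        # monitoring system at runtime
        parts.append(obj["process"]())
    return ": ".join(parts)

tank = {"name": "Tank1", "process": lambda: "fill level 3 m"}
```

At a zoom factor of 1.0 only "Tank1" would be faded in; zooming past the second threshold adds the detailed process value, matching the behaviour described for FIG 3.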
  • The invention thus relates to a system and a method for displaying user information, in which the simultaneous display of user information and image information of an environment is improved.
  • The system contains: a camera 1 for capturing image information 2 of a section of an environment; a zoom device 3 for changing the size of the captured section in accordance with a zoom factor and/or a device 4 for three-dimensionally aligning the camera 1 in accordance with a space vector; a computer unit 5 for calculating location coordinates 12 of the image information 2 on the basis of spatial coordinates of the camera 1 and/or the control variables zoom factor and space vector, for assigning user information 6 to the location coordinates 12, and for calculating positions of images 13 of the image information 2 on a display surface 7 of a visualization device 8; and an image processing unit 9 for preparing the image information 2 and the user information 6 for reproduction with the visualization device 8 and for the positionally correct display of the user information 6 on the display surface 7 at the positions of those images 13 of the image information 2 whose location coordinates 12 have the respective user information 6 assigned to them.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Closed-Circuit Television Systems (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention relates to a system and a method for presenting user information, in which the simultaneous display of user information and image data of an environment is improved. The system comprises the following elements: a camera (1) for capturing image data (2) of a section of an environment, the camera having a zoom device (3) for changing the size of the captured section according to a zoom factor, and/or a device (4) for the three-dimensional alignment of the camera (1) according to a space vector; a computer unit (5) for calculating the location coordinates (12) of the image information (2) on the basis of spatial coordinates of the camera (1) and/or the control variables zoom factor and space vector, for assigning user information (6) to the location coordinates (12), and for calculating positions of images (13) of the image data (2) on the display surface (7) of a visualization device (8); and an image processing unit (9) for preparing the image data (2) and the user information (6) for reproduction by means of the visualization device (8) and for the positionally correct display of the user information (6) on the display surface (7), namely at the positions of the images (13) of the image data (2) whose location coordinates (12) correspond to the user information (6).
EP02754527A 2001-08-24 2002-08-12 Presentation of user information Ceased EP1419483A2 (fr)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
DE10141521A DE10141521C1 (de) 2001-08-24 2001-08-24 Presentation of user information
DE10141521 2001-08-24
PCT/DE2002/002956 WO2003019474A2 (fr) 2001-08-24 2002-08-12 Presentation of user information

Publications (1)

Publication Number Publication Date
EP1419483A2 (fr) 2004-05-19

Family

ID=7696487

Family Applications (1)

Application Number Title Priority Date Filing Date
EP02754527A Ceased EP1419483A2 (fr) Presentation of user information

Country Status (4)

Country Link
US (1) US20040227818A1 (fr)
EP (1) EP1419483A2 (fr)
DE (1) DE10141521C1 (fr)
WO (1) WO2003019474A2 (fr)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10089534B2 (en) * 2016-12-16 2018-10-02 Adobe Systems Incorporated Extracting high quality images from a video

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE19900884A1 (de) * 1999-01-12 2000-07-20 Siemens Ag System und Verfahren zum Bedienen und Beobachten eines Automatisierungssystems mit Prozeßvisualisierung und Prozeßsteuerung durch virtuelle Anlagenmodelle als Abbild einer realen Anlage

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB9119964D0 (en) * 1991-09-18 1991-10-30 Sarnoff David Res Center Pattern-key video insertion
JP3179623B2 (ja) * 1992-04-29 2001-06-25 Canon Information Systems Research Australia Pty Ltd Video movie
US5488675A (en) * 1994-03-31 1996-01-30 David Sarnoff Research Center, Inc. Stabilizing estimate of location of target region inferred from tracked multiple landmark regions of a video image
DE19710727A1 (de) * 1997-03-14 1998-09-17 Sick Ag Monitoring device
DE10005213A1 (de) * 2000-02-05 2001-08-16 Messer Griesheim Gmbh Monitoring system and method for the remote monitoring of measured variables


Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
BEHRINGER R ET AL: "A novel interface for device diagnostics using speech recognition, augmented reality visualization, and 3D audio auralization", MULTIMEDIA COMPUTING AND SYSTEMS, 1999. IEEE INTERNATIONAL CONFERENCE ON FLORENCE, ITALY 7-11 JUNE 1999, LOS ALAMITOS, CA, USA,IEEE COMPUT. SOC, US, vol. 1, 7 June 1999 (1999-06-07), pages 427 - 432, XP010342778, ISBN: 978-0-7695-0253-3, DOI: 10.1109/MMCS.1999.779240 *
KOSAKA A ET AL: "AUGMENTED REALITY SYSTEM FOR SURGICAL NAVIGATION USING ROBUST TARGET VISION", PROCEEDINGS 2000 IEEE CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION. CVPR 2000. HILTON HEAD ISLAND, SC, JUNE 13-15, 2000; [PROCEEDINGS OF THE IEEE COMPUTER CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION], LOS ALAMITOS, CA : IEEE COMP., 13 June 2000 (2000-06-13), pages 187 - 194, XP001035639, ISBN: 978-0-7803-6527-8 *
See also references of WO03019474A3 *
ZINSER K ED - INSTITUTE OF ELECTRICAL AND ELECTRONICS ENGINEERS: "Integrated multi media and visualisation techniques for process S&C", PROCEEDINGS OF THE INTERNATIONAL CONFERENCE ON SYSTEMS, MAN AND CYBERNETICS. LE TOUQUET, OCT. 17 - 20, 1993; [PROCEEDINGS OF THE INTERNATIONAL CONFERENCE ON SYSTEMS, MAN AND CYBERNETICS], NEW YORK, IEEE, US, vol. -, 17 October 1993 (1993-10-17), pages 367 - 372, XP010132025, ISBN: 978-0-7803-0911-1, DOI: 10.1109/ICSMC.1993.384772 *

Also Published As

Publication number Publication date
US20040227818A1 (en) 2004-11-18
WO2003019474A3 (fr) 2003-08-28
DE10141521C1 (de) 2003-01-09
WO2003019474A2 (fr) 2003-03-06

Similar Documents

Publication Publication Date Title
DE69835185T2 (de) Camera control system
EP1141799B1 (fr) System and method for operating and monitoring an automation system
DE69233439T2 (de) Monitoring device with control of camera and lens mounting
EP2196892B1 (de) Method and device for displaying information
DE19836681A1 (de) Stereoscopic recording and playback system
DE102012004793B4 (de) Motor vehicle with an electronic rear-view mirror
DE102005035776A1 (de) Device for the visual monitoring of a vehicle's surroundings
DE102012203491B4 (de) Electronic rear-view mirror system
WO2005045729A1 (fr) System and method for carrying out and visualizing simulations in an augmented reality
DE19932217A1 (de) Control device
EP1638064A2 (fr) Simulator for training in minimally invasive surgical techniques
DE102006049981A1 (de) Operating aid for a device for treating containers
EP2822814A1 (fr) Motor vehicle with an electronic rear-view mirror
DE102006051533A1 (de) Operating aid for a device for treating containers (II)
DE10246652B4 (de) Method for operating a display system in a vehicle
DE10141521C1 (de) Presentation of user information
DE102006060904B4 (de) Airport traffic information display system
EP2898666B1 (fr) Client device for displaying images of a controllable camera, method, computer program and monitoring system comprising the client device
DE10117030C2 (de) Method for displaying pictorially structured information on a screen
DE102005050350A1 (de) System and method for monitoring a technical installation, and data goggles based on such a system
DE19811286C2 (de) Camera movement control
EP1912431A2 (fr) Method and device for controlling a pivotable camera
DE60103454T2 (de) Process control system and method
DE4138453A1 (de) Process monitoring system and window display method therefor
DE112004002016T5 (de) Hand-held device for navigating through data and for data display

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20031212

AK Designated contracting states

Kind code of ref document: A2

Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR IE IT LI LU MC NL PT SE SK TR

AX Request for extension of the european patent

Extension state: AL LT LV MK RO SI

RAP1 Party data changed (applicant data changed or rights of an application transferred)

Owner name: SIEMENS AKTIENGESELLSCHAFT

RAP1 Party data changed (applicant data changed or rights of an application transferred)

Owner name: SIEMENS AKTIENGESELLSCHAFT

REG Reference to a national code

Ref country code: DE

Ref legal event code: R003

RAP1 Party data changed (applicant data changed or rights of an application transferred)

Owner name: SIEMENS AKTIENGESELLSCHAFT

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION HAS BEEN REFUSED

18R Application refused

Effective date: 20170606