WO2020126240A1 - Method for operating a field device of automation technology in an augmented reality/mixed reality environment - Google Patents

Method for operating a field device of automation technology in an augmented reality/mixed reality environment

Info

Publication number
WO2020126240A1
WO2020126240A1 (PCT/EP2019/081214)
Authority
WO
WIPO (PCT)
Prior art keywords
field device
marker
control unit
field
metadata
Prior art date
Application number
PCT/EP2019/081214
Other languages
German (de)
English (en)
Inventor
Tanja Haag
Martin Kropf
Eric Birgel
Original Assignee
Endress+Hauser SE+Co. KG
Priority date
Filing date
Publication date
Application filed by Endress+Hauser SE+Co. KG filed Critical Endress+Hauser SE+Co. KG
Publication of WO2020126240A1

Classifications

    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05B: CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00: Programme-control systems
    • G05B19/02: Programme-control systems electric
    • G05B19/04: Programme control other than numerical control, i.e. in sequence controllers or logic controllers
    • G05B19/042: Programme control other than numerical control, i.e. in sequence controllers or logic controllers using digital processors
    • G05B2219/00: Program-control systems
    • G05B2219/20: Pc systems
    • G05B2219/23: Pc programming
    • G05B2219/23363: Barcode
    • G05B2219/30: Nc systems
    • G05B2219/36: Nc in input of data, input key till input tape
    • G05B2219/36371: Barcode reader

Definitions

  • The invention relates to a method for operating a field device of automation technology in an augmented reality or mixed reality environment with an operating unit, the operating unit having an image capture unit and the field device transmitting identification data at regular and/or fixed time intervals via a wireless communication interface.
  • Field devices used in industrial plants are already known from the prior art. Field devices are widely used in process automation as well as in manufacturing automation. In principle, field devices are all devices that are used close to the process and that supply or process process-relevant information. Field devices are used to record and/or influence process variables. Measuring devices or sensors are used to record process variables; they are used, for example, for pressure and temperature measurement, conductivity measurement, flow measurement, pH measurement and level measurement, and record the corresponding process variables pressure, temperature, conductivity, pH value, level and flow. Actuators are used to influence process variables; these are, for example, pumps or valves that can influence the flow of a liquid in a pipe or the level in a container. In addition to the aforementioned measuring devices and actuators, field devices are also understood to mean remote I/Os, radio adapters and, generally, devices which are arranged at the field level.
  • The Endress+Hauser Group produces and sells a large number of such field devices.
  • Replacement field devices that replace an outdated or defective field device of an application must be specifically adapted to the respective application of the field device in the measuring point during commissioning.
  • These field devices are configured and parameterized during or after production.
  • The configuration describes, on the one hand, the hardware-side setup, which, for example, comprises the flange material of a flow measuring device.
  • Parameterization is the definition of parameters with whose help the operation of the field device is adapted to the respective characteristics of the application.
  • A field device can have several hundred different parameters, to which parameter values are assigned during commissioning.
  • The parameterization of a field device is carried out using software tools.
  • The input of parameter values is purely text-based and requires technical understanding on the part of the operator.
  • In an augmented reality or mixed reality approach, the display unit of the operating unit is overlaid with virtual components which make it easier for the user to enter parameters, or which make parameter suggestions to the user on the basis of detected/recognized geometries of the measuring point in which the respective field device is installed.
  • Suitable operating units are, for example, data glasses such as the Microsoft HoloLens, or mobile devices such as tablets or smartphones.
  • "Augmented reality" refers to the computer-assisted extension of the perception of reality.
  • "Mixed reality" encompasses the entire "reality-virtuality continuum" with the exception of only reality and only virtuality. Between the two extremes "only reality" and "only virtuality" there are infinitely many intermediate stages that mix the two. In particular, "augmented reality" and "augmented virtuality" are special expressions of the mixed reality principle.
  • A prerequisite for such augmented reality or mixed reality applications is, on the one hand, that the type of object involved is known; on the other hand, the position of the object in three-dimensional space must be known in order to overlay the detected object correctly with the virtual components.
  • Augmented reality or mixed reality devices usually work with so-called 3D depth sensors in order to be able to capture their surroundings in three dimensions.
  • A so-called 3D mesh (model), which maps the surfaces of the real environment, is generated from the depth information at runtime.
  • Applications can then overlay the real world with virtual content.
  • Optical markers, such as a QR code, are used to enable spatial allocation of objects between the real and the virtual world.
  • Virtual content, for example a three-dimensional model or metadata, can then be displayed in their vicinity.
  • Such a marker is also called a "fiducial marker".
  • Markers are well suited for optically determining the position of an object. However, they are only conditionally suitable for identifying the object, since the marker must contain a data field which necessarily includes identification data, so that the marker can be assigned to a specific object. If a marker is accidentally attached to the wrong object, it becomes linked to incorrect data (virtual models, metadata, etc.) within an augmented reality or mixed reality application. This is a problem in particular with markers that can be temporarily attached to the object.
  • The object of the invention is to present a method which enables reliable identification of a field device for augmented reality or mixed reality applications.
  • The object is achieved by a method for operating a field device of automation technology in an augmented reality or mixed reality environment with an operating unit, the operating unit having an image acquisition unit and the field device transmitting identification data at regular and/or defined time intervals via a wireless communication interface.
  • The method uses a mobile marker which itself carries no identification data of the field device and serves to determine the position of the field device.
  • The marker is designed such that it can be temporarily mechanically attached to the field device, for example by means of a clip mechanism.
  • The identification data originate from the field device itself and are transmitted repeatedly by means of a burst or broadcast command via radio, in particular via Bluetooth (LE), WiFi, ZigBee or the like.
  • The identification data are, for example, a TAG of the field device and/or a serial number of the field device.
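As a sketch of how an operating unit might decode such a broadcast, the following assumes a hypothetical payload layout (a length-prefixed TAG string followed by a 4-byte serial number); the actual framing used by a field device is not specified in the document:

```python
import struct

def parse_identification(payload: bytes) -> dict:
    """Decode a hypothetical advertisement payload carrying a field
    device's identification data: a 2-byte length-prefixed TAG string
    followed by a 4-byte serial number (big-endian)."""
    tag_len = struct.unpack_from(">H", payload, 0)[0]
    tag = payload[2:2 + tag_len].decode("ascii")
    serial = struct.unpack_from(">I", payload, 2 + tag_len)[0]
    return {"tag": tag, "serial": serial}

# Build an example payload as a field device might broadcast it
tag = b"FT-101"
payload = struct.pack(">H", len(tag)) + tag + struct.pack(">I", 12345678)
print(parse_identification(payload))  # {'tag': 'FT-101', 'serial': 12345678}
```

The TAG name "FT-101" and the field layout are illustrative assumptions, not part of the patent.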
  • The type of the marker defines where it is attached by default on a particular field device type.
  • The position of the field device in three-dimensional space can therefore be calculated from the position of the marker in three-dimensional space.
  • For the identified field device, a three-dimensional model can be loaded and used in an augmented reality or mixed reality application. Additionally or alternatively, metadata of the field device can be loaded, for example status data.
  • This metadata, or the three-dimensional model, can be displayed appropriately in relation to the position of the field device in the augmented reality or mixed reality application.
  • The mobile marker is not exposed to aging effects, so that the method according to the invention can also be used after a longer operating phase of a field device.
  • The method according to the invention can also be applied to older field devices for which attaching a marker was not intended during production.
  • The image acquisition unit is, for example, an optical image acquisition unit, such as a camera, a lidar system ("light detection and ranging") or the like.
  • In one variant, the operating unit has a display unit which represents the operator's field of vision, a constantly updated image captured by the image capture unit being visualized for the operator and overlaid with the virtual model or the metadata of the field device.
  • The display unit of the operating unit shows the live image recorded by the camera.
  • The operator should point the operating unit towards the field device in such a way that the component of the measuring point is captured by the camera.
  • The virtual model of the field device, or the metadata, is placed over the current live image of the camera. This variant is suitable for operating units that do not have a transparent pane but a conventional display as display unit.
  • In another variant, the operator looks through an essentially transparent pane of the operating unit, and the virtual model, or the metadata, of the field device is projected onto the transparent pane.
  • The pane is, for example, a transparent glass.
  • The operating unit then also has a projector. The operator looks through the glass; the environment viewed through the glass is called the field of view.
  • The projector is designed to throw a projection onto the glass which the operator perceives.
  • The metadata and/or the virtual model of the field device are thus placed over the operator's current field of vision. If the field of view changes with respect to the component of the measuring point, for example by turning the head, the visualization changes accordingly.
  • The virtual model, or the metadata, remains in the assigned position of the field device and correspondingly "moves", or rotates, with the field device.
  • The operating unit uses the type of the marker to determine a reference structure of the marker.
  • The reference structure has defined dimensions and represents the top view of the structure of the marker.
  • An advantageous embodiment of the method according to the invention provides that a perspective distortion of the detected structure of the marker is determined in comparison to the reference structure, the distortion being determined for both dimensions of the structure.
  • A relative position, in particular a relative angle of inclination, of the marker to the operating unit is determined on the basis of the determined perspective distortion.
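A minimal sketch of the inclination estimate from the perspective distortion, under the simplifying assumption that a tilt about one axis foreshortens the corresponding side length by the cosine of the tilt angle; a full implementation would decompose a homography between detected and reference structure instead:

```python
import math

def inclination_from_foreshortening(detected: float, reference: float) -> float:
    """Estimate the tilt angle (radians) of the marker about one axis,
    assuming pure foreshortening: detected = reference * cos(theta).
    Applied once per dimension of the structure."""
    ratio = max(-1.0, min(1.0, detected / reference))
    return math.acos(ratio)

# A marker side appears 40 mm wide although the reference structure is 80 mm:
theta = inclination_from_foreshortening(40.0, 80.0)
print(round(math.degrees(theta), 1))  # 60.0
```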
  • The position and orientation of the marker relative to the operating unit are expressed in location coordinates of a relative coordinate system of the operating unit.
  • The operating unit uses position sensors to determine a scaling of the three-dimensional space.
  • The operating unit determines its own absolute position in three-dimensional space, for example by means of position sensors, one or more gyroscopes, a lidar system or a camera system, for example a ToF ("time of flight") camera.
  • The operating unit determines its respective location, for example using GPS sensors.
  • By determining its absolute position in three-dimensional space, the operating unit defines an absolute coordinate system.
  • The type of the marker defines the location at which the marker can be attached to at least one type of field device, and the type of the marker defines the position of the field device in relation to the position of the marker after the marker has been correctly attached to the field device.
  • The database is, for example, a local database which is located on the operating unit.
  • Alternatively, the database is a global database which is located, for example, in a cloud computing environment and which can be contacted by the operating unit via the Internet.
  • A distance of the respective field device from the operating unit is determined by means of the signal strength, an identification being made only of the field device which is geographically closest to the operating unit or which is within a predetermined radius of the operating unit.
  • If two or more field devices are located within the predetermined radius of the operating unit, the operator is required to select the field device with which to interact.
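The distance estimate from signal strength and the nearest-device selection can be sketched with the common log-distance path-loss model; the transmit power at 1 m and the path-loss exponent are deployment-specific assumptions, not values from the document:

```python
def rssi_to_distance(rssi_dbm: float, tx_power_dbm: float = -59.0,
                     path_loss_exp: float = 2.0) -> float:
    """Log-distance path-loss model: d = 10 ** ((txPower - RSSI) / (10 * n)).
    tx_power_dbm is the assumed RSSI at 1 m; both parameters are
    environment-specific and would need calibration."""
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10 * path_loss_exp))

def select_field_device(beacons: dict, radius_m: float):
    """Return the single device inside the radius, or the sorted list of
    candidates if the operator must choose."""
    distances = {dev: rssi_to_distance(rssi) for dev, rssi in beacons.items()}
    in_radius = sorted((d, dev) for dev, d in distances.items() if d <= radius_m)
    if len(in_radius) == 1:
        return in_radius[0][1]
    return [dev for _, dev in in_radius]  # operator selection required

beacons = {"FG1": -45.0, "FG2": -75.0}
print(select_field_device(beacons, radius_m=3.0))  # FG1
```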
  • Data glasses, for example, are used as the operating unit.
  • Some data glasses have a screen which displays the image captured by the camera. To the operator, it appears as if the image captured by the camera were captured by their own eyes.
  • Other data glasses have a projector which projects an image onto a pane of the glasses. The operator sees the surroundings through the pane of the glasses.
  • A camera of the glasses captures the surroundings to the same extent as the operator's eyes capture the surroundings.
  • The virtual model of the field device, or the metadata, is projected onto the pane.
  • An example of data glasses of the latter type which are suitable for the method according to the invention is the "HoloLens" from Microsoft.
  • Such data glasses have a translucent display through which the operator looks. Similar to data glasses with a projector, the operator sees the surroundings through this display. At the same time, a camera of the glasses captures the surroundings to the same extent as the operator's eyes. The display then visualizes the virtual model of the field device or the metadata.
  • Alternatively, a mobile terminal is used as the operating unit.
  • Fig. 1 shows an embodiment of the method according to the invention.
  • A field device FG is shown in Fig. 1.
  • The field device FG is a level measuring device which is used to record the level of a process medium in a tank.
  • The field device FG must be parameterized for commissioning.
  • The operator wants to carry out the parameterization of the field device FG using a mixed reality or augmented reality application. For example, the application should independently identify relevant tank geometries and propose specific parameter values on this basis. Alternatively, information and assistance for parameterization should be displayed.
  • The operator would like to run the application on his operating unit BE in the form of data glasses.
  • In order to be able to suggest specific parameter values, the operating unit BE must be informed which specific field device FG, or which type of field device FG, is involved.
  • Special virtual objects, for example visualization models or the like, may be used in mixed reality or augmented reality applications; these are displayed on a display unit of the operating unit BE and are specific to the respective field device.
  • For this, the mixed reality or augmented reality application must know the positioning and the position of the field device FG relative to the operating unit BE in three-dimensional space.
  • The operator moves close to the field device FG.
  • The field device FG sends out identification information ID at regular and/or fixed, recurring time intervals.
  • For this purpose, the field device FG has a wireless communication interface KS by means of which the identification information ID is transmitted by radio.
  • In this example, Bluetooth LE is used as the radio protocol; however, any other radio protocol can also be used.
  • The operating unit BE receives this identification information. If the operating unit BE receives the identification information from two or more field devices, it checks in each case at what distance the sender of the identification information ID is located, in order to determine the nearest sender. This is done by means of an analysis of the signal strength.
  • If exactly one field device FG is within the specified radius, it is regarded as the closest field device FG. If several field devices FG are within the specified radius, the operating unit BE requests a selection by the operator of the field device FG to be identified.
  • By means of its image acquisition unit BD, for example a camera, the operating unit BE detects a marker MK attached to the field device FG.
  • The marker MK is designed in the form of a clamp and is temporarily attached to a defined location on the field device FG.
  • The marker MK has a two-dimensional structure ST.
  • The structure is rectangular and has defined side lengths in both dimensions x, y. Furthermore, the structure has a plurality of encodings CD.
  • The operating unit BE analyzes the detected structure ST and uses the encodings CD to determine the type of the marker MK. For this purpose, the operating unit BE accesses a database DB on which information about several types of markers MK is located. Once the type of marker MK has been determined, the operating unit BE retrieves a reference structure for this type of marker MK.
  • The detected structure ST is compared with the reference structure by means of image processing algorithms.
  • A perspective distortion of the detected structure is determined in comparison to the reference structure, the distortion being determined for both dimensions x, y of the structure.
  • From this, a relative position, in particular a relative angle of inclination, of the marker MK to the operating unit BE is determined.
  • The position and orientation of the marker MK with respect to the operating unit BE are expressed in location coordinates of a relative coordinate system of the operating unit BE.
  • The operating unit BE uses position sensors, a camera system, a lidar system and/or at least one gyroscope to determine an absolute position of the operating unit BE in three-dimensional space.
  • The absolute position of the operating unit BE includes the inclination of the operating unit BE and its respective location, which the operating unit detects, for example, using GPS sensors.
  • The operating unit BE determines the absolute position of the marker MK in three-dimensional space by combining the relative position of the marker MK to the operating unit BE with the absolute position of the operating unit BE in three-dimensional space. By determining its absolute position in three-dimensional space, the operating unit BE defines an absolute coordinate system. The location coordinates of the structure ST of the marker MK are then transformed into the absolute coordinate system, thereby determining the position of the marker MK and, if appropriate, its orientation in relation to the operating unit BE.
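The transformation of the marker's location coordinates from the operating unit's relative coordinate system into the absolute coordinate system can be sketched as follows, restricted for brevity to a yaw rotation; a real implementation would apply the full 3x3 attitude rotation obtained from all position sensors:

```python
import math

def to_absolute(marker_rel, unit_pos, unit_yaw_rad):
    """Transform a marker position given in the operating unit's relative
    coordinate frame into the absolute frame, using the unit's absolute
    position and heading (yaw only; a full implementation would use the
    complete attitude from gyroscopes and position sensors)."""
    x, y, z = marker_rel
    c, s = math.cos(unit_yaw_rad), math.sin(unit_yaw_rad)
    return (unit_pos[0] + c * x - s * y,
            unit_pos[1] + s * x + c * y,
            unit_pos[2] + z)

# Unit at (10, 5, 0), rotated 90 degrees; marker 2 m ahead in the unit frame:
abs_pos = to_absolute((2.0, 0.0, 0.0), (10.0, 5.0, 0.0), math.pi / 2)
print(tuple(round(v, 6) for v in abs_pos))  # (10.0, 7.0, 0.0)
```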
  • The method can also be shortened. In this case, too, the operating unit BE must determine its absolute position in three-dimensional space.
  • A time-of-flight camera KA then records the distance between the operating unit BE and the respective corner points of the structure and from this calculates the distance to the marker MK, or the absolute position and orientation of the marker MK in three-dimensional space.
  • Since the type of the marker MK also defines the position of the field device FG relative to the marker MK, the operating unit BE can finally determine the relative position of the field device FG to the operating unit BE, or the absolute position of the field device FG in three-dimensional space.
  • Subsequently, the actual augmented reality application can be executed on the operating unit BE, for example the aforementioned application to support the operator in the parameterization of the field device FG.
  • For this purpose, the identification information of the field device FG is transferred to the database DB.
  • The database DB also includes metadata MD of the field device FG, for example parameterization instructions. These are retrieved from the database DB and visualized on the operating unit BE.
  • In this example, the operating unit BE has a transparent pane GL through which the operator looks.
  • The metadata MD are now projected onto the pane GL so that they overlay the operator's field of vision.
  • The metadata overlay the operator's field of vision at defined positions; these positions are contained in the metadata MD themselves, are relative to the known position of the field device FG, and remain projected at the defined position relative to the field device FG even when the operator moves his head and thereby changes his field of vision.
  • Instead of, or in addition to, the metadata, a virtual, three-dimensional model of the field device FG can also be loaded, which is projected by the operating unit BE onto the position of the field device FG in the operator's field of vision.
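Placing an overlay at a position defined relative to the field device amounts to projecting a 3D anchor point into display coordinates. A minimal pinhole-camera sketch; the focal length and principal point are illustrative values, not parameters from the document:

```python
def project_to_display(point_rel, focal_px=800.0, cx=640.0, cy=360.0):
    """Project a 3D point, given in the operating unit's camera frame
    (x right, y down, z forward, metres), onto display pixel coordinates
    with a simple pinhole model. Intrinsics are illustrative assumptions."""
    x, y, z = point_rel
    if z <= 0:
        return None  # behind the operator: nothing to overlay
    return (cx + focal_px * x / z, cy + focal_px * y / z)

# Metadata anchored 0.5 m right of the optical axis, 2 m in front:
print(project_to_display((0.5, 0.0, 2.0)))  # (840.0, 360.0)
```

When the operator turns his head, the anchor point is re-expressed in the new camera frame and re-projected each frame, which is why the overlay appears to stay fixed to the field device.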
  • Alternatively, a mobile terminal can also be used as the operating unit BE.
  • A smartphone or tablet is suitable for this, as is a laptop with a webcam.
  • The method according to the invention is suitable for all types of field devices and is not restricted to level measuring devices.

Abstract

The invention relates to a method for operating a field device (FG) of automation technology in an augmented reality environment, comprising an operating unit (BE), the operating unit (BE) having an image capture unit (BD) and the field device (FG) transmitting identification data (ID) at regular and/or fixed time intervals via a wireless communication interface (KS), the method comprising: attaching a marker (MK) to the field device (FG), the marker (MK) having at least one defined two-dimensional structure (ST); detecting the structure (ST) of the marker (MK) by means of the image capture unit (BD); determining the position of the marker (MK) in three-dimensional space using the detected structure (ST) of the marker (MK); determining the type of the marker (MK) using the detected structure (ST) of the marker (MK); receiving the identification data (ID) of the field device (FG) by means of the operating unit (BE); determining a virtual, three-dimensional model of the field device (FG), or a model description containing metadata (MD) of the field device (FG), using the identification data (ID) for identifying the field device (FG); and overlaying the field device (FG) with the virtual model of the field device (FG), or with at least one item of the metadata (MD), in the operator's field of vision, in the event that the field device (FG) is located in the operator's viewing area, the position of the virtual model or of the metadata (MD) on the display unit being determined by an absolute position of the marker (MK) in three-dimensional space and by the type of the marker (MK).
PCT/EP2019/081214 2018-12-19 2019-11-13 Method for operating a field device of automation technology in an augmented reality/mixed reality environment WO2020126240A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102018132921.0 2018-12-19
DE102018132921.0A DE102018132921A1 (de) 2018-12-19 2018-12-19 Verfahren zum Betreiben eines Feldgeräts der Automatisierungstechnik in einer Augmented-Reality/Mixed-Reality- Umgebung

Publications (1)

Publication Number Publication Date
WO2020126240A1 (fr) 2020-06-25

Family

ID=68610209

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2019/081214 WO2020126240A1 (fr) 2018-12-19 2019-11-13 Procédé pour faire fonctionner un appareil de terrain de la technique d'automatisation dans un environnement de réalité augmentée/de réalité mixte

Country Status (2)

Country Link
DE (1) DE102018132921A1 (fr)
WO (1) WO2020126240A1 (fr)


Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102020120297A1 (de) * 2020-07-31 2022-02-03 Endress+Hauser Process Solutions Ag Verfahren zum Erweitern der Bedienfunktionalität eines Feldgeräts der Automatisierungstechnik von einem Bediengerät auf zumindest ein weiteres Bediengerät


Family Cites Families (3)

Publication number Priority date Publication date Assignee Title
US11064009B2 (en) * 2015-08-19 2021-07-13 Honeywell International Inc. Augmented reality-based wiring, commissioning and monitoring of controllers
HUE037927T2 (hu) * 2016-02-02 2018-09-28 Grieshaber Vega Kg Értékek proaktív átvitele mobil terminálra
DE102017130138A1 (de) * 2017-12-15 2019-06-19 Endress+Hauser SE+Co. KG Verfahren zur vereinfachten Inbetriebnahme eines Feldgeräts

Patent Citations (3)

Publication number Priority date Publication date Assignee Title
DE102009059823A1 (de) * 2009-12-21 2011-04-14 Deutsches Zentrum für Luft- und Raumfahrt e.V. Marker zur Bestimmung der Orientierung eines Objektes in einem Bild
EP3206174A1 (fr) * 2016-02-09 2017-08-16 Siemens Schweiz AG Procédé et système de mise en service d'un système de immotique
WO2018130337A1 (fr) * 2017-01-13 2018-07-19 Endress+Hauser SE+Co. KG Appareil de terrain de la technique de l'automatisation

Cited By (3)

Publication number Priority date Publication date Assignee Title
CN112917457A (zh) * 2021-01-27 2021-06-08 南京航空航天大学 一种基于增强现实技术的工业机器人快速精准示教系统及方法
CN112936261A (zh) * 2021-01-27 2021-06-11 南京航空航天大学 一种基于增强现实技术的工业机器人现场仿真系统与方法
CN112936261B (zh) * 2021-01-27 2022-07-08 南京航空航天大学 一种基于增强现实技术的工业机器人现场仿真系统与方法

Also Published As

Publication number Publication date
DE102018132921A1 (de) 2020-06-25


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19805611

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19805611

Country of ref document: EP

Kind code of ref document: A1