EP2132706A1 - Verfahren und vorrichtung zur generierung von trackingkonfigurationen für augmented-reality-anwendungen - Google Patents
Method and device for generating tracking configurations for augmented reality applications
- Publication number
- EP2132706A1 (application EP07712485A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- tracking
- module
- modules
- data
- memory
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Links
- 238000000034 method Methods 0.000 title claims abstract description 51
- 230000003190 augmentative effect Effects 0.000 title claims abstract description 21
- 230000008569 process Effects 0.000 claims description 10
- 230000003416 augmentation Effects 0.000 claims description 6
- 239000000463 material Substances 0.000 claims description 3
- 238000007794 visualization technique Methods 0.000 claims description 2
- 230000004807 localization Effects 0.000 claims 2
- 230000008901 benefit Effects 0.000 description 2
- 238000010276 construction Methods 0.000 description 2
- 230000001419 dependent effect Effects 0.000 description 2
- 239000011521 glass Substances 0.000 description 2
- 230000009467 reduction Effects 0.000 description 2
- 230000006399 behavior Effects 0.000 description 1
- 230000005540 biological transmission Effects 0.000 description 1
- 230000008094 contradictory effect Effects 0.000 description 1
- 238000011161 development Methods 0.000 description 1
- 230000018109 developmental process Effects 0.000 description 1
- 238000010586 diagram Methods 0.000 description 1
- 239000003814 drug Substances 0.000 description 1
- 230000007613 environmental effect Effects 0.000 description 1
- 230000006870 function Effects 0.000 description 1
- 230000003993 interaction Effects 0.000 description 1
- 230000003287 optical effect Effects 0.000 description 1
- 230000004044 response Effects 0.000 description 1
- 230000000630 rising effect Effects 0.000 description 1
- 239000004984 smart glass Substances 0.000 description 1
- 238000012800 visualization Methods 0.000 description 1
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30244—Camera pose
Definitions
- The invention relates to a method and a device for generating tracking configurations for augmented reality applications.
- Augmented reality applications relate to a form of human-machine interaction in which information is displayed to a person, for example via data glasses, in his field of view, thus augmenting the reality perceived by this person.
- For this purpose, tracking methods are used. These can be implemented with both hardware and software.
- Optical, inertial, acoustic and/or magnetic systems are used. These systems must be supplied with data in order to determine the position of the user.
- These data are, for example, three-dimensional models of an observed object, images of the observed object from different positions, distinctive points, lines or color regions to be recognized, or specially mounted coded markers. All information necessary for tracking is fed to the tracking process at the beginning.
- Augmented reality applications in industry, medicine and the consumer sector encounter many different tracking environments.
- The process of creating tracking configurations is complex and costly. This hinders the spread of augmented reality applications in the areas mentioned.
- Creating tracking configurations for a given environment is complicated, because known tracking methods use or evaluate the entire environment. Even small differences between two similar but not completely identical environments often make the tracking method fail, because, for example, contradictory data on the positions of prominent points, lines or color regions are present, so that consistency with the real scene cannot be achieved.
- Creating or authoring tracking configurations is based on generating parameter sets and data derived, for example, from reduced CAD models of the environment under consideration. For example, prominent points, edges and/or color regions are extracted and a subset of them is selected so that the tracking methods used can operate efficiently. Subsequently, the selection made has to be tested in the real environment and, if necessary, adapted to it. The creator or author must also select the distinctive points, edges and/or color regions so that they are evenly distributed in the real environment. Only then is the stability of the respective tracking method ensured. These steps require the author to have good knowledge of the behavior of the respective tracking method and involve considerable effort for the author. A high-quality overlay of the augmentations over the real environment is therefore highly dependent on the know-how and the care of the author.
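The even spatial distribution of features that the author must otherwise ensure by hand can be pictured as a simple grid-binning heuristic. The following is an illustrative sketch only, not the patent's method; the 2D simplification, function name and parameters are assumptions:

```python
from collections import defaultdict

def select_even_subset(points, bounds, grid=4, per_cell=1):
    """points: list of (x, y) feature positions; bounds: (xmin, ymin, xmax, ymax).
    Keeps at most per_cell points from each grid cell so the selected
    features are evenly distributed over the environment."""
    xmin, ymin, xmax, ymax = bounds
    cell_w = (xmax - xmin) / grid
    cell_h = (ymax - ymin) / grid
    cells = defaultdict(list)
    for x, y in points:
        # clamp to the last cell so points on the upper bound stay inside
        cx = min(int((x - xmin) / cell_w), grid - 1)
        cy = min(int((y - ymin) / cell_h), grid - 1)
        cells[(cx, cy)].append((x, y))
    chosen = []
    for cell_points in cells.values():
        chosen.extend(cell_points[:per_cell])  # drop crowded extras
    return chosen
```

A selection produced this way would still have to be verified against the real environment, as the passage above notes.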
- A system and a method for displaying augmented reality information are already known.
- Objects are captured as image information of a section of an environment by means of a camera.
- The detected objects are identified, and images of the detected objects are reconstructed in a virtual three-dimensional space on the basis of associated tracking information stored in the system.
- The location coordinates of the objects are calculated, and the position of the user and his viewing angle to the object are determined.
- User information is assigned to the location coordinates and is then placed in the correct position in the field of view of the user.
- The tracking information mentioned is read out contactlessly by means of a read/write device from at least one mobile data memory.
- The mobile data memory is attached to each object to be detected.
- The object of the invention is to provide a method and a device for generating tracking configurations for augmented reality applications which require less effort.
- The advantages of the invention are, in particular, that the generation of tracking configurations takes place automatically, based on known tracking data of individual modules or components present in the real environment. This is done in the sense of an online operation during the use of the system by the operator. Consequently, there is no need for an engineering step to generate tracking configurations. In practice, this means a substantial reduction in the effort required to generate tracking configurations.
- An online method requires only knowledge of the list of modules present in a real environment and the associated tracking data. This list does not have to be complete.
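The start-up information described above can be pictured as a plain mapping from module identifiers to supplier-provided tracking data. A minimal illustrative sketch; all identifiers and fields are assumptions, not taken from the patent:

```python
# A (possibly incomplete) list of modules present in the real
# environment, each with tracking data provided by its supplier.
module_list = {
    "numerical_control": {"edges": 24, "color_regions": ["grey"]},
    "io_module": {"edges": 12, "color_regions": ["green", "grey"]},
}

def tracking_data_for(module_id, modules):
    """Return the stored tracking data for a module, or None if the
    module is not in the (not necessarily complete) list."""
    return modules.get(module_id)
```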
- The tracking method according to the invention starts with a search for a first module in a camera-generated image of the existing real environment.
- The stored tracking data associated with the first module are compared with features contained in the camera image by means of a computer unit. If this initialization step is successfully completed, then, in the sense of tracking, a constant calculation of the position and orientation of the identified module in the camera image takes place. This allows a positionally correct display of augmentation information by means of data glasses, as long as the user views the augmentation information in the area of the detected module and the module remains in his field of vision.
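The initialization step above, comparing stored tracking data of the first module against features in the camera image, can be sketched as nearest-neighbor descriptor matching with a match-count threshold. This is a hedged illustration with simplified numeric descriptors, not the patent's actual tracking method; all names and thresholds are assumptions:

```python
def descriptor_distance(a, b):
    """Euclidean distance between two feature descriptors (tuples)."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def match_module(stored, image_feats, max_dist=0.5, min_matches=3):
    """stored/image_feats: lists of descriptor tuples.
    For each stored feature, find the nearest image feature; if enough
    features match within max_dist, initialization is considered
    successful and the index pairs are returned, otherwise None."""
    matches = []
    for i, s in enumerate(stored):
        j, d = min(((j, descriptor_distance(s, f))
                    for j, f in enumerate(image_feats)),
                   key=lambda t: t[1])
        if d <= max_dist:
            matches.append((i, j))
    return matches if len(matches) >= min_matches else None
```

In a real system the descriptors would come from an image-feature detector and the match set would feed the subsequent pose calculation.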
- In addition to the first module, further modules from the list of modules are searched for in the current camera image, this search being carried out using the stored tracking data assigned to these further modules.
- Each further module found is inserted into the coordinate system generated with respect to the first module.
- In this way, the features are given new positions in the real environment. This procedure is repeated until the entire real environment is reconstructed.
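The registration step above can be sketched as a rigid transform that expresses a further module's feature positions in the first module's coordinate system. For brevity the sketch works in 2D, whereas the patent uses a three-dimensional coordinate system; the function name and parameters are illustrative assumptions:

```python
import math

def to_first_module_frame(points, angle, tx, ty):
    """Express a further module's feature points in the first module's
    frame: rotate by `angle` (radians), then translate by (tx, ty)."""
    c, s = math.cos(angle), math.sin(angle)
    return [(c * x - s * y + tx, s * x + c * y + ty) for x, y in points]
```

Repeating this for every localized module gradually reconstructs the whole environment in one common coordinate system, as the passage describes.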
- As a result, the tracking is robust. Furthermore, the created tracking configurations can be stored for later use in tracking and also for documenting the actual structure of the real environment.
- The online method for generating tracking configurations described above advantageously allows a tracking construction kit to be built up and used over and over again in order to make any number of real environments trackable, i.e. to generate reliable tracking configurations for any real environment. All that is needed is information about which modules are installed in the respective real environment.
- This information is advantageously extracted from already existing CAD plans, parts lists or technical documentation, so that manual entry of this information is not necessary. This further reduces the effort required to generate tracking configurations.
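Extracting the module list from an already existing parts list might, for instance, look as follows for a CSV export. This is a sketch under assumed data formats; the column name `module_id` is purely illustrative:

```python
import csv
import io

def read_module_list(csv_text):
    """Extract the module identifiers from a CSV parts-list export,
    so the list does not have to be typed in manually."""
    rows = csv.DictReader(io.StringIO(csv_text))
    return [row["module_id"] for row in rows]
```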
- FIG 1 is a perspective sketch of a cabinet with modules arranged therein, and
- FIG 2 is a block diagram illustrating a method according to the invention.
- The invention relates to a method and a device in which tracking configurations for augmented reality applications are determined.
- The starting point for the method according to the invention is a real environment.
- In the embodiment shown in FIG 1, this real environment consists of a control cabinet 1 with modules arranged therein.
- Tracking data are also stored in a further memory 9. These tracking data are generated by the supplier of the respective module, made available to the operator of the control cabinet 1 in addition to a manual describing the module, and stored by the operator in the memory 9. For example, the module 2 is a numerical control, the module 3 an I/O module and the module 4 an operating control.
- The tracking data contain information about the respective module, for example information about edges, recesses, colored areas, etc.
- The operator starts an augmented reality-based application. He uses at least one camera 12, by means of which images of the real environment, for example images of the control cabinet 1 shown in FIG 1, are provided to a computer unit 11.
- The tasks of the computer unit 11 are to automatically carry out a tracking method 7, using the parts list of the modules 2, 3, 4 installed in the cabinet 1 stored in the memory 10 and the tracking data of the modules 2, 3, 4 stored in the memory 9, to convert the results provided by the tracking method 7 into tracking configurations in an authoring process 8, and to supply the tracking information determined to a visualization method 6.
- The task of the visualization method 6 is to forward augmented reality information in the correct position to the present augmented reality application 5.
- The augmented reality information mentioned is inserted in the correct position into the field of vision of a person wearing data glasses.
- Via an input E1, the operator of the system writes data corresponding to a parts list of the modules 2, 3, 4 installed in the cabinet 1 into the memory 10. This is done, for example, by means of a keyboard. Furthermore, the tracking data of the modules 2, 3, 4 provided by the suppliers of the modules are read into the memory 9 via the input E2. These tracking data are stored by the suppliers of the modules on a data carrier and passed from this data carrier via the input E2 to the memory 9 by the operator of the system.
- One of the installed modules 2, 3, 4, for example the module 2, is selected from the parts list stored in the memory 10.
- The tracking data of this module stored in the memory 9 are supplied to the tracking method 7 and compared, within this tracking method, with the image supplied by the camera 12 in order to identify the module 2 in this image and then to define a three-dimensional coordinate system associated with this module 2. If the module 2 is identified in the camera image, a pose is calculated by the computer unit 11. This pose defines the position and orientation of the camera 12 with respect to the detected real environment, in the present embodiment with respect to the cabinet 1.
- This pose, which is determined again at predetermined time intervals, is sent to the visualization method 6 and used by the latter for a positionally correct representation of the augmentation information in the present augmented reality application.
- Augmentation information, which is displayed to the user via data glasses as an overlay of his field of view onto his real environment, consists, for example, of service instructions relating to the modules of the cabinet 1.
- Subsequently, the computer unit 11 automatically selects another module from the parts list stored in the memory 10 as part of the authoring process 8.
- This is followed by a transmission of the tracking data of the further module, for example the module 3, stored in the memory 9 to the tracking method 7.
- The further module is then searched for in the image supplied by the camera 12. If the further module 3 is localized in the camera image, its features are transformed such that they are placed in the coordinate system of the first module 2.
- The tracking configurations determined with respect to the first module 2 and the second module 3 then form a unit and make it possible to track a larger space reliably.
- In the same way, the computer unit 11 selects, as part of the authoring process 8, further modules from the parts list stored in the memory 10, searches the image supplied by the camera 12 for these further modules and likewise registers them in the coordinate system of the first module 2.
- The search for further modules and their registration in the coordinate system of the first module 2 continues until a uniform coverage of the cabinet 1 is reached. This uniform coverage allows stable tracking.
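The uniform-coverage criterion above can be approximated by the fraction of grid cells over the environment that contain at least one registered module feature. A sketch under that assumption; the grid size and any stopping threshold are illustrative, not taken from the patent:

```python
def coverage_ratio(registered_points, bounds, grid=3):
    """Fraction of grid cells over the environment (bounds =
    (xmin, ymin, xmax, ymax)) that contain at least one registered
    feature point; registration could continue until this is high."""
    xmin, ymin, xmax, ymax = bounds
    cell_w = (xmax - xmin) / grid
    cell_h = (ymax - ymin) / grid
    occupied = {
        (min(int((x - xmin) / cell_w), grid - 1),
         min(int((y - ymin) / cell_h), grid - 1))
        for x, y in registered_points
    }
    return len(occupied) / (grid * grid)
```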
- The created tracking configurations are transmitted from the authoring method 8 or from the computer unit 11 to the memory 9 and stored there, so that they are available for later use.
- The selection of further modules from the parts list is preferably also carried out taking into account the additional criterion that primarily different modules are searched for in the camera image.
- This has the advantage that the assignment of localized modules to the modules of the list is unambiguous. If ambiguities nevertheless occur, they are resolved with the aid of additional features such as the existing network topology and the hardware configuration.
- This additional information is stored in the memory 10, for example, alongside the list of modules.
- Tracking data associated with the modules 2, 3, 4 and produced by the supplier of these modules can also be retrieved via the Internet from a database of the supplier and stored in the memory 9.
- The parts list can also be extracted from an already existing CAD plan, an already existing parts list or an already existing technical description.
- In summary, the present invention relates to a method for determining tracking configurations for augmented reality applications, in which the tracking configurations are automatically created based on known tracking data of individual modules of the real environment. This takes place in the sense of an online operation during the use of the system by the operator. Consequently, no engineering step is required to determine the tracking configurations for the entire existing real environment. This means a substantial reduction of the effort required to create tracking configurations.
- An online method according to the invention requires only knowledge of the list of modules existing in the present real environment and the associated tracking data. All other information needed for tracking is created automatically using this start-up information.
- An online method according to the invention thus allows the construction of a tracking building kit that can be used over and over again to make any real environment trackable. For this purpose, only information about the parts list of the modules installed in the respectively present real environment is required.
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/EP2007/052175 WO2008107021A1 (de) | 2007-03-08 | 2007-03-08 | Verfahren und vorrichtung zur generierung von trackingkonfigurationen für augmented-reality-anwendungen |
Publications (1)
Publication Number | Publication Date |
---|---|
EP2132706A1 (de) | 2009-12-16 |
Family
ID=38657545
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP07712485A Ceased EP2132706A1 (de) | 2007-03-08 | 2007-03-08 | Verfahren und vorrichtung zur generierung von trackingkonfigurationen für augmented-reality-anwendungen |
Country Status (3)
Country | Link |
---|---|
US (1) | US8390534B2 (de) |
EP (1) | EP2132706A1 (de) |
WO (1) | WO2008107021A1 (de) |
Families Citing this family (20)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8327279B2 (en) * | 2004-12-14 | 2012-12-04 | Panasonic Corporation | Information presentation device and information presentation method |
US8970690B2 (en) * | 2009-02-13 | 2015-03-03 | Metaio Gmbh | Methods and systems for determining the pose of a camera with respect to at least one object of a real environment |
JP5728159B2 (ja) | 2010-02-02 | 2015-06-03 | ソニー株式会社 | 画像処理装置、画像処理方法及びプログラム |
US9361729B2 (en) * | 2010-06-17 | 2016-06-07 | Microsoft Technology Licensing, Llc | Techniques to present location information for social networks using augmented reality |
JP5799521B2 (ja) * | 2011-02-15 | 2015-10-28 | ソニー株式会社 | 情報処理装置、オーサリング方法及びプログラム |
ES2570852T3 (es) * | 2011-05-23 | 2016-05-20 | Lego As | Un sistema de construcción de juguete para realidad aumentada |
KR101897311B1 (ko) | 2011-05-23 | 2018-10-24 | 레고 에이/에스 | 구축 요소 모델을 위한 조립 설명서 생성 |
US9674419B2 (en) | 2012-07-31 | 2017-06-06 | Hewlett-Packard Development Company, L.P. | Web-linked camera device with unique association for augmented reality |
US9448407B2 (en) * | 2012-12-13 | 2016-09-20 | Seiko Epson Corporation | Head-mounted display device, control method for head-mounted display device, and work supporting system |
US9818150B2 (en) | 2013-04-05 | 2017-11-14 | Digimarc Corporation | Imagery and annotations |
US9615177B2 (en) | 2014-03-06 | 2017-04-04 | Sphere Optics Company, Llc | Wireless immersive experience capture and viewing |
DE102015007624A1 (de) * | 2015-06-16 | 2016-12-22 | Liebherr-Components Biberach Gmbh | Verfahren zum Montieren von elektrischen Schaltanlagen sowie Montagehilfsvorrichtung zum Erleichtern der Montage solcher Schaltanlagen |
WO2016205386A1 (en) * | 2015-06-16 | 2016-12-22 | Briggs & Stratton Corporation | Lithium-ion battery |
US9964765B2 (en) * | 2015-09-11 | 2018-05-08 | The Boeing Company | Virtual display of the real-time position of a robotic device to a human operator positioned on an opposing side of an object |
US20180357922A1 (en) | 2017-06-08 | 2018-12-13 | Honeywell International Inc. | Apparatus and method for assessing and tracking user competency in augmented/virtual reality-based training in industrial automation systems and other systems |
US11874653B2 (en) * | 2020-10-29 | 2024-01-16 | Oliver Crispin Robotics Limited | Systems and methods of servicing equipment |
US11685051B2 (en) | 2020-10-29 | 2023-06-27 | General Electric Company | Systems and methods of servicing equipment |
US11935290B2 (en) | 2020-10-29 | 2024-03-19 | Oliver Crispin Robotics Limited | Systems and methods of servicing equipment |
US11938907B2 (en) | 2020-10-29 | 2024-03-26 | Oliver Crispin Robotics Limited | Systems and methods of servicing equipment |
US11915531B2 (en) | 2020-10-29 | 2024-02-27 | General Electric Company | Systems and methods of servicing equipment |
Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE102005046762A1 (de) * | 2005-09-29 | 2007-04-05 | Siemens Ag | System und Verfahren zur Darstellung von Benutzerinformationen, insbesondere von Augmented-Reality-Informationen, mit Hilfe von in RFID-Datenspeichern hinterlegten Trackinginformationen |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6765569B2 (en) * | 2001-03-07 | 2004-07-20 | University Of Southern California | Augmented-reality tool employing scene-feature autocalibration during camera motion |
RU2008110056A (ru) * | 2005-08-15 | 2009-09-27 | Конинклейке Филипс Электроникс Н.В. (Nl) | Система, устройство и способ для очков расширенной реальности для программирования конечным пользователем |
2007
- 2007-03-08 EP EP07712485A patent/EP2132706A1/de not_active Ceased
- 2007-03-08 US US12/530,256 patent/US8390534B2/en not_active Expired - Fee Related
- 2007-03-08 WO PCT/EP2007/052175 patent/WO2008107021A1/de active Application Filing
Non-Patent Citations (3)
Title |
---|
FRIEDRICH W: "ARVIKA, Augmented Reality für Entwicklung, Produktion und Service", part Seiten 52-56,92,93 2004, PUBLICIS CORPORATE PUBLISHING, ISBN: 3-89578-239-4 * |
PAELKE V., REIMANN, C.: "Authoring von Augmented-Reality-Anwendungen", DEUTSCHE GESELLSCHAFT FÜR KARTOGRAPHIE, KARTOGRAPHISCHE SCHRIFTEN, BAND 10: AKTUELLE ENTWICKLUNGEN IN GEOINFORMATION UND VISUALISIERUNG. BEITRÄGE DES SEMINARS GEOVIS 2006, vol. 10, 5 May 2006 (2006-05-05) - 6 April 2006 (2006-04-06), pages 37 - 45 * |
WEIDENHAUSEN J-M: "Mobile Mixed Reality Platform (Dissertation)", part Seiten 7-11,47,48,53-56,71,75,76,84,86,128-134 2006, UNIVERSITÄT DARMSTADT * |
Also Published As
Publication number | Publication date |
---|---|
US20100033404A1 (en) | 2010-02-11 |
US8390534B2 (en) | 2013-03-05 |
WO2008107021A1 (de) | 2008-09-12 |
Legal Events
Date | Code | Title | Description
---|---|---|---
| PUAI | Public reference made under article 153(3) EPC to a published international application that has entered the European phase | Free format text: ORIGINAL CODE: 0009012 |
| 17P | Request for examination filed | Effective date: 20090527 |
| AK | Designated contracting states | Kind code of ref document: A1; Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LI LT LU LV MC MT NL PL PT RO SE SI SK TR |
| 17Q | First examination report despatched | Effective date: 20100322 |
| DAX | Request for extension of the european patent (deleted) | |
| APBK | Appeal reference recorded | Free format text: ORIGINAL CODE: EPIDOSNREFNE |
| APBN | Date of receipt of notice of appeal recorded | Free format text: ORIGINAL CODE: EPIDOSNNOA2E |
| APBR | Date of receipt of statement of grounds of appeal recorded | Free format text: ORIGINAL CODE: EPIDOSNNOA3E |
| APAF | Appeal reference modified | Free format text: ORIGINAL CODE: EPIDOSCREFNE |
| RAP1 | Party data changed (applicant data changed or rights of an application transferred) | Owner name: SIEMENS AKTIENGESELLSCHAFT |
| RAP1 | Party data changed (applicant data changed or rights of an application transferred) | Owner name: SIEMENS AKTIENGESELLSCHAFT |
| REG | Reference to a national code | Ref country code: DE; Ref legal event code: R003 |
| APBT | Appeal procedure closed | Free format text: ORIGINAL CODE: EPIDOSNNOA9E |
| STAA | Information on the status of an EP patent application or granted EP patent | Free format text: STATUS: THE APPLICATION HAS BEEN REFUSED |
| 18R | Application refused | Effective date: 20180516 |