EP3811286A1 - Method and device for detecting objects on installations - Google Patents

Method and device for detecting objects on installations

Info

Publication number
EP3811286A1
Authority
EP
European Patent Office
Prior art keywords
image
images
dimensional representation
pcd
point cloud
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
EP19766186.1A
Other languages
German (de)
English (en)
Inventor
Josef Alois Birchbauer
Vlad Comanelea-Serban
Olaf KÄHLER
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Siemens Energy Global GmbH and Co KG
Original Assignee
Siemens Energy Global GmbH and Co KG
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Siemens Energy Global GmbH and Co KG filed Critical Siemens Energy Global GmbH and Co KG
Publication of EP3811286A1
Legal status: Pending

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/0002Inspection of images, e.g. flaw detection
    • G06T7/0004Industrial image inspection
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88Lidar systems specially adapted for specific applications
    • G01S17/89Lidar systems specially adapted for specific applications for mapping or imaging
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00Three dimensional [3D] modelling, e.g. data description of 3D objects
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/10Terrestrial scenes
    • G06V20/17Terrestrial scenes taken from planes or by drones
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/10Terrestrial scenes
    • G06V20/176Urban or other man-made structures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/60Type of objects
    • G06V20/64Three-dimensional objects
    • HELECTRICITY
    • H02GENERATION; CONVERSION OR DISTRIBUTION OF ELECTRIC POWER
    • H02GINSTALLATION OF ELECTRIC CABLES OR LINES, OR OF COMBINED OPTICAL AND ELECTRIC CABLES OR LINES
    • H02G1/00Methods or apparatus specially adapted for installing, maintaining, repairing or dismantling electric cables or lines
    • H02G1/02Methods or apparatus specially adapted for installing, maintaining, repairing or dismantling electric cables or lines for overhead lines or cables
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10004Still image; Photographic image
    • G06T2207/10012Stereo images
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10028Range image; Depth image; 3D point clouds
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10032Satellite or aerial image; Remote sensing
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20004Adaptive image processing
    • G06T2207/20012Locally adaptive
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30181Earth observation
    • G06T2207/30184Infrastructure

Definitions

  • The invention relates to a method according to the preamble of claim 1 and an arrangement according to the preamble of claim 10.
  • The operating equipment is recognized and its positions are determined taking into account the position of the first vehicle, detailed images of the equipment being generated by means of a second vehicle with a detail camera that is aimed at the respective equipment positions.
  • A single aircraft such as a drone or a helicopter is used to detect masts and insulators when flying over an overhead line using the overview camera, to determine the positions of the insulators, and then to obtain high-resolution images of the insulators using the detail camera. In this way, defective insulators can be identified easily and reliably.
  • For overhead lines, overflights with helicopters and image recordings have been used to detect damage or objects on the line.
  • The decision whether an object such as a bird's nest, a balloon or a kite (a child's toy) lies on the conductor cables or below on the ground is difficult to make from a pure aerial view and is prone to errors. So far, this has usually been done by manually evaluating the image recordings. If objects are mistakenly recognized as being on the line, the maintenance triggered as a result causes useless costs and effort.
  • Detections in one or more image recordings cannot always be clearly assigned to a specific 3D object. Since the individual 2D images do not contain any depth information, the distance to an object along the line of sight cannot be determined. Detections therefore cannot be restricted to the relevant areas of 3D space, which can lead to irrelevant false detections.
  • The object of the invention is to provide a method with which objects on systems can be recognized automatically and reliably.
  • The invention solves this problem with a method according to claim 1.
  • The problem of object recognition is solved by using the 3D information associated with the respective 2D points. Due to the parallax effect, objects below the system, such as an overhead line, appear in the images at different points relative to the line; a numerical sketch of this effect follows below. A parallax effect arises when an observer shifts his own position and an apparent change in the position of an object occurs. The effect of parallax is described in detail on Wikipedia, for example (permanent link:
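The following minimal Python sketch illustrates this parallax cue with a nadir-looking pinhole camera. The poses, heights and focal length are hypothetical values chosen purely for illustration, not parameters from the patent:

```python
import numpy as np

def project(point_w, cam_pos, f=1000.0):
    """Project a world point into a camera looking straight down.

    Assumes an axis-aligned nadir view; returns pixel offsets (u, v)
    from the principal point.
    """
    d = point_w - cam_pos       # vector from camera to point
    depth = -d[2]               # depth along the downward viewing axis
    return f * d[0] / depth, f * d[1] / depth

conductor = np.array([0.0, 0.0, 30.0])   # point on the line, 30 m above ground
ground_obj = np.array([0.0, 0.0, 0.0])   # object on the ground directly below

for cam in (np.array([-5.0, 0.0, 100.0]), np.array([5.0, 0.0, 100.0])):
    u_line, _ = project(conductor, cam)
    u_obj, _ = project(ground_obj, cam)
    # an object attached to the line would keep a zero offset in both views;
    # the ground object jumps from one side of the line to the other
    print(f"camera x={cam[0]:+.0f} m: object appears {u_obj - u_line:+.1f} px from the line")
```

Running this prints offsets of roughly -21 px and +21 px: the ground object changes sides relative to the conductor between the two viewpoints, which is exactly the behaviour the claimed comparison exploits.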
  • A system can be, for example, an electrical system such as an overhead power line or an overhead contact line. However, it can also be a pipeline.
  • An object can be a bird's nest, a car, or a kite.
  • The object of the invention is to use 3D information in conjunction with 2D detections in 3D space in order to reduce the false alarm rate compared to simple 2D detections.
  • With the invention it is possible to differentiate safely, quickly and automatically between objects on a system, i.e. above ground level, and objects below the system, i.e. close to ground level. This is an advantage because dangers or damage to a system such as an overhead line must be removed immediately by maintenance technicians. If an object such as a bird's nest or a kite is incorrectly recognized as being on the line, the line is switched off and/or maintenance is unnecessarily triggered, which causes costs and reduces the availability of the system.
  • The provided three-dimensional representation of the system is used to restrict a search space for the system, or to assign the recognized object to a component of the system in the three-dimensional representation.
  • The use of the already available three-dimensional representation makes it possible, in particular, to reduce the computing power required for image evaluation by restricting the search space, as sketched below.
  • A recognized object can be assigned to a component contained in the three-dimensional representation, so that relevant and irrelevant objects can be separated easily and reliably for further evaluation.
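A hedged sketch of such a search-space restriction, assuming a hypothetical per-view projection function `cam` that maps a 3D point to pixel coordinates; the corridor width is an illustrative choice, not a value from the patent:

```python
import numpy as np
from scipy.ndimage import binary_dilation

def search_mask(line_points_3d, cam, image_shape, margin=20):
    """Binary mask covering a corridor around the projected line geometry.

    The 2D detector is then evaluated only where the mask is True,
    instead of on the full image.
    """
    mask = np.zeros(image_shape, dtype=bool)
    for p in line_points_3d:
        r, c = cam(p)
        if 0 <= r < image_shape[0] and 0 <= c < image_shape[1]:
            mask[int(r), int(c)] = True
    # grow the projected points into a corridor of +/- margin pixels
    return binary_dilation(mask, iterations=margin)
```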
  • Both options have proven to be very beneficial. For example, the first option allows the false negative rate to be kept low in the case in question, at the expense of slightly higher computing power.
  • The second option allows, for example by means of a less computation-intensive pre-classification, potential problem areas to be identified while the images are being taken and, if necessary, a more precise inspection of these objects to be carried out automatically.
  • A combination of these two options has also proven to be advantageous.
  • The three-dimensional representation is recorded as a three-dimensional point cloud (PCD), the point cloud being semantically segmented in order to restrict a search space for the system within it.
  • PCD three-dimensional point cloud
  • A restriction of the search space in this way has proven to be particularly advantageous for typical applications, such as those that occur in the inspection of overhead lines.
  • The three-dimensional representation is obtained by means of a "light detection and ranging" (LIDAR) sensor and recorded as a three-dimensional point cloud (PCD).
  • LIDAR light detection and ranging
  • PCD three-dimensional point cloud
  • The three-dimensional point cloud is semantically segmented in order to restrict a search space for the system in the three-dimensional point cloud (PCD).
  • This is an advantage because the object recognition is restricted to the relevant area, which considerably reduces the computational requirements and/or increases the speed of the calculations. The complexity is reduced because the search space is restricted to relevant scene content. If the evaluation is carried out on board the aircraft, weight can also be saved in this embodiment because a less powerful computing device is required.
  • A typical example is LIDAR data of a high-voltage line, in which those points are determined (automatically) that belong to the overhead line cables or are approximated by a parametric model of a catenary; a sketch of such a fit follows below.
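A sketch of such a catenary-based segmentation on synthetic data; the sag parameters, noise levels and the inlier thresholds are illustrative assumptions, not values from the patent:

```python
import numpy as np
from scipy.optimize import curve_fit

def catenary(x, a, x0, c):
    # chain curve z(x) = c + a*cosh((x - x0)/a)
    return c + a * np.cosh((x - x0) / a)

rng = np.random.default_rng(0)
x = np.linspace(-40.0, 40.0, 400)                  # along-span coordinate in m
z = catenary(x, 80.0, 0.0, -50.0) + rng.normal(0.0, 0.05, x.size)
z[::20] -= rng.uniform(5.0, 25.0, z[::20].size)    # vegetation returns below the line

# least-squares fit, crude outlier rejection, then a refit on the inliers
params, _ = curve_fit(catenary, x, z, p0=(100.0, 0.0, -40.0), maxfev=10000)
keep = np.abs(z - catenary(x, *params)) < 2.0
params, _ = curve_fit(catenary, x[keep], z[keep], p0=params, maxfev=10000)

conductor = np.abs(z - catenary(x, *params)) < 0.5  # points close to the model
print(f"segmented {conductor.sum()} of {x.size} points as conductor")
```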
  • An example of a method for segmenting image data is known from the publication "Mask R-CNN" by Kaiming He et al.
  • In this restricted search area, a classic 2D detector is used that is pre-trained for certain damage classes.
  • Alternatively, in an anomaly detection, a model of the normal conductor region is determined automatically (e.g. using autoencoders) and outliers are detected; a sketch of this reconstruction-based approach follows below. Both approaches produce detections in the image space.
  • The image space need not be limited to visible light, but can also extend to adjacent spectral ranges such as (thermal) infrared and ultraviolet light.
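The patent text names autoencoders as one possibility; the sketch below uses a PCA reconstruction as a lightweight stand-in for the same reconstruction-error principle. The patch statistics and the simulated foreign object are purely illustrative:

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)
# stand-in training data: flattened 16x16 image crops of intact conductor regions
normal_patches = rng.normal(0.5, 0.02, (500, 16 * 16))
model = PCA(n_components=8).fit(normal_patches)     # model of the norm

def anomaly_score(patch):
    """Mean squared reconstruction error of one flattened patch."""
    rec = model.inverse_transform(model.transform(patch[None]))[0]
    return float(np.mean((patch - rec) ** 2))

test = normal_patches[0].copy()
test[:64] = 1.0                                     # simulated foreign object
print(anomaly_score(normal_patches[1]), anomaly_score(test))  # low vs. clearly higher
```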
  • The detection responses or pixel color values in the individual images are evaluated for each 3D point of the system; they likewise need not be limited to visible light, but can also extend to adjacent spectral ranges such as (thermal) infrared and ultraviolet light.
  • The 3D point cloud is optionally semantically segmented (optional, since the entire PCD can also be semantically relevant for the inspection task);
  • detection results are either generated selectively, or the existing detection results are read out (the latter if the detection in the image space was carried out across the whole image);
  • the remaining 3D points can then be projected back into the image space, yielding the final detection result in the image space. A condensed sketch of this pipeline follows below.
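Condensing these steps, a hypothetical pipeline could look as follows; the camera models and detector response maps are placeholders for whatever detector and calibration are actually used:

```python
import numpy as np

def consistent_detections(points_3d, cameras, detection_maps, thresh=0.5):
    """Keep 3D points whose detector responses agree across all views.

    cameras: per-view functions mapping a 3D point to (row, col) pixels.
    detection_maps: per-view 2D arrays of detector responses in [0, 1].
    """
    kept = []
    for p in points_3d:
        responses = []
        for cam, det in zip(cameras, detection_maps):
            r, c = cam(p)
            r, c = int(round(r)), int(round(c))
            if 0 <= r < det.shape[0] and 0 <= c < det.shape[1]:
                responses.append(det[r, c])
        # an object truly at this 3D position responds in every view;
        # a parallax ghost only responds in some of them
        if responses and min(responses) >= thresh:
            kept.append(p)
    return np.asarray(kept)
```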
  • An overhead line is used as the system, and the semantic segmentation is carried out by using a catenary model for detecting the conductor cables of the overhead line.
  • The position and orientation of the representation are determined by means of a position determining device. This can be done, for example, by means of a receiver for "Global Positioning System" (GPS) signals, the orientation depending on the viewing direction of the sensor arrangement (LIDAR or camera). The viewing direction can be determined, for example, by means of a tilt sensor in conjunction with a compass, which are provided in the aircraft.
  • GPS Global Positioning System
  • The images are recorded by means of a camera for visible light.
  • The light visible to humans is usually specified with wavelengths between 380 nm and 780 nm (permanent link:
  • The camera is guided along the system with an aircraft in order to record the first and the second image at the two different positions.
  • The three-dimensional representation of the system is projected into the two images in order to determine the sections from each.
  • In one embodiment, the evaluation device is provided on board the aircraft. This is an advantage because evaluation and object detection can take place directly during an overflight.
  • The images and coordinates of the recognized objects can be saved and transmitted to the operator of the system after the flight.
  • Alternatively, the detected objects can be transmitted by radio data communication during the flight.
  • In another embodiment, the evaluation device is provided as a central server. This is an advantage because it saves weight and space in the aircraft. For example, all the data recorded by the camera and the LIDAR can be temporarily stored on a data memory and, after the end of the flight, transmitted to the evaluation device for evaluation. Alternatively, the data can be transmitted to the evaluation device by means of radio data communication even during the flight.
  • Figure 1 shows an example of a semantic segmentation of lidar image data
  • Figure 2 shows an example of images of an overhead line in different frequency ranges
  • Figure 3 shows an example of anomaly detection of objects on an overhead line
  • Figure 4 shows an example of a detection of the position of objects relative to the line
  • Figure 1 shows an example of a semantic segmentation of lidar image data.
  • The viewing angle φ of the LIDAR with respect to the location coordinate x is shown.
  • A color scale 3 shows how strongly the LIDAR signals were received. It can be seen that, after a successful segmentation of the overhead line cables, line 1 is highlighted using a model of a catenary function. The other lines 2 remain in the background.
  • Figure 2 shows an example of images of an overhead line in different frequency ranges.
  • An image in the visible frequency range (VIS), in the infrared frequency range (IF) and in the ultraviolet frequency range (UV) is shown from left to right.
  • VIS visible frequency range
  • IF infrared frequency range
  • UV ultraviolet frequency range
  • Figure 3 shows an example of an anomaly detection of artificially inserted objects on an overhead line.
  • The picture is taken from above during an overflight.
  • Conductor cables 1 run over wooded areas and a road 4, which forks in the upper part of the picture.
  • A car is located on the road, and a kite 6 is arranged on one of the conductor cables.
  • The evaluation algorithm correctly marks the objects as deviating from the expected course of the cables.
  • However, the algorithm cannot easily obtain depth information, i.e. it cannot decide whether the car, and in particular whether the kite, is on the line or below it on the ground.
  • Figure 4 shows two scenes side by side.
  • Two masts 9 each carry an overhead line.
  • Trees 10 can be seen below the overhead line.
  • A first and a second image are recorded at two different positions 7, 8 during an overflight of the line.
  • Both images target a section 11 of the line in the line of sight. If an object is arranged directly at or on the line, the object appears at the same location on the line from both perspectives. It is different in the right picture for tree 10.
  • Tree 10 does not appear at the same location on the line: because of the parallax effect it appears on section 11 from viewing direction 7 and on section 12 from viewing direction 8. This means that the tree 10 cannot be arranged at the height of the line, but must be below it. This principle enables a simple automated distinction between objects arranged at or on a system and objects arranged on the ground; a sketch of this decision follows below.
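Reduced to the decision illustrated here, the rule can be stated in a few lines; the section indices mirror reference numerals 11 and 12 of Figure 4 and stand for whichever line sections the detections are mapped to:

```python
def on_line(section_view1: int, section_view2: int, tol: int = 0) -> bool:
    """True if a detection maps to the same line section from both viewpoints."""
    return abs(section_view1 - section_view2) <= tol

print(on_line(11, 11))  # object on the conductor (e.g. a kite): True
print(on_line(11, 12))  # tree seen on sections 11 vs. 12: False -> below the line
```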

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Remote Sensing (AREA)
  • Quality & Reliability (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Electromagnetism (AREA)
  • Computer Graphics (AREA)
  • Geometry (AREA)
  • Software Systems (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Image Analysis (AREA)

Abstract

The object of the present invention is a method for recognizing objects on installations, comprising the following steps: providing a three-dimensional representation of the installation, the position and orientation of the representation and of the installation being known, and acquiring a first image and a second image of the installation, the two images being taken from different positions above the installation, characterized in that a comparison of the first and the second image is carried out for several sections of the installation, in each case using a parallax effect, an object on the installation being recognized if the images are in agreement in a zone surrounding the installation. The invention also relates to a corresponding arrangement.
EP19766186.1A 2018-08-21 2019-08-20 Method and device for detecting objects on installations Pending EP3811286A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
EP18190055.6A 2018-08-21 2018-08-21 Method and device for detecting objects in installations
PCT/EP2019/072269 2018-08-21 2019-08-20 Method and device for detecting objects on installations

Publications (1)

Publication Number Publication Date
EP3811286A1 true EP3811286A1 (fr) 2021-04-28

Family

ID=63350461

Family Applications (2)

Application Number Title Priority Date Filing Date
EP18190055.6A 2018-08-21 2018-08-21 Method and device for detecting objects in installations Withdrawn EP3614299A1 (fr)
EP19766186.1A 2018-08-21 2019-08-20 Method and device for detecting objects on installations Pending EP3811286A1 (fr)

Family Applications Before (1)

Application Number Title Priority Date Filing Date
EP18190055.6A 2018-08-21 2018-08-21 Method and device for detecting objects in installations Withdrawn EP3614299A1 (fr)

Country Status (4)

Country Link
US (1) US11989870B2 (fr)
EP (2) EP3614299A1 (fr)
BR (1) BR112021002143A2 (fr)
WO (1) WO2020038944A1 (fr)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10748038B1 (en) * 2019-03-31 2020-08-18 Cortica Ltd. Efficient calculation of a robust signature of a media unit
US11703457B2 (en) * 2020-12-29 2023-07-18 Industrial Technology Research Institute Structure diagnosis system and structure diagnosis method
NO347027B1 (en) * 2021-06-02 2023-04-24 Kleon Solutions As Method and system for detecting a line above ground from a helicopter

Family Cites Families (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020014533A1 (en) 1995-12-18 2002-02-07 Xiaxun Zhu Automated object dimensioning system employing contour tracing, vertice detection, and forner point detection and reduction methods on 2-d range data maps
WO2013029675A1 (fr) 2011-08-31 2013-03-07 Metaio Gmbh Method for estimating a camera movement and for determining a three-dimensional model of a real environment
CN102930246B (zh) 2012-10-16 2015-04-08 Tongji University Indoor scene recognition method based on point cloud segment segmentation
JP2014089548A (ja) 2012-10-30 2014-05-15 Sharp Corp Road surface step detection method, road surface step detection device, and vehicle equipped with the road surface step detection device
US9449227B2 (en) * 2014-01-08 2016-09-20 Here Global B.V. Systems and methods for creating an aerial image
EP3228984B1 (fr) 2014-05-05 2023-07-05 Hexagon Technology Center GmbH Land surveying system
JP6397801B2 (ja) 2015-06-30 2018-09-26 Hitachi Automotive Systems Ltd Object detection device
US10970877B2 (en) 2015-09-30 2021-04-06 Sony Corporation Image processing apparatus, image processing method, and program
JP6299720B2 (ja) 2015-10-02 2018-03-28 Toyota Motor Corp Object recognition device and smoke determination method
US11156573B2 (en) * 2016-06-30 2021-10-26 Skydio, Inc. Solar panel inspection using unmanned aerial vehicles
JP6794243B2 (ja) 2016-12-19 2020-12-02 Hitachi Automotive Systems Ltd Object detection device
CN107729878A (zh) 2017-11-14 2018-02-23 Zhiche Youxing Technology (Beijing) Co Ltd Obstacle detection method and device, equipment, vehicle, program and storage medium
CN108229548A (zh) 2017-12-27 2018-06-29 Huawei Technologies Co Ltd Object detection method and device
CN108364304A (zh) 2018-04-11 2018-08-03 Hunan City University System and method for monocular airborne target detection

Also Published As

Publication number Publication date
BR112021002143A2 (pt) 2021-05-04
US20210174061A1 (en) 2021-06-10
EP3614299A1 (fr) 2020-02-26
US11989870B2 (en) 2024-05-21
WO2020038944A1 (fr) 2020-02-27
CN112639803A (zh) 2021-04-09

Similar Documents

Publication Publication Date Title
DE102009015142B4 (de) Vehicle surroundings recognition device and control system for following a preceding vehicle
DE102014209137B4 (de) Method and device for calibrating a camera system of a motor vehicle
EP3376213A1 (fr) Method and system for monitoring the condition of an installation comprising operating equipment
EP3811286A1 (fr) Method and device for detecting objects on installations
DE102011111440A1 (de) Method for representing the surroundings
EP3782117A1 (fr) Method, device and computer-readable storage medium with instructions for processing sensor data
DE102013012930A1 (de) Method for determining a current distance and/or a current speed of a target object on the basis of a reference point in a camera image, camera system and motor vehicle
WO2020048669A1 (fr) Method for determining lane change information of a vehicle, computer-readable storage medium, and vehicle
WO2018095640A1 (fr) Method and system for detecting a raised object located within a parking lot
DE102016223094A1 (de) Method and system for detecting a raised object located within a parking lot
AT511460A4 (de) Method for determining the position of an aircraft
WO2020038984A1 (fr) Method and arrangement for detecting corona discharges of an installation comprising operating equipment
WO2023083620A1 (fr) Testing of the environment sensor system and/or environment perception of a vehicle
DE102018121158A1 (de) Method for detecting ground scan points and driver assistance system configured to carry out such a method
DE102018202753A1 (de) Method for determining a distance between a motor vehicle and an object
DE102022103324A1 (de) System and method for detecting inconsistencies in the perception systems of automated vehicles
DE102019220616B4 (de) Method for simultaneous localization and mapping
DE102020200876B4 (de) Method for processing sensor data from a sensor system of a vehicle
DE102020127315A1 (de) System and method for annotating automotive radar data
DE102019127140A1 (de) Determining a pitch angle of an environment sensor of a vehicle
DE102022201639B3 (de) Method for collision avoidance for watercraft and device for carrying out the method
DE102019102423A1 (de) Method for live annotation of sensor data
DE102016223144A1 (de) Method and system for detecting a raised object located within a parking lot
DE10065180A1 (de) Sensor system with optical sensors for collision avoidance of aircraft and a method for carrying it out
DE102016110691A1 (de) Device and method for determining the respective position of a plurality of aircraft located on an airport surface

Legal Events

Date Code Title Description
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: UNKNOWN

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE

PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20210121

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

DAV Request for validation of the european patent (deleted)
DAX Request for extension of the european patent (deleted)
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

17Q First examination report despatched

Effective date: 20230201