EP3594624B1 - Method for measuring the distance traveled and system for measuring the distance traveled - Google Patents

Method for measuring the distance traveled and system for measuring the distance traveled

Info

Publication number
EP3594624B1
EP3594624B1 (application EP19179365.2A)
Authority
EP
European Patent Office
Prior art keywords
images
sensor device
distance
semantic information
determined
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
EP19179365.2A
Other languages
German (de)
English (en)
Other versions
EP3594624A3 (fr)
EP3594624C0 (fr)
EP3594624A2 (fr)
Inventor
Dirk Raproeger
Paul Robert Herzog
Lidia Rosario Torres Lopez
Uwe Brosch
Paul-Sebastian Lauer
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Robert Bosch GmbH
Original Assignee
Robert Bosch GmbH
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Robert Bosch GmbH filed Critical Robert Bosch GmbH
Publication of EP3594624A2
Publication of EP3594624A3
Application granted
Publication of EP3594624B1
Publication of EP3594624C0
Legal status: Active

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01P - MEASURING LINEAR OR ANGULAR SPEED, ACCELERATION, DECELERATION, OR SHOCK; INDICATING PRESENCE, ABSENCE, OR DIRECTION, OF MOVEMENT
    • G01P 3/00 - Measuring linear or angular speed; Measuring differences of linear or angular speeds
    • G01P 3/64 - Devices characterised by the determination of the time taken to traverse a fixed distance
    • G01P 3/68 - Devices characterised by the determination of the time taken to traverse a fixed distance using optical means, i.e. using infrared, visible, or ultraviolet light
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 22/00 - Measuring distance traversed on the ground by vehicles, persons, animals or other moving solid bodies, e.g. using odometers, using pedometers
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01P - MEASURING LINEAR OR ANGULAR SPEED, ACCELERATION, DECELERATION, OR SHOCK; INDICATING PRESENCE, ABSENCE, OR DIRECTION, OF MOVEMENT
    • G01P 3/00 - Measuring linear or angular speed; Measuring differences of linear or angular speeds
    • G01P 3/36 - Devices characterised by the use of optical means, e.g. using infrared, visible, or ultraviolet light
    • G01P 3/38 - Devices characterised by the use of optical means, e.g. using infrared, visible, or ultraviolet light using photographic means
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 - Image analysis
    • G06T 7/20 - Analysis of motion
    • G06T 7/269 - Analysis of motion using gradient-based methods
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 - Image acquisition modality
    • G06T 2207/10016 - Video; Image sequence
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T 2207/20 - Special algorithmic details
    • G06T 2207/20084 - Artificial neural networks [ANN]
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T 2207/30 - Subject of image; Context of image processing
    • G06T 2207/30241 - Trajectory
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T 2207/30 - Subject of image; Context of image processing
    • G06T 2207/30248 - Vehicle exterior or interior

Definitions

  • The invention relates to a method for measuring the distance traveled.
  • The invention further relates to a system for measuring the distance traveled.
  • Although the present invention is generally applicable to any system for distance measurement, it is described here in relation to optical distance measurement using a camera in a driver assistance system.
  • Optical systems for distance measurement are used, for example, in driver assistance systems. Such a system assumes, for example, that the optical system moves over a surface at a constant height. By observing the ground with a camera, for example in images taken at different times, and using the height of the camera above the ground, the distance traveled can be determined.
  • The document DE 10 2016 223 435 A1 discloses distance and speed measurement using image recordings, in particular for determining the distance traveled by a track-bound vehicle.
  • The document EP 3 040 726 A1 discloses a method for determining a vehicle speed, wherein the speed and/or direction of movement of the vehicle is determined based on one or more differences in the image data.
  • The document WO 2012/168424 A1 discloses a method for locating and/or measuring the speed of a vehicle traveling along a railroad track formed of two rails.
  • The document WO 2006/063546 A1 discloses a method for determining a speed of a vehicle, wherein at least two images of the vehicle environment are recorded one after the other using a mono camera of the vehicle, changes in the position and/or size of at least one object contained in the images are determined between the images, and a speed of the vehicle relative to the object is determined from these changes.
  • The document WO 2007/091072 A1 discloses a system for measuring the speed and/or determining the position of a train.
  • The invention provides a system for distance measurement, comprising: a non-contact sensor device for recording at least two time-sequential images; a semantic unit for assigning semantic information to at least one image area in the at least two images; a selection unit for selecting, based on the assigned semantic information, at least one partial area within the image areas of the at least two images that is suitable for determining the distance; a flow unit for determining the optical flow based on the at least one selected partial area in the at least two images; and a path measuring unit for determining the distance traveled based on the determined optical flow.
  • One advantage achieved in this way is that complex identification of disturbances in the final measurement signal used for determining the distance can be avoided.
  • Another advantage is the fast and reliable determination of the distance traveled.
  • Semantic information is assigned to different image areas using an artificial neural network. This allows fast and reliable assignment of semantic information to the different image areas.
  • The at least one partial area selected is one that shows a ground plane in the images. This improves the reliability and accuracy of the distance measurement.
  • The recorded images are provided in digital form, with the semantic information being assigned region by region, preferably with pixel precision. Approximations of the regions are possible here, such as determining polygons that delimit areas in the image and/or assigning semantic information to such areas. This further increases the reliability and robustness of the determination of the distance traveled, pixel-precise semantic information in particular.
  • The distance traveled is determined based on the height of the non-contact sensor device above a ground plane, the determined optical flow and the focal length of the non-contact sensor device. This allows the distance traveled to be determined both quickly and easily.
  • The non-contact sensor device is provided with a camera. A camera allows reliable recording of images. An infrared camera and/or a camera for the visible wavelength range can be provided here.
  • The non-contact sensor device comprises at least one optical sensor.
  • The optical sensor is a camera, in particular a near-field digital camera. This allows time sequences of images to be provided quickly and easily.
  • The semantic unit has an artificial neural network, in particular a convolutional neural network. This enables particularly reliable recognition of semantically different areas in the images.
  • Figure 1 shows a system according to an embodiment of the present invention.
  • A system 1 is shown with an optical sensor device 2, comprising a camera 2a and a calibration unit 2b.
  • Images 100' are recorded one after the other, so that an input image sequence 100 results.
  • The input image sequence 100 is transmitted to a neural network 3, which assigns semantic information to the images 100' of the input image sequence 100 and creates a semantic image sequence 101 with images 101' carrying the added semantic information.
  • The input image sequence 100 is further transmitted to a flow unit 4, which determines the optical flow in the images 100' of the input image sequence 100.
  • From the input image sequence 100, the flow unit 4 determines images 102' which represent the optical flow, so that an optical-flow image sequence 102 is created.
  • A ground selection unit 5 determines the corresponding areas in the images that show the ground, a ground plane or the like.
  • The ground selection unit 5 transmits the images to a ground flow unit 6, which determines the optical flow on the ground, the ground flow, based on the corresponding areas in the images.
  • The ground flow is transmitted, together with information from the calibration unit 2b, for example the height of the camera 2a above the ground, to a distance determination unit 7, which then determines the distance traveled based on the information transmitted to it.
  • Figure 2 shows a captured image provided with semantic information according to an embodiment of the present invention.
  • In Figure 2, an image recorded by a camera is shown, which essentially shows the environment in front of a vehicle. Other vehicles, houses, trees, a parking lot and a street can be seen in the image.
  • The semantic unit 3, in the form of the neural network, now evaluates the image: an area 20 is provided with semantic information, which is indicated in Figure 2 by different hatching within the area 20.
  • The vehicles 21, a ground area 23 and the wider surroundings 22 are hatched differently. Of course, other forms of identification are also possible, for example different colors or the like.
  • The ground area 23 can further be divided into smaller partial areas 24. In particular, using the close range of the camera increases the robustness of the system and can be used to determine a partial area 24.
  • Figure 3 shows in schematic form a projection rule for a camera according to an embodiment of the present invention.
  • A camera with projection center 40 has a viewing ray 31b, which can always be placed along the optical axis of the camera. This means that the intercept theorem can be applied to the rays of a perspective camera in order to determine a distance.
  • A camera at the projection center 40 takes images at different times along the viewing rays 31a, 31b.
  • The viewing rays 31a, 31b sweep over the ground (ground plane 42).
  • The optical flow 32 can be determined from the images in the image plane 41.
  • The distance traveled 30 can then be determined using the intercept theorem.
  • The optical flow is determined by comparing two consecutive images, and the distance traveled is determined based on the intercept theorem.
  • Correspondences between the two images that refer to the physically same location are referred to as the optical flow.
  • Figure 4 shows steps of a method according to an embodiment of the present invention.
  • In a first step S1, at least two time-sequential images 100 are recorded using an optical sensor device 2, 2a.
  • In a second step S2, semantic information 21, 22, 23 is assigned to at least one image area 20 in the at least two images.
  • In a third step S3, at least one partial area 24 in one or more image areas of the at least two images 101, which is suitable for determining the distance traveled, is selected based on the assigned semantic information. Further partial areas 24 can optionally be selected, particularly in the close range of the optical sensor device, in particular a camera.
  • In a fourth step S4, the optical flow 32 is determined.
  • The optical flow 32 can in particular be evaluated either in the entire selected area 23 or only in one or more of the selected partial areas 24 in the at least two images 102.
  • In a fifth step S5, the distance traveled 30 is determined based on the determined optical flow and, in particular, the camera height and its focal length, as illustrated in the code sketch following this description.
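
The processing chain of Figures 1 and 4 can be illustrated with a short sketch: select ground-plane pixels semantically, compute the optical flow on that selection, and convert the flow into a metric distance via the camera height and focal length. The following Python snippet is an illustrative sketch only, not the patent's reference implementation. It assumes a calibrated camera looking straight down at the ground plane, uses OpenCV's Farneback dense optical flow in place of an unspecified flow method, and replaces the neural-network semantic unit with a hypothetical ground_mask() placeholder; the numeric constants are invented example values.

```python
# Illustrative sketch of the distance-measurement pipeline (ground selection,
# optical flow, intercept-theorem conversion). Assumptions are noted inline.
import cv2
import numpy as np

CAMERA_HEIGHT_M = 0.5    # assumed height of the camera above the ground plane
FOCAL_LENGTH_PX = 800.0  # assumed focal length in pixels (from calibration)


def ground_mask(gray: np.ndarray) -> np.ndarray:
    """Hypothetical stand-in for the semantic unit: True for ground-plane pixels.
    Here simply the lower image half; a real system would use a CNN segmentation."""
    mask = np.zeros(gray.shape, dtype=bool)
    mask[gray.shape[0] // 2:, :] = True
    return mask


def distance_between_frames(prev_gray: np.ndarray, curr_gray: np.ndarray) -> float:
    """Distance traveled between two consecutive frames, in metres."""
    # Dense optical flow over the whole image (flow unit).
    flow = cv2.calcOpticalFlowFarneback(
        prev_gray, curr_gray, None,
        pyr_scale=0.5, levels=3, winsize=21,
        iterations=3, poly_n=5, poly_sigma=1.2, flags=0)

    # Keep only flow vectors on the ground plane (selection unit / ground flow).
    ground_flow = flow[ground_mask(prev_gray)]   # shape (N, 2): (du, dv) per pixel
    if ground_flow.size == 0:
        return 0.0

    # Robust summary of the image-plane shift in pixels.
    shift_px = float(np.median(np.linalg.norm(ground_flow, axis=1)))

    # Intercept theorem: metric distance = pixel shift * height / focal length.
    return shift_px * CAMERA_HEIGHT_M / FOCAL_LENGTH_PX


def accumulate_distance(frames) -> float:
    """Path measuring unit: sum the per-frame distances over an image sequence."""
    total, prev = 0.0, None
    for frame in frames:
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        if prev is not None:
            total += distance_between_frames(prev, gray)
        prev = gray
    return total
```

For a forward-looking camera such as the one behind Figure 2, the ground flow is not uniform across the image, so a real implementation would evaluate the flow per viewing ray as in Figure 3 rather than summarising it with a single median; the sketch only illustrates how data moves between the units 2 to 7.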

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Remote Sensing (AREA)
  • Power Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Electromagnetism (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Image Analysis (AREA)
  • Measurement Of Optical Distance (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Claims (7)

  1. Method for measuring the distance traveled, comprising the following steps:
    - recording (S1) at least two time-sequential images (100) by means of a non-contact sensor device (2, 2a),
    - assigning (S2) semantic information (21, 22, 23) to at least one image area (20) in the at least two images, the assignment (S2) of semantic information to different image areas being carried out by means of an artificial neural network (3),
    - selecting (S3), on the basis of the assigned semantic information, at least one partial area (24) in one or more image areas of the at least two images (101) which is suitable for determining the distance traveled, the at least one partial area (24) selected being one that shows a ground plane in the images (102),
    - determining (S4) the optical flow (32) on the basis of the at least one selected partial area (24) in the at least two images (102), and
    - determining (S5) the distance traveled (30) on the basis of the determined optical flow, the distance traveled (30) being determined on the basis of a height (43) of the non-contact sensor device (2, 2a) above the ground plane (42), of the determined optical flow (32) and of the focal length (40a) of the non-contact sensor device (2, 2a).
  2. Method according to claim 1, wherein several partial areas (24) are used for the determination (S4) of the optical flow (32).
  3. Method according to either of claims 1 and 2, wherein the recorded images (100) are provided in digital form and the semantic information is assigned (S2) region by region, preferably with pixel precision.
  4. Method according to one of claims 1 to 3, wherein the non-contact sensor device (2, 2a) is provided with a camera (2a).
  5. System for measuring the distance traveled, comprising
    - a non-contact sensor device (2, 2a) for recording at least two time-sequential images (100), the non-contact sensor device (2, 2a) comprising at least one camera (2a),
    - a semantic unit (3) for assigning semantic information (21, 22, 23) to at least one image area (20) in the at least two images (100), the semantic unit (3) having an artificial neural network (3), the assignment of semantic information to different image areas being carried out by means of the artificial neural network (3),
    - a selection unit (5) for selecting, on the basis of the assigned semantic information, at least one partial area (24) in one or more image areas of the at least two images which is suitable for determining the distance traveled, the at least one partial area (24) selected being one that shows a ground plane in the images (102),
    - a flow unit (4, 6) for determining the optical flow (32) on the basis of the at least one selected partial area (24) in the at least two images, and
    - a path measuring unit (7) for determining the distance traveled (30) on the basis of the determined optical flow, the distance traveled (30) being determined on the basis of a height (43) of the non-contact sensor device (2, 2a) above the ground plane (42), of the determined optical flow (32) and of the focal length (40a) of the non-contact sensor device (2, 2a).
  6. System according to claim 5, wherein the non-contact sensor device (2, 2a) comprises a near-field digital camera.
  7. System according to either of claims 5 and 6, wherein the semantic unit (3) comprises a convolutional neural network.
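
For illustration only, and not forming part of the claims: the determination recited in the last step of claims 1 and 5 corresponds to the intercept-theorem relation between the optical flow (32) measured in the image plane and the displacement on the ground plane (42), under the simplifying assumption of a camera looking perpendicularly at the ground plane. The numbers below are invented example values, not taken from the patent.

```latex
% d: distance traveled per image pair, \Delta u: optical flow on the ground plane in pixels,
% h: height (43) of the sensor device above the ground plane (42), f: focal length (40a) in pixels.
d = \frac{\Delta u \cdot h}{f}
  = \frac{12\,\text{px} \cdot 0.5\,\text{m}}{800\,\text{px}}
  \approx 7.5\,\text{mm per image pair (invented example values)}
```
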
EP19179365.2A 2018-07-10 2019-06-11 Method for measuring the distance traveled and system for measuring the distance traveled Active EP3594624B1 (fr)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
DE102018211329.7A DE102018211329A1 (de) 2018-07-10 2018-07-10 Method for measuring the distance traveled and system for measuring the distance traveled

Publications (4)

Publication Number Publication Date
EP3594624A2 (fr) 2020-01-15
EP3594624A3 (fr) 2020-04-29
EP3594624B1 (fr) 2023-11-29
EP3594624C0 (fr) 2023-11-29

Family

ID=66821053

Family Applications (1)

Application Number Title Priority Date Filing Date
EP19179365.2A 2018-07-10 2019-06-11 Active EP3594624B1 (fr) Method for measuring the distance traveled and system for measuring the distance traveled

Country Status (2)

Country Link
EP (1) EP3594624B1 (fr)
DE (1) DE102018211329A1 (fr)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114018215B (zh) * 2022-01-04 2022-04-12 智道网联科技(北京)有限公司 Monocular distance measurement method, apparatus, device and storage medium based on semantic segmentation

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102004060402A1 (de) * 2004-12-14 2006-07-13 Adc Automotive Distance Control Systems Gmbh Method and device for determining a vehicle speed
GB0602448D0 (en) * 2006-02-07 2006-03-22 Shenton Richard System For Train Speed, Position And Integrity Measurement
FR2976355B1 (fr) * 2011-06-09 2013-06-21 Jean Luc Desbordes Device for measuring the speed and position of a vehicle moving along a guide track, and corresponding method and computer program product
DE102012219569A1 (de) * 2012-10-25 2014-04-30 Robert Bosch Gmbh Updating a stored circumference of a wheel
KR102094506B1 (ko) * 2013-10-14 2020-03-27 삼성전자주식회사 Method for measuring changes in the distance between a camera and a subject using a subject-tracking technique, computer-readable storage medium recording the method, and apparatus for measuring distance changes
EP3040726A1 (fr) * 2014-12-29 2016-07-06 General Electric Company Procédé et système pour déterminer la vitesse d'un véhicule
WO2017015947A1 (fr) * 2015-07-30 2017-02-02 Xiaogang Wang Système et procédé pour un suivi d'objet
US10108864B2 (en) * 2015-12-29 2018-10-23 Texas Instruments Incorporated Stationary-vehicle structure from motion
DE102017108255A1 (de) * 2016-04-19 2017-10-19 GM Global Technology Operations LLC Parallel detection of primitives in a scene using a surround camera system
DE102016223435A1 (de) * 2016-11-25 2018-05-30 Siemens Aktiengesellschaft Distance and speed measurement with the aid of image recordings

Also Published As

Publication number Publication date
DE102018211329A1 (de) 2020-01-16
EP3594624A3 (fr) 2020-04-29
EP3594624C0 (fr) 2023-11-29
EP3594624A2 (fr) 2020-01-15

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION HAS BEEN PUBLISHED

AK Designated contracting states

Kind code of ref document: A2

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

PUAL Search report despatched

Free format text: ORIGINAL CODE: 0009013

AK Designated contracting states

Kind code of ref document: A3

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

RAP1 Party data changed (applicant data changed or rights of an application transferred)

Owner name: ROBERT BOSCH GMBH

RIC1 Information provided on ipc code assigned before grant

Ipc: G06T 7/246 20170101ALI20200321BHEP

Ipc: G01P 3/38 20060101ALI20200321BHEP

Ipc: G01C 22/00 20060101AFI20200321BHEP

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20201029

RBV Designated contracting states (corrected)

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

17Q First examination report despatched

Effective date: 20210614

GRAP Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOSNIGR1

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: GRANT OF PATENT IS INTENDED

INTG Intention to grant announced

Effective date: 20230620

GRAS Grant fee paid

Free format text: ORIGINAL CODE: EPIDOSNIGR3

GRAA (expected) grant

Free format text: ORIGINAL CODE: 0009210

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE PATENT HAS BEEN GRANTED

AK Designated contracting states

Kind code of ref document: B1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

REG Reference to a national code

Ref country code: GB

Ref legal event code: FG4D

Free format text: NOT ENGLISH

REG Reference to a national code

Ref country code: CH

Ref legal event code: EP

REG Reference to a national code

Ref country code: DE

Ref legal event code: R096

Ref document number: 502019010011

Country of ref document: DE

REG Reference to a national code

Ref country code: IE

Ref legal event code: FG4D

Free format text: LANGUAGE OF EP DOCUMENT: GERMAN

U01 Request for unitary effect filed

Effective date: 20231207

U07 Unitary effect registered

Designated state(s): AT BE BG DE DK EE FI FR IT LT LU LV MT NL PT SE SI

Effective date: 20231213

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: GR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20240301

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: IS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20240329

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: ES

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20231129
