EP4309141A1 - Procédé, programme informatique, unité de commande et véhicule automobile pour réaliser une fonction de conduite automatisée - Google Patents
Procédé, programme informatique, unité de commande et véhicule automobile pour réaliser une fonction de conduite automatisée
Info
- Publication number
- EP4309141A1 EP4309141A1 EP22706531.5A EP22706531A EP4309141A1 EP 4309141 A1 EP4309141 A1 EP 4309141A1 EP 22706531 A EP22706531 A EP 22706531A EP 4309141 A1 EP4309141 A1 EP 4309141A1
- Authority
- EP
- European Patent Office
- Prior art keywords
- motor vehicle
- nODD_i
- computer
- object classes
- control unit
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Links
- 238000000034 method Methods 0.000 title claims abstract description 47
- 238000004590 computer program Methods 0.000 title claims description 20
- 238000001514 detection method Methods 0.000 claims abstract description 32
- 238000013528 artificial neural network Methods 0.000 claims description 19
- 230000007613 environmental effect Effects 0.000 claims description 18
- 230000006870 function Effects 0.000 description 15
- 241001465754 Metazoa Species 0.000 description 8
- 238000011161 development Methods 0.000 description 7
- 230000018109 developmental process Effects 0.000 description 7
- 230000008569 process Effects 0.000 description 6
- 238000012545 processing Methods 0.000 description 5
- 230000006399 behavior Effects 0.000 description 3
- 238000004891 communication Methods 0.000 description 3
- 238000007796 conventional method Methods 0.000 description 3
- 230000002787 reinforcement Effects 0.000 description 3
- 241000282994 Cervidae Species 0.000 description 2
- 230000001133 acceleration Effects 0.000 description 2
- 230000009471 action Effects 0.000 description 2
- 238000013473 artificial intelligence Methods 0.000 description 2
- 238000013527 convolutional neural network Methods 0.000 description 2
- 230000001419 dependent effect Effects 0.000 description 2
- 238000013461 design Methods 0.000 description 2
- 238000010801 machine learning Methods 0.000 description 2
- 238000012549 training Methods 0.000 description 2
- 238000009825 accumulation Methods 0.000 description 1
- 230000001413 cellular effect Effects 0.000 description 1
- 238000010276 construction Methods 0.000 description 1
- 238000011156 evaluation Methods 0.000 description 1
- 230000003203 everyday effect Effects 0.000 description 1
- 230000004927 fusion Effects 0.000 description 1
- 239000011159 matrix material Substances 0.000 description 1
- 238000013439 planning Methods 0.000 description 1
- 238000012913 prioritisation Methods 0.000 description 1
- 230000009467 reduction Effects 0.000 description 1
- 239000004576 sand Substances 0.000 description 1
- 230000001932 seasonal effect Effects 0.000 description 1
- 238000004088 simulation Methods 0.000 description 1
- 230000003068 static effect Effects 0.000 description 1
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/82—Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G06V20/58—Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V30/00—Character recognition; Recognising digital ink; Document-oriented image-based pattern recognition
- G06V30/10—Character recognition
- G06V30/19—Recognition using electronic means
- G06V30/191—Design or setup of recognition systems or techniques; Extraction of features in feature space; Clustering techniques; Blind source separation
- G06V30/19113—Selection of pattern recognition techniques, e.g. of classifiers in a multi-classifier system
Definitions
- a computer-implemented method for performing an automated driving function, a computer program product, a control unit and a motor vehicle are described here.
- the first semi-automated vehicles (corresponds to SAE Level 2 according to SAE J3016) have reached series maturity in recent years.
- a sensor fusion system including a sensor system for providing environmental status information and a convolutional neural network (CNN) is provided.
- the CNN includes a receiving interface configured to receive the environmental state information from the sensor system, a common convolutional layer configured to extract traffic information from the received environmental state information, and a plurality of fully connected layers configured to recognize objects belonging to different object classes based on the extracted traffic information, wherein the object classes include at least one of a road feature class, a static object class and a dynamic object class.
- DNNs deep neural networks
- the task therefore arises of further developing computer-implemented methods for carrying out an automated driving function, computer program products, control units and motor vehicles of the type mentioned at the outset in such a way that greater computing efficiency can be ensured compared to conventional methods.
- the object is achieved by a computer-implemented method for performing an automated driving function according to claim 1, a computer program product according to the independent claim 10, a control unit according to the independent claim 12 and a motor vehicle according to the independent claim 13. Further refinements and developments are the subject of the dependent claims.
- a computer-implemented method for performing an automated driving function of an autonomous or semi-autonomous motor vehicle is described below using at least one algorithm, the algorithm being executed in at least one control unit, the control unit intervening in aggregates of the motor vehicle on the basis of input data, the algorithm having a trained neural network which is designed for object recognition, with a detection algorithm being provided which uses an object class library of different object classes (nODD_i, nADD_i) for recognizing objects in the area surrounding the motor vehicle, wherein a) a number of base object classes (nODD_i) is selected, the number of base object classes (nODD_i) being smaller than the total number of object classes (nODD_i, nADD_i) in the object class library; b) at least one input signal with information about a section of road ahead of the motor vehicle is received, at least one additional object class (nADD_i) being determined from the information, the at least one additional object class (nADD_i) being used in addition to the base object classes (nODD_i) to detect objects in the surroundings of the motor vehicle.
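The claimed selection steps a) and b) can be sketched as follows; this is a minimal illustration with invented class names and a hypothetical mapping from road-ahead information to classes, not the patented implementation:

```python
# All class names and info keys below are invented examples for illustration.
OBJECT_CLASS_LIBRARY = {
    "vehicles", "lane_markings", "traffic_signs",  # plausible base classes (nODD_i)
    "people", "animals", "rail_vehicles",          # plausible additional classes (nADD_i)
}

def select_base_classes():
    """Step a): select base classes, strictly fewer than the whole library."""
    base = {"vehicles", "lane_markings", "traffic_signs"}
    assert len(base) < len(OBJECT_CLASS_LIBRARY)
    return base

def additional_classes_from(info):
    """Step b): derive additional classes from information about the road ahead."""
    mapping = {"people_on_road": "people", "animals_on_road": "animals"}
    return {mapping[i] for i in info if i in mapping}

active_classes = select_base_classes() | additional_classes_from({"animals_on_road"})
```

Detection then runs only against `active_classes` instead of the full library, which is where the claimed efficiency gain would come from.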
- the control unit can be a separate control unit of the motor vehicle or part of a control unit with additional functions not described here. For reasons of operational safety, a corresponding control unit can have a discrete design. Provision can also be made for the control unit to be configured redundantly or for the method described here to be executed redundantly on one or more computing units.
- the detection algorithm can be part of the algorithm executing the method or can be designed separately. This makes it possible to carry out the detection algorithm entirely or partially in a decentralized manner, e.g. in the firmware of a camera used accordingly.
- current data relating to the environment can be considered as input data, for example data obtained by means of various sensors.
- sensors can include, for example, cameras, radar, lidar and/or ultrasonic sensors, but also other sensors such as position sensors, e.g. GPS, magnetic field detecting sensors and the like.
- Route planning data can, for example, be obtained from a navigation destination, and possibly traffic data that determine the traffic flow on the route.
- Further data can be communication data, for example, obtained from car-to-car or car-to-infrastructure systems, e.g. on traffic light phases or similar.
- the information could thus be obtained that there are people on the road in a specific area ahead.
- This causes at least one additional object class, e.g. people and animals, to be loaded and used in the object recognition.
- the relevant object data, which can be recorded via the motor vehicle sensor system, is used as input data for the detection algorithm.
- the detection algorithm has a deep neural network.
- the self-learning neural network can be based on various learning principles, in particular it can use methods of reinforcement learning.
- Reinforcement learning stands for a set of methods of machine learning in which the neural network autonomously learns a strategy in order to maximize the rewards received.
- the neural network is not told which action is the best in which situation, but receives a reward at certain points in time, which can also be negative. Using these rewards, it approximates a utility function that describes the value of a particular state or action.
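The reward-driven approximation of a utility function can be illustrated with a toy temporal-difference update; the states, rewards and step sizes below are invented and stand in for the learned value estimates described above:

```python
# Toy tabular value update: rewards (possibly negative) shape an approximated
# utility function V over states. All states and numbers are invented.
def td_update(V, state, reward, next_state, alpha=0.1, gamma=0.9):
    """Move the estimated value of `state` toward reward + gamma * V(next_state)."""
    target = reward + gamma * V.get(next_state, 0.0)
    V[state] = V.get(state, 0.0) + alpha * (target - V.get(state, 0.0))
    return V

V = {}
td_update(V, "tailgating", reward=-1.0, next_state="safe_distance")    # penalty
td_update(V, "safe_distance", reward=1.0, next_state="safe_distance")  # reward
```

After these two updates the penalized state has a negative estimated value and the rewarded state a positive one, which is exactly the "value of a particular state" the text refers to.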
- In addition to an input layer and an output layer, deep neural networks have at least one hidden layer and are able to analyze and evaluate complex situations.
- nODD_i the number of base object classes
- Such driving situations can be route-, time-of-day-, season- and region-specific. For example, it may be appropriate to keep an "animals" object class always loaded when driving at night, since there are frequent deer crossings at night.
- Another example is routes that are crossed by railway lines.
- the corresponding object class will not be used on other routes, so it is not necessary to process the input data there using a “rail vehicles” object class.
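The situational selection described in these examples can be sketched as a simple rule table; the route types, rules and class names are illustrative assumptions, not the patented selection logic:

```python
# Sketch of route- and time-dependent selection of base object classes.
def base_classes_for(route_type, night=False):
    classes = {"vehicles", "lane_markings"}
    if route_type == "urban":
        classes.add("people")          # pedestrians are frequent in inner cities
    if route_type == "level_crossing":
        classes.add("rail_vehicles")   # only needed where railway lines cross
    if night:
        classes.add("animals")         # deer crossings are frequent at night
    return classes
```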
- nODD_i base object classes
- nADD_i additional object class
- the at least one additional object class (nADD_i) is deselected when the motor vehicle leaves the section of the route.
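Selecting an additional class on entering a route section and deselecting it on leaving could, under invented names and a hypothetical API, look like:

```python
# Sketch of the lifecycle of additional object classes over a route section.
class ActiveObjectClasses:
    def __init__(self, base):
        self.base = set(base)
        self.additional = set()

    def enter_section(self, extra_classes):
        self.additional |= set(extra_classes)

    def leave_section(self):
        self.additional.clear()  # revert to the base classes only

    @property
    def classes(self):
        return self.base | self.additional

active = ActiveObjectClasses({"vehicles"})
active.enter_section({"animals"})
active.leave_section()
```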
- nODD_i base object classes
- in inner-city areas, the probability of people crossing the lane is greater than on freeways, so it makes sense to always use the object class "people" there.
- the detection algorithm uses detection thresholds for the object classes (nODD_i, nADD_i), with at least one detection threshold for the at least one additional object class determined from the information being lowered compared to a standard value.
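Lowering the threshold for information-derived additional classes could look like this minimal sketch; the numeric values and class names are assumptions:

```python
# Each active class gets a detection threshold; additional classes derived from
# traffic information get a lowered (more sensitive) one. Values are invented.
DEFAULT_THRESHOLD = 0.5

def detection_thresholds(base_classes, additional_classes, lowered=0.3):
    thresholds = {c: DEFAULT_THRESHOLD for c in base_classes}
    thresholds.update({c: lowered for c in additional_classes})
    return thresholds

th = detection_thresholds({"vehicles", "lane_markings"}, {"animals"})
```

A lower threshold means a weaker detector response already counts as a detection, trading more false positives for fewer missed hazards in the announced section.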
- the input signal is transmitted by a receiving system, the information being traffic information and/or the input signal containing a sensor signal from a motor vehicle sensor.
- current traffic information can be processed in the first case, for example information relating to people or animals or other objects on a roadway.
- environmental information can alternatively or additionally be obtained from motor vehicle sensors; for example, signs pointing out objects, such as temporary construction site signs or seasonal deer crossing signs, can be recognized by cameras.
- the input signal contains map data of a map of the surroundings.
- Such map data can come from a map stored in the motor vehicle or a map received by means of long-distance communication, the map being enriched with additional information, for example regions with strong changes in the landscape or regions in which other obstacles, e.g. quarries, moraines or sand drifts, can occasionally occur.
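Such an enriched-map lookup could be sketched as follows; the roads, sections and hazard entries are invented examples of the kind of enrichment described:

```python
# Hypothetical enriched map: (road, section) -> additional hazard classes.
ENRICHED_MAP = {
    ("A1", "km_120_130"): ["sand_drifts"],
    ("B7", "km_10_15"): ["quarry_traffic"],
}

def additional_classes_from_map(road, section):
    """Return additional object classes annotated for a route section, if any."""
    return ENRICHED_MAP.get((road, section), [])
```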
- a first independent subject relates to a device for performing an automated driving function of an autonomous or semi-autonomous motor vehicle using at least one algorithm, the algorithm being executed in at least one control unit, the control unit intervening in aggregates of the motor vehicle on the basis of input data, the algorithm having a trained neural network which is designed for object recognition, with a detection algorithm being provided which uses an object class library of different object classes (nODD_i, nADD_i) to recognize objects in the area surrounding the motor vehicle, the device being set up to: a) select a number of base object classes (nODD_i), the number of base object classes (nODD_i) being smaller than the total number of object classes (nODD_i, nADD_i) in the object class library; b) receive at least one input signal with information about a route section in front of the motor vehicle by means of an input and determine at least one additional object class (nADD_i) from the information, the at least one additional object class (nADD_i) being used, in addition to the base object classes (nODD_i), to recognize objects in the area surrounding the motor vehicle.
- the device is set up to select the number of base object classes (nODD_i) depending on the driving situation.
- the device is set up, if at least one of the objects in the area cannot be recognized using the base object classes (nODD_i) and the at least one additional object class (nADD_i), to draw on other known object classes from the object class library.
- the device is set up to deselect the at least one additional object class (nADD_i) when the motor vehicle leaves the route section.
- the device is set up to make the selection of the base object classes (nODD_i) depending on the type of route traveled.
- the detection algorithm uses detection thresholds for the object classes (nODD_i, nADD_i), the device being set up to lower at least one detection threshold for the at least one additional object class determined from the information compared to a standard value.
- the input signal is transmitted by a receiving system, the information being traffic information and/or the input signal containing a sensor signal from a motor vehicle sensor.
- the input signal contains map data of a map of the surroundings.
- Another independent subject relates to a computer program product with a non-transitory, computer-readable storage medium on which instructions are embedded which, when executed by at least one processing unit, cause the at least one processing unit to be set up to perform the method of the aforementioned type.
- the method can be distributed on one or more computing units, so that certain method steps are executed on one computing unit and other method steps are executed on at least one other computing unit, with calculated data being able to be transmitted between the computing units if necessary.
- the processing unit can be part of the control unit.
- the instructions comprise the computer program product module of the type described above.
- Another independent subject relates to a control unit with a non-transitory, computer-readable storage medium, a computer program product of the type described above being stored on the storage medium.
- Another independent subject relates to a motor vehicle with a control unit of the type described above.
- the computing unit is part of the control unit.
- Provision can be made for the control unit to be networked with environmental sensors and a receiving system.
- Fig. 1 shows a motor vehicle that is set up for automated or autonomous driving;
- FIG. 2 shows a control unit of the motor vehicle from FIG. 1 ;
- Fig. 3 an environment with the motor vehicle from Fig. 1, and
- Fig. 1 shows a motor vehicle 2, which is set up for automated or autonomous driving.
- the motor vehicle 2 has a control unit 4 with a computing unit 6 and a memory 8.
- a computer program product is stored in the memory 8 and will be described in more detail below in connection with Figs. 2 and 3.
- the control unit 4 is connected on the one hand to a series of environmental sensors that allow the current position of the motor vehicle 2 and the respective traffic situation to be detected. These include environmental sensors 10, 11 at the front of the motor vehicle 2, environmental sensors 12, 13 at the rear of the motor vehicle 2, a camera 14 and a GPS module 16.
- the environmental sensors 10 to 13 can include, for example, radar, lidar and/or ultrasonic sensors.
- sensors for detecting the state of motor vehicle 2 are provided, including wheel speed sensors 16, acceleration sensors 18 and pedal sensors 20, which are connected to control unit 4.
- the current state of motor vehicle 2 can be reliably detected with the aid of this motor vehicle sensor system.
- the computing unit 6 has loaded the computer program product stored in the memory 8 and executes it. On the basis of an algorithm and the input signals, the computing unit 6 decides on the control of the motor vehicle 2, which it achieves by intervening in the steering 22, engine control 24 and brakes 26, which are each connected to the control unit 4.
- Data from the sensors 10 to 20 are continuously buffered in the memory 8, so that these environmental data are available for further evaluation, and are discarded after a predetermined period of time.
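The buffering-and-discarding behavior can be sketched as a time-limited buffer; the retention window and sample format below are assumptions for illustration:

```python
from collections import deque

# Time-limited sensor buffer: samples are kept for evaluation and discarded
# once older than a retention window. The 5 s window is an invented value.
class SensorBuffer:
    def __init__(self, retention_s=5.0):
        self.retention_s = retention_s
        self.samples = deque()  # (timestamp, data) pairs, oldest first

    def push(self, t, data):
        self.samples.append((t, data))
        # drop everything older than the retention window
        while self.samples and t - self.samples[0][0] > self.retention_s:
            self.samples.popleft()

buf = SensorBuffer(retention_s=5.0)
buf.push(0.0, "radar_frame")
buf.push(6.0, "radar_frame")  # the first sample is now older than 5 s and dropped
```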
- the algorithm was trained according to the procedure described below.
- RDS Radio Data System
- other communication paths can be used to receive relevant traffic information, for example via a car-to-infrastructure or a car-to-car network or a cellular network.
- Fig. 2 shows the control unit 4 from Fig. 1.
- the control unit 4 has the computing unit 6, in which a computer program product module 28 (framed in dashed lines) is executed, which has an algorithm 30 for executing an autonomous driving function.
- the autonomous driving function can be a traffic jam assistant, a brake assistant, a collision warning assistant or the like, for example.
- a component of the algorithm 30 is a detection algorithm 32 (framed by dashed lines), which has a neural network 34.
- the neural network 34 is a deep neural network (DNN) that has at least one hidden layer in addition to an input layer and an output layer.
- DNN deep neural network
- the self-learning neural network 34 learns using methods of reinforcement learning, i.e. in a training phase the algorithm 30 varies the neural network 34 in order to obtain rewards for behavior that improves according to one or more metrics or benchmarks, that is, for improvements in the algorithm 30.
- methods of supervised and unsupervised learning, as well as combinations of these learning methods, are also known.
- the neural network 34 can essentially be a matrix of values, usually called weights, which define a complex filter function. This filter function determines the behavior of the algorithm 30 depending on input variables, which are presently received via the environmental sensors 10 to 20, and generates control signals for controlling the motor vehicle 2.
- the computer program product module 28 can be used both in the motor vehicle 2 and outside of the motor vehicle 2. It is thus possible to train the computer program product module 28 both in a real environment and in a simulation environment.
- the detection algorithm 32 accesses an object class library 36 (framed in dashed lines) stored in the memory 8, which has a number of base object classes nODD_i (nODD_1, nODD_2, ..., nODD_i) and additional object classes nADD_i (nADD_1, nADD_2, ..., nADD_i).
- nODD_i base object classes
- nADD_1, nADD_2, ... nADD_i additional object classes
- the relevant object classes nODD_i, nADD_i are loaded situationally by the detection algorithm 32 and used for object recognition.
- the situational selection of the object classes nODD_i, nADD_i depends, among other things, on the route.
- Another input variable is traffic reports, which are received by the RDS module 27 and from which information relating to a route ahead or a route section ahead is extracted.
- Fig. 3 shows a freeway 38.
- two motor vehicles 40, 42 are driving on the freeway 38 in different lanes. There is an animal 46 on the roadway on a stretch of road 44 ahead (identified by dashed lines across the freeway 38).
- the method described here only loads and uses a selection of base object classes nODD_i suitable for freeways, which significantly speeds up object recognition compared to conventional methods in which all object classes are used. From the data obtained via the RDS module 27, the information can be isolated that the animal 46 represents a danger to traffic in the route section 44. On the basis of this information, the detection algorithm 32 can load at least one additional object class nADD_i relating to animals from the memory 8 and use it in the object recognition.
- a route for motor vehicle 2 is first retrieved.
- Route types are extracted from the route of the motor vehicle 2 and base object classes are selected on the basis of the route types. With the aid of the selected basic object classes, the detection then takes place using the detection algorithm.
- traffic information is called up and analyzed at regular intervals. If there are indications of danger in the traffic information, for example due to objects in the sections of road ahead, relevant object classes are extracted in a subsequent step.
- the additional object classes are then loaded and detection thresholds for the loaded additional object classes are lowered.
- Object recognition is then carried out using the base object classes used and the additional object classes.
- the route section can be a route section defined by the route or by the traffic information.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Multimedia (AREA)
- General Physics & Mathematics (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Physics & Mathematics (AREA)
- Evolutionary Computation (AREA)
- Databases & Information Systems (AREA)
- Medical Informatics (AREA)
- Software Systems (AREA)
- General Health & Medical Sciences (AREA)
- Computing Systems (AREA)
- Artificial Intelligence (AREA)
- Health & Medical Sciences (AREA)
- Traffic Control Systems (AREA)
Abstract
The invention relates to a method for performing an automated driving function of a motor vehicle by means of at least one algorithm, which algorithm has a detection algorithm that uses an object class library comprising various object classes to recognize objects located in the surroundings of the motor vehicle, wherein: a number of base object classes are selected and traffic information relating to a section of the road ahead of the motor vehicle is received; at least one additional object class is extracted from the traffic information; and the additional object class or classes are selected, in addition to the base object classes, for recognizing objects located in the surroundings of the motor vehicle.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE102021202526.9A DE102021202526A1 (de) | 2021-03-16 | 2021-03-16 | Computerimplementiertes Verfahren zum Durchführen einer automatisierten Fahrfunktion, Computerprogrammprodukt, Steuereinheit sowie Kraftfahrzeug |
PCT/EP2022/052454 WO2022194440A1 (fr) | 2021-03-16 | 2022-02-02 | Procédé, programme informatique, unité de commande et véhicule automobile pour réaliser une fonction de conduite automatisée |
Publications (1)
Publication Number | Publication Date |
---|---|
EP4309141A1 true EP4309141A1 (fr) | 2024-01-24 |
Family
ID=80623987
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP22706531.5A Pending EP4309141A1 (fr) | 2021-03-16 | 2022-02-02 | Procédé, programme informatique, unité de commande et véhicule automobile pour réaliser une fonction de conduite automatisée |
Country Status (3)
Country | Link |
---|---|
EP (1) | EP4309141A1 (fr) |
DE (1) | DE102021202526A1 (fr) |
WO (1) | WO2022194440A1 (fr) |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10395144B2 (en) | 2017-07-24 | 2019-08-27 | GM Global Technology Operations LLC | Deeply integrated fusion architecture for automated driving systems |
CN111133447B (zh) * | 2018-02-18 | 2024-03-19 | 辉达公司 | 适于自主驾驶的对象检测和检测置信度的方法和系统 |
DE102019213061A1 (de) * | 2019-08-29 | 2021-03-04 | Volkswagen Aktiengesellschaft | Klassifizierung von KI-Modulen |
-
2021
- 2021-03-16 DE DE102021202526.9A patent/DE102021202526A1/de active Pending
-
2022
- 2022-02-02 EP EP22706531.5A patent/EP4309141A1/fr active Pending
- 2022-02-02 WO PCT/EP2022/052454 patent/WO2022194440A1/fr active Application Filing
Also Published As
Publication number | Publication date |
---|---|
DE102021202526A1 (de) | 2022-09-22 |
WO2022194440A1 (fr) | 2022-09-22 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
EP3160813B1 (fr) | Procédé de création d'un modèle d'environnement d'un véhicule | |
DE102019104974A1 (de) | Verfahren sowie System zum Bestimmen eines Fahrmanövers | |
DE102016007899B4 (de) | Verfahren zum Betreiben einer Einrichtung zur Verkehrssituationsanalyse, Kraftfahrzeug und Datenverarbeitungseinrichtung | |
DE102016210534A1 (de) | Verfahren zum Klassifizieren einer Umgebung eines Fahrzeugs | |
EP3063732A1 (fr) | Analyse de la situation pour un système d'assistance au conducteur | |
DE102017211556A1 (de) | Verfahren zur Routenplanung für ein Kraftfahrzeug mit einem automatisierten Fahrzeugsystem und Kraftfahrzeug mit einem automatisierten Fahrzeugsystem | |
DE102020122837A1 (de) | Verfahren und vorrichtung zum erzeugen eines wendewegs in einem mehrschichtiges-lernen-basierten autonomen fahrzeug | |
DE102019003963A1 (de) | Verfahren zur Bestimmung einer Fahrstrategie eines Fahrzeuges, insbesondere eines Nutzfahrzeuges | |
EP2964503B1 (fr) | Estimation de la vitesse future et/ou distance d'un véhicule d'un point de référence et estimation de l'accélération future | |
DE102018133457B4 (de) | Verfahren und System zum Bereitstellen von Umgebungsdaten | |
DE102007015227B4 (de) | Verfahren und Anordnung zur näherungsweisen Bestimmung einer von einem Fahrzeug aktuell befahrenen Fahrspur | |
DE102017211387A1 (de) | System und Verfahren zum automatisierten Manövrieren eines Ego-Fahrzeugs | |
DE102020108508B3 (de) | Verfahren zur Bewertung von Streckenabschnitten | |
DE102016003935A1 (de) | Verfahren zur Ermittlung einer Randbebauungsinformation in einem Kraftfahrzeug und Kraftfahrzeug | |
DE102020201931A1 (de) | Verfahren zum Trainieren wenigstens eines Algorithmus für ein Steuergerät eines Kraftfahrzeugs, Verfahren zur Optimierung eines Verkehrsflusses in einer Region, Computerprogrammprodukt sowie Kraftfahrzeug | |
DE102020129802A1 (de) | Fahrzeugbetriebskennzeichnung | |
DE102005043838A1 (de) | Verfahren und Einrichtung zur automatischen Ermittlung von Fahrmanövern | |
EP4309141A1 (fr) | Procédé, programme informatique, unité de commande et véhicule automobile pour réaliser une fonction de conduite automatisée | |
DE102018218172B3 (de) | Verfahren und Vorrichtung zur Bewertung einer einem Fahrzeug in Fahrtrichtung vorausliegenden Fahrsituation zur automatisierten Umfahrung eines vorausfahrenden Verkehrshindernisses | |
DE102021000792A1 (de) | Verfahren zum Betrieb eines Fahrzeuges | |
DE102019129737A1 (de) | Verfahren zum Klassifizieren einer Umgebung eines Fahrzeugs | |
DE102019004931A1 (de) | Verfahren zum Betrieb eines Fahrzeugs | |
DE102019002598A1 (de) | Verfahren zur Führung eines autonom fahrenden Fahrzeuges auf einer schneebedeckten Straße sowie ein autonom fahrendes Fahrzeug | |
DE102022000390B3 (de) | Verfahren zum Betrieb eines zumindest teilautomatisiert steuerbaren Fahrzeugs | |
DE102017007777A1 (de) | Verfahren zum Betreiben eines Assistenzsystems für ein Kraftfahrzeug, Assistenzsystem, eingerichtet zur Durchführung eines solchen Verfahrens, und Kraftfahrzeug mit einem solchen Assistenzsystem |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: UNKNOWN |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE |
|
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase |
Free format text: ORIGINAL CODE: 0009012 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE |
|
17P | Request for examination filed |
Effective date: 20230718 |
|
AK | Designated contracting states |
Kind code of ref document: A1 Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |