EP3673418A1 - Method and device for efficiently determining output signals of a machine learning system
- Publication number
- EP3673418A1 (application EP18753121.5A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- sequence
- signals
- layer
- signal
- input
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G06V20/58—Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1602—Programme controls characterised by the control system, structure, architecture
- B25J9/161—Hardware, e.g. neural networks, fuzzy logic, interfaces, processor
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1694—Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
- B25J9/1697—Vision controlled systems
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/21—Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
- G06F18/217—Validation; Performance evaluation; Active pattern learning techniques
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/044—Recurrent networks, e.g. Hopfield networks
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/044—Recurrent networks, e.g. Hopfield networks
- G06N3/0442—Recurrent networks, e.g. Hopfield networks characterised by memory or gating, e.g. long short-term memory [LSTM] or gated recurrent units [GRU]
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/045—Combinations of networks
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/0499—Feedforward networks
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/82—Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
Definitions
- the invention relates to a method for efficiently determining output signals of a machine learning system, a measuring system and an actuator control system in which the method is used, a computer program, and a machine-readable storage medium.
- the neural network comprises input units, hidden units and output units, an output of each of said input units being connected to an input of each of the hidden units, and an output of each of the hidden units being connected to an input of each of the output units.
- the neural network generates an output signal at each of the output units indicating whether a vehicle has been detected in a detection zone.
- the method with the features of independent claim 1 has the advantage that it can be executed in parallel particularly efficiently.
- the invention, in a first aspect, relates to a method for efficiently determining the output signals of a sequence of output signals by means of a sequence of layers of a machine learning system, in particular of a neural network, from a sequence of input signals, wherein the input signals of the sequence of input signals are fed to the neural network successively in a sequence of discrete time steps, and wherein at each of the discrete time steps the signals applied within the network are each propagated one layer of the sequence of layers further.
- that is, new input signals are already being supplied to the neural network at a given time, even while the previous input signals are still being propagated through the neural network.
- This method has the advantage that the arithmetic operations necessary for the propagation through the respective layers can be parallelized in a particularly memory-efficient manner.
- initialization of the neural network is particularly easy since it only needs to be done once, not again for each of the input signals.
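The pipelined propagation described above can be illustrated with a small sketch (hypothetical toy layers standing in for the actual layer transformations, not the claimed implementation): at every discrete time step each layer advances the signal applied at its input by exactly one layer, so a new input signal can enter the network while earlier input signals are still in flight.

```python
# Sketch of pipelined layer propagation: at each discrete time step,
# every layer propagates the signal currently applied at its input by
# exactly one layer. Toy arithmetic stands in for the layer transforms.
def pipeline(layers, inputs, init=0.0):
    """layers: list of callables; inputs: sequence of input signals."""
    n = len(layers)
    registers = [init] * n                     # signal applied at each layer's input
    outputs = []
    padded = list(inputs) + [init] * (n - 1)   # extra steps to flush the pipe
    for t, x in enumerate(padded):
        registers[0] = x                       # next input signal enters the network
        produced = [f(r) for f, r in zip(layers, registers)]
        registers = [init] + produced[:-1]     # each signal moves one layer forward
        if t >= n - 1:                         # first output after the pipe is full
            outputs.append(produced[-1])
    return outputs

layers = [lambda z: z + 1, lambda z: z * 2, lambda z: z - 3]
print(pipeline(layers, [1, 2, 3]))  # -> [1, 3, 5]
```

The result equals applying the three layers sequentially to each input (e.g. ((1 + 1) * 2) - 3 = 1), but a new input signal is accepted at every time step instead of only after a full pass through the network.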
- Input signals and / or output signals can, as usual, be one-dimensional or multidimensional variables.
- the method is therefore executed on a computer having a plurality of arithmetic units, for example a plurality of independently operable processors or arithmetic cores.
- each layer of the sequence of layers determines its respective output depending on the input signals applied to it, and passes that output to each of the subsequent layers which in turn receive that output as an input.
- Such a method can be parallelized in a particularly memory-efficient manner.
- a signal applied at an input is in particular a signal that was applied at a preceding, in particular the immediately preceding, time step.
- a fixed number of clock cycles can be predetermined for each layer, within which the propagation of signals through this layer is reliably completed; after this predeterminable number of clock cycles a newly propagated signal is present at the layer's output, where it remains applied for the predeterminable number of clock cycles.
- the machine learning system comprises at least one recurrent layer.
- in a recurrent layer, signals applied to its inputs that are derived from different input signals of the sequence of input signals mix at its output. A sequence of successive input signals, such as the images of a video sequence, can therefore be analyzed particularly easily in this way.
- the computation steps associated with different layers are assigned to different arithmetic units. In this way, a partial or complete parallelization of the arithmetic operations in the propagation of the signals is achieved.
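This assignment can be sketched as follows (threads stand in here for the independent processors or arithmetic cores, and the function names are illustrative): within one time step each layer reads only its own input register, so the per-layer computations are independent and can be launched concurrently.

```python
from concurrent.futures import ThreadPoolExecutor

# Within one time step, each layer reads only the signal applied at its
# own input, so the per-layer computations can run on separate workers.
def parallel_step(layers, registers, pool):
    futures = [pool.submit(f, r) for f, r in zip(layers, registers)]
    return [fut.result() for fut in futures]

layers = [lambda z: z + 1, lambda z: z * 2]
with ThreadPoolExecutor(max_workers=len(layers)) as pool:
    print(parallel_step(layers, [10, 10], pool))  # -> [11, 20]
```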
- one of the aforementioned methods can be used if the input signals of the sequence of input signals were determined by means of a sensor, in particular an imaging sensor.
- an actuator is actuated depending on the sequence of output signals.
- the invention relates to a measuring system comprising the sensor, in particular the imaging sensor, by means of which the sequence of input signals is determined, as well as a single processor or a plurality of processors and at least one machine-readable storage medium on which instructions are stored which, when executed on the processors, cause the measuring system to perform one of the aforementioned methods.
- the invention relates to an actuator control system for driving the actuator, comprising a single processor or a plurality of processors and at least one machine-readable storage medium having stored thereon instructions which, when executed on the processors, cause the actuator control system to determine the sequence of output signals with one of the aforementioned methods and to control the actuator depending on the sequence of output signals.
- the invention relates to a computer program which is adapted to carry out one of the aforementioned methods when executed on a computer, and to a machine-readable storage medium on which this computer program is stored (this storage medium may of course be spatially distributed, e.g. distributed over several computers in the case of parallel execution).
- Figure 1 schematically shows possible applications of the invention.
- Figure 2 shows by way of example a neural network in whose implementation the invention can be used.
- Figure 3 schematically shows a signal propagation according to the prior art through the neural network shown in Figure 2.
- Figure 4 schematically shows a signal propagation according to an embodiment of the invention through the neural network shown in Figure 2.
- Figure 5 schematically shows a signal propagation according to a further embodiment of the invention through a further neural network.
- Figure 6 is a flow chart of an embodiment of the invention.

Description of the embodiments
- Figure 1 shows an actuator 10 in its environment 20 in interaction with an actuator control system 40.
- Actuator 10 and environment 20 are collectively referred to below as the actuator system.
- a state of the actuator system is detected by a sensor 30, which may also be provided by a plurality of sensors.
- the actuator control system 40 receives a sequence of sensor signals S.
- the actuator control system 40 determines therefrom a sequence of drive signals A, which the actuator 10 receives.
- the actuator 10 may, for example, be a (partially) autonomous robot, for example a (partially) autonomous motor vehicle.
- the sensor 30 may be, for example, one or more video sensors and / or one or more radar sensors and / or one or more ultrasonic sensors and / or one or more position sensors (for example GPS). Alternatively or additionally, the sensor 30 may also include an information system that determines information about a state of the actuator system, such as a weather information system that determines a current or future state of the weather in the environment 20.
- the robot may be a manufacturing robot and the sensor 30 may then be, for example, an optical sensor that detects characteristics of manufacturing products of the manufacturing robot.
- the actuator 10 may be a release system configured to enable or disable the activity of a device.
- the sensor 30 may be an optical sensor (for example, for capturing image or video data) configured to detect a face.
- the actuator 10 determines a release signal depending on the sequence of drive signals A; the device can then be enabled or disabled depending on the value of this release signal.
- the device may be, for example, a physical or logical access control. Depending on the value of the drive signal A, the access control can then provide that access is granted or not.
- the actuator control system 40 receives the sequence of sensor signals S of the sensor in an optional receiving unit 50, which converts the sequence of sensor signals S into a sequence of input signals x (alternatively, the sensor signal S can also be taken over as the input signal x).
- the input signal x may be, for example, a section or a further processing of the sensor signal S.
- the input signal x is fed to a machine learning system 60, for example a neural network.
- the first machine learning system 60 determines output signals y from the input signals x.
- the output signals y are fed to an optional conversion unit 80 which determines therefrom activation signals A, which are supplied to the actuator 10.
- a measuring system 41 differs from the actuator control system 40 only in that the optional conversion unit 80 does not determine a drive signal A. It can, for example, store or display the output signal y, for example as a visual or audible signal.
- the actuator control system 40 and / or the measuring system 41 comprises the sensor 30. In still further embodiments, the actuator control system 40 alternatively or additionally also comprises the actuator 10.
- the actuator control system 40 and/or the measurement system 41 comprises a single processor or multiple processors 45 and at least one machine-readable storage medium 46 on which instructions are stored which, when executed on the processors 45, cause the actuator control system 40 and/or the measuring system 41 to carry out the method according to the invention.
- Figure 2 illustrates the structure of a layered machine learning system 60, which is given in the embodiment by a neural network.
- the input signal x is fed to an input layer 100, which determines therefrom, in particular by means of a linear or non-linear transformation, a signal z1 that is supplied to a hidden layer 200.
- the hidden layer 200 thereby determines a further signal z2, which is optionally supplied to further subsequent hidden layers that, analogously to the hidden layer 200, determine further signals zi.
- the last of these signals zn is fed to an output layer 300, which determines the output signal y from it.
- the hidden layer 200 can be formed recurrently by means of the optional further signal m2.
- the successive processing of the input signal x layer by layer is also known as propagation of the input signal x through the neural network 60.
- Figure 3 shows a time sequence of the signal propagation through the neural network 60.
- the neural network is supplied with a sequence of input signals x0, x1, ..., for example successive recordings of the sensor 30. At time t0, the input layer 100 is supplied with the input signal x0, which is propagated through the layers 100, 200, 300; successively the signals z1,0, z2,0 and a first output signal y0 of a sequence of output signals y0, y1, ... are determined.
- the hidden layer 200 further determines the further signal m2.
- the initialization of the further signal m2 before it is first supplied to the hidden layer 200 is indicated by the dashed box.
- the next input signal x1 is fed to the input layer 100 at the next time t1 and propagated analogously to the input signal x0 through the subsequent layers until the next output signal y1 of the sequence of output signals is determined.
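A recurrent hidden layer of this kind can be sketched as follows (the additive mixing rule is a made-up illustration, not the patent's transformation): besides its input signal z, the layer carries a further signal m that is initialized once before the first input arrives and fed back at every subsequent time step.

```python
# Sketch of a recurrent hidden layer: a further signal m is initialized
# once and then carried from one time step to the next, so outputs for
# successive input signals mix.
class RecurrentLayer:
    def __init__(self, m_init=0.0):
        self.m = m_init          # initialization before the first input

    def __call__(self, z):
        out = z + self.m         # hypothetical mixing rule, for illustration
        self.m = out             # further signal fed back at the next step
        return out

layer = RecurrentLayer()
print([layer(z) for z in [1, 2, 3]])  # -> [1, 3, 6]
```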
- Figure 4 shows a time sequence of the signal propagation according to an embodiment of the invention through the neural network 60 shown in Figure 2. Shown are times t0, t1, t2, t3 of a preferably equidistant time grid. At each of the times t0, t1, t2, t3 of the equidistant time grid, the input layer 100 is supplied with the respective next input signal from the sequence of input signals x0, x1, x2, x3. The hidden layer 200 and the output layer 300 are initialized at time t0, which is indicated by the dashed line.
- the computation steps necessary in the layers 100, 200, 300 are parallelized, i.e. they run simultaneously on different arithmetic units, such as different processors or arithmetic cores.
- a further parallelization within the respective layers is possible in each case.
- FIG. 5 shows a time sequence of signal propagation through a pure feedforward network with layers 101, 201, 301, 401, 501, 601.
- the signal propagation is analogous to that shown in Figure 4. A number of clock cycles is specified for each of the layers, indicating how many time slices between successive time steps are to be reserved for carrying out the calculation of the respective propagation through the respective layer; for the layers 101, 201, 401, 501, 601 this number equals one, for layer 301 it equals two.
- the propagation of the input signal x0 reaches the third hidden layer 301.
- at the third time step, the layers 101, 201, 401, 501, 601, to which the value one is assigned and which therefore propagate at every time step, each propagate the signals applied at their inputs one layer further.
- the third hidden layer 301 suspends the further propagation at this point in time, and propagates the signal present at its input only at the time step after next (corresponding to the number 2 assigned to it).
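The per-layer clock counts of this example can be sketched as a simple schedule (the modulo rule is an assumed scheduling policy for illustration): a layer with count one starts a propagation at every time step, while a layer with count two starts one only every second step.

```python
# Sketch of per-layer clock counts: a layer with count c starts a new
# propagation only every c-th time step.
def scheduled_layers(clocks, t):
    """Return the indices of layers that start a propagation at step t."""
    return [i for i, c in enumerate(clocks) if t % c == 0]

clocks = [1, 1, 2, 1]            # the third layer needs two clock cycles
for t in range(4):
    print(t, scheduled_layers(clocks, t))
```

Steps 0 and 2 activate all four layers, while steps 1 and 3 skip the third one, which is still busy finishing its two-cycle propagation.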
- FIG. 6 shows a flowchart for an embodiment of the invention.
- the predeterminable number of clock cycles is determined for each of the layers, for example read from a file, and all layers except the input layer 100 are initialized.
- the next time t0, t1, t2, ... is determined, and by means of the determined predeterminable numbers those layers are determined through which the signals present at their respective inputs are to be propagated at that time, i.e. whose corresponding calculation begins at this time.
- each layer is assigned those layers whose outputs are connected to the inputs of the layer in question.
- the signal applied at the respective input is in each case selected as the signal of the connected preceding layer that was last marked as "completely determined".
- the next input signal of the sequence of input signals is applied at this time.
- the propagation is initiated for each of the layers determined in step 1010.
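The flowchart's loop can be condensed into a sketch (a strong simplification under assumed semantics: each layer finishes within its clock count, and a layer always reads the last completely determined output of its preceding layer):

```python
# Sketch of the flowchart: read a clock count per layer, initialize all
# layer outputs, then at each time step start the propagation for every
# layer that is due and feed it the last completely determined output
# of its preceding layer.
def run(layer_fns, clocks, inputs, init=0.0):
    n = len(layer_fns)
    done = [init] * n            # last completely determined output per layer
    outputs = []
    for t, x in enumerate(inputs):
        prev = [x] + done[:-1]   # signal applied at each layer's input
        for i in range(n):
            if t % clocks[i] == 0:           # this layer is due at time t
                done[i] = layer_fns[i](prev[i])
        outputs.append(done[-1])
    return outputs

print(run([lambda z: z + 1, lambda z: z * 2], [1, 1], [1, 2, 3]))  # -> [0, 4, 6]
```

With a clock count of two on the second layer, that layer updates only at every second step, mirroring the behavior described for layer 301 above.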
- the method can be implemented in software and stored, for example, on a memory present in the actuator control system 40 or measuring system 41. It can also be implemented in hardware, or in a hybrid of software and hardware.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- Evolutionary Computation (AREA)
- General Physics & Mathematics (AREA)
- Artificial Intelligence (AREA)
- Software Systems (AREA)
- Health & Medical Sciences (AREA)
- Computing Systems (AREA)
- Data Mining & Analysis (AREA)
- General Health & Medical Sciences (AREA)
- Mathematical Physics (AREA)
- General Engineering & Computer Science (AREA)
- Life Sciences & Earth Sciences (AREA)
- Molecular Biology (AREA)
- Computational Linguistics (AREA)
- Biomedical Technology (AREA)
- Biophysics (AREA)
- Multimedia (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Automation & Control Theory (AREA)
- Mechanical Engineering (AREA)
- Robotics (AREA)
- Medical Informatics (AREA)
- Databases & Information Systems (AREA)
- Fuzzy Systems (AREA)
- Evolutionary Biology (AREA)
- Bioinformatics & Computational Biology (AREA)
- Bioinformatics & Cheminformatics (AREA)
- Feedback Control In General (AREA)
- Image Analysis (AREA)
Applications Claiming Priority (3)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| DE102017214524 | 2017-08-21 | ||
| DE102018209316.4A DE102018209316A1 (de) | 2017-08-21 | 2018-06-12 | Verfahren und Vorrichtung zum effizienten Ermitteln von Ausgangssignalen eines maschinellen Lernsystems |
| PCT/EP2018/071044 WO2019038050A1 (de) | 2017-08-21 | 2018-08-02 | Verfahren und vorrichtung zum effizienten ermitteln von ausgangssignalen eines maschinellen lernsystems |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| EP3673418A1 true EP3673418A1 (de) | 2020-07-01 |
Family
ID=65235270
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| EP18753121.5A Withdrawn EP3673418A1 (de) | 2017-08-21 | 2018-08-02 | Verfahren und vorrichtung zum effizienten ermitteln von ausgangssignalen eines maschinellen lernsystems |
Country Status (5)
| Country | Link |
|---|---|
| US (1) | US11524409B2 (de) |
| EP (1) | EP3673418A1 (de) |
| CN (1) | CN110998609A (de) |
| DE (1) | DE102018209316A1 (de) |
| WO (1) | WO2019038050A1 (de) |
Family Cites Families (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US544848A (en) | 1895-08-20 | Centrifugal butter-worker | ||
| US9014850B2 (en) * | 2012-01-13 | 2015-04-21 | Toyota Motor Engineering & Manufacturing North America, Inc. | Methods and computer-program products for evaluating grasp patterns, and robots incorporating the same |
| US9696699B2 (en) * | 2012-11-15 | 2017-07-04 | Cybomedical, Inc. | Self-organizing sensing and actuation for automatic control |
| CN104833852A (zh) * | 2015-05-11 | 2015-08-12 | 重庆大学 | 一种基于人工神经网络的电力系统谐波信号估计测量方法 |
| EP3568782A1 (de) * | 2017-01-13 | 2019-11-20 | Massachusetts Institute Of Technology | Auf maschinellem lernen basierendes antikörperdesign |
| CN106951960A (zh) * | 2017-03-02 | 2017-07-14 | 平顶山学院 | 一种神经网络及该神经网络的学习方法 |
-
2018
- 2018-06-12 DE DE102018209316.4A patent/DE102018209316A1/de not_active Ceased
- 2018-08-02 WO PCT/EP2018/071044 patent/WO2019038050A1/de not_active Ceased
- 2018-08-02 US US16/639,766 patent/US11524409B2/en active Active
- 2018-08-02 EP EP18753121.5A patent/EP3673418A1/de not_active Withdrawn
- 2018-08-02 CN CN201880054012.9A patent/CN110998609A/zh active Pending
Non-Patent Citations (3)
| Title |
|---|
| HUIMIN LI ET AL: "A high performance FPGA-based accelerator for large-scale convolutional neural networks", 2016 26TH INTERNATIONAL CONFERENCE ON FIELD PROGRAMMABLE LOGIC AND APPLICATIONS (FPL), EPFL, 29 August 2016 (2016-08-29), pages 1 - 9, XP032971527, DOI: 10.1109/FPL.2016.7577308 * |
| See also references of WO2019038050A1 * |
| ZHEN LIN ET AL: "A hybrid architecture for efficient FPGA-based implementation of multilayer neural network", CIRCUITS AND SYSTEMS (APCCAS), 2010 IEEE ASIA PACIFIC CONFERENCE ON, IEEE, 6 December 2010 (2010-12-06), pages 616 - 619, XP031875897, ISBN: 978-1-4244-7454-7, DOI: 10.1109/APCCAS.2010.5774961 * |
Also Published As
| Publication number | Publication date |
|---|---|
| WO2019038050A1 (de) | 2019-02-28 |
| US11524409B2 (en) | 2022-12-13 |
| DE102018209316A1 (de) | 2019-02-21 |
| CN110998609A (zh) | 2020-04-10 |
| US20200206939A1 (en) | 2020-07-02 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| DE202018104373U1 (de) | Vorrichtung, die zum Betreiben eines maschinellen Lernsystems eingerichtet ist | |
| EP4000010B1 (de) | Vorrichtung und computerimplementiertes verfahren für die verarbeitung digitaler sensordaten | |
| EP3746850B1 (de) | Verfahren zum ermitteln eines zeitlichen verlaufs einer messgrösse, prognosesystem, aktorsteuerungssystem, verfahren zum trainieren des aktorsteuerungssystems, trainingssystem, computerprogramm und maschinenlesbares speichermedium | |
| EP3701434A1 (de) | Verfahren und vorrichtung zum automatischen erzeugen eines künstlichen neuronalen netzes | |
| DE102016211554A1 (de) | Verfahren und Vorrichtung zur Gestaltung eines Produktionsprozesses zum Produzieren eines aus mehreren Teilprodukten zusammengesetzten Produkts | |
| EP3953865B1 (de) | Verfahren, vorrichtung und computerprogramm zum betreiben eines tiefen neuronalen netzes | |
| EP3782081B1 (de) | Verfahren zur erzeugung eines testdatensatzes, verfahren zum testen, verfahren zum betreiben eines systems, vorrichtung, steuerungssystem, computerprogrammprodukt, computerlesbares medium, erzeugung und verwendung | |
| EP3179372A1 (de) | Verfahren und vorrichtung zum testen einer mehrzahl von steuereinheiten einer technischen einheit | |
| EP3785178B1 (de) | Verfahren und vorrichtung zum ermitteln einer netzkonfiguration eines neuronalen netzes | |
| DE102016001034A1 (de) | Vorrichtung und Verfahren zum Steuern einer elektronischen Feststellbremse | |
| EP3673418A1 (de) | Verfahren und vorrichtung zum effizienten ermitteln von ausgangssignalen eines maschinellen lernsystems | |
| DE102020205962B3 (de) | Vorrichtung und Verfahren zum Betreiben eines Prüfstands | |
| EP3786853A1 (de) | Komprimieren eines tiefen neuronalen netzes | |
| DE102018122115A1 (de) | Verfahren zur Umgebungserfassung eines Fahrzeugs | |
| DE102017213771A1 (de) | Verfahren und Vorrichtung zum Ermitteln von Anomalien in einem Kommunikationsnetzwerk | |
| DE102017214610B4 (de) | Verfahren zum Überprüfen von zumindest einer Fahrzeugfunktion sowie Prüfvorrichtung | |
| DE102017218773A1 (de) | Verfahren und Vorrichtung zum Ansteuern eines Aktors | |
| DE102021125498A1 (de) | Systemvalidierung mit verbesserter Handhabung von Protokollierungsdaten | |
| DE102017218143A1 (de) | Verfahren und Vorrichtung zum Ansteuern eines fahrzeugelektronischen Planungsmodules | |
| WO2022063663A1 (de) | Verfahren, datenverarbeitungsmodul und datenverarbeitungsnetzwerk zur verarbeitung von daten | |
| DE102020205964B3 (de) | Vorrichtung und Verfahren zum Betreiben eines Prüfstands | |
| DE102023102152A1 (de) | Erzeugen eines statischen Bildes und Führen eines Fahrzeugs | |
| DE102020209985A1 (de) | Vorrichtung und Verfahren zum Ermitteln einer Umfeldinformation | |
| DE102023201104A1 (de) | Verfahren zum Erzeugen von zusätzlichen Trainingsdaten zum Trainieren eines Algorithmus des maschinellen Lernens | |
| DE102020201984A1 (de) | Verfahren zum Freigeben einer Fahrt eines Kraftfahrzeugs |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: UNKNOWN |
| | STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE |
| | PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase | Free format text: ORIGINAL CODE: 0009012 |
| | STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE |
| | 17P | Request for examination filed | Effective date: 20200323 |
| | AK | Designated contracting states | Kind code of ref document: A1. Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
| | AX | Request for extension of the european patent | Extension state: BA ME |
| | DAV | Request for validation of the european patent (deleted) | |
| | DAX | Request for extension of the european patent (deleted) | |
| | STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: EXAMINATION IS IN PROGRESS |
| | 17Q | First examination report despatched | Effective date: 20210111 |
| | 18D | Application deemed to be withdrawn | Effective date: 20240301 |