EP3891998A1 - Method and device for automatically evaluating and providing video signals of an event - Google Patents
Method and device for automatically evaluating and providing video signals of an event
- Publication number
- EP3891998A1 (application EP19821249.0A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- camera
- video signals
- data
- time
- meta
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Links
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/80—Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
- H04N21/83—Generation or processing of protective or descriptive data associated with content; Content structuring
- H04N21/84—Generation or processing of descriptive data, e.g. content descriptors
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/76—Television signal recording
- H04N5/765—Interface circuits between an apparatus for recording and another apparatus
- H04N5/77—Interface circuits between an apparatus for recording and another apparatus between a recording apparatus and a television camera
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/70—Information retrieval; Database structures therefor; File system structures therefor of video data
- G06F16/78—Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/10—Terrestrial scenes
- G06V20/17—Terrestrial scenes taken from planes or by drones
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/40—Scenes; Scene-specific elements in video content
- G06V20/44—Event detection
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/20—Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
- H04N21/21—Server components or server architectures
- H04N21/218—Source of audio or video content, e.g. local disk arrays
- H04N21/21805—Source of audio or video content, e.g. local disk arrays enabling multiple viewpoints, e.g. using a plurality of cameras
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
- H04N7/183—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a single remote source
- H04N7/185—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a single remote source from a mobile camera, e.g. for remote control
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V2201/00—Indexing scheme relating to image or video recognition or understanding
- G06V2201/10—Recognition assisted with metadata
Definitions
- the invention relates to a method and a device for the automatic evaluation and provision of video signals of an event.
- The invention relates in particular to sports events such as ball games, races, competitions or the like, in which cameras are used to record events at certain, possibly changing, locations, such as ball positions, shots on goal, overtaking manoeuvres, finishes or the like.
- Cameras are usually used to record what is happening at different locations, from different positions and from different angles, and all video signals recorded by the cameras are captured.
- the object of the invention is achieved by a generic method with the following steps: the video signals assigned to the time signal, the camera parameters, the metadata and/or the provided part of the video signals are stored on a data carrier.
- the camera parameters being at least one parameter of the group: position of the camera, acceleration of the camera, orientation of the camera, camera angle, magnetic field, field of view, air pressure, volume, brightness, time, current power consumption; acquisition of metadata by means of at least one meta sensor and automatic assignment of the time signal(s) to the metadata, the metadata being at least one parameter of the group: geographic data, object positions, broadcast data, object-specific data, statistics, databases, local volume, user-defined parameters;
- transmission of the video signals, camera parameters and metadata assigned to the time signals to a data processing device; automatic evaluation of the video signals assigned to the time signals depending on the camera parameters assigned to the time signal, on the metadata assigned to the time signal and on user input; and provision of at least part of the video signals depending on the evaluation.
- the stated object is further achieved by a generic device with data acquisition devices, such as a camera for recording video signals, at least one camera sensor for recording local camera parameters, at least one meta sensor for recording metadata and a data processing device for receiving the video signals, camera parameters and metadata assigned to a time signal, for evaluating the video signals and for providing at least some of the video signals, wherein the camera, the camera sensor and the meta sensor are connected to the data processing device and the camera sensor is connected to the camera.
- the invention is based on the basic consideration that, as an alternative to the known image processing algorithms, the video signals are evaluated as a function of the camera parameters and as a function of the meta data.
- the video signals, the camera parameters and the meta data can be clearly assigned to one another and synchronized. This in particular enables automatic evaluation of the video signals associated with the time signal on the basis of the camera parameters and the meta data, so that the part of the video signals of interest can be made available to a user.
- the time signal in the sense of the invention is, for example, a time indication, in particular the time at the location of the camera, measured in milliseconds.
- the time signal can be designed as a stopwatch and have an incremental counter.
- the time signal is preferably designed such that each frame of the video signal can be assigned a unique time value. In this respect, the time signal has the function of a unique time stamp.
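The role of the time signal as a per-frame time stamp can be sketched as follows; this is an illustrative sketch, not the patented implementation, and the start time and frame rate are assumed values:

```python
def frame_timestamp_ms(start_ms: int, frame_index: int, fps: float = 50.0) -> int:
    """Return a unique millisecond timestamp for one video frame.

    start_ms: global start time of the recording in milliseconds
    frame_index: 0-based index of the frame within the video signal
    fps: frames per second of the camera (assumed constant here)
    """
    return start_ms + round(frame_index * 1000.0 / fps)

# At 50 fps, consecutive frames lie 20 ms apart, so every frame
# receives a distinct time value usable as a unique time stamp.
stamps = [frame_timestamp_ms(1_700_000_000_000, i) for i in range(3)]
```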
- Camera parameters in the sense of the invention are parameters which characterize properties of the assigned camera, for example the currently set camera angle, the inclination and the position of the camera, the latter being measurable by means of a GPS sensor.
- At least one camera sensor is provided, which can be designed, for example, as a gyroscope to record the camera angle and/or as an (electronic) compass to record the camera orientation.
- the camera parameters are preferably recorded simultaneously with the video signals and assigned to them.
- meta data are, in particular, parameters of the event.
- in a football game, metadata are, for example, the current position of the ball and/or the current positions of the players, which can be recorded using common tracking methods.
- Current scores are also meta data.
- in a race, metadata are, for example, the current positions of the drivers or vehicles.
- the meta data is determined, for example, using common tracking methods, interfaces and / or using GPS sensors.
- metadata can also be broadcast data that provide information as to whether a certain part of the video signal was broadcast, for example, as part of a television broadcast.
- Metadata in the sense of the invention are also user-defined parameters such as player and/or vehicle names; individual statistics and other information about players and/or vehicles can be imported from databases, for example the goal rate of a football player.
- the volume information measured by a microphone, in particular one spatially separated from the camera, is metadata in the sense of the invention.
- for example, the volume information of a fan section in a football stadium can be recorded as metadata in the sense of the invention.
- the invention includes that metadata can also be captured via devices or paths other than a camera, and that metadata, video and audio signals are processed by a central data processing device. The metadata are therefore preferably processed by a central data processing device that is independent of the local recording devices, such as cameras.
- the data of all data sources are thus tied to the common global time and accordingly to one another, i.e. locked relative to one another. This ensures the temporal assignment or linking of all data, namely metadata as well as video and audio signals.
- the video signals, the camera parameters and the meta data can be transmitted to the data processing device by means of cables and / or wirelessly, in which case the transmission can be carried out, for example, using WLAN, Bluetooth and / or radio.
- a user input for the evaluation of the video signals according to the invention is, for example, a query for a combination of several criteria, for example the query at what time a particular vehicle was captured by a particular camera.
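Such a combined query can be pictured as a filter over the synchronized records; the record layout, camera names and object names below are purely illustrative assumptions:

```python
# Each record joins, via the shared time signal, which camera was
# recording (camera parameter) with which objects it captured (metadata).
records = [
    {"t_ms": 1000, "camera": "cam1", "visible_objects": {"car7", "car3"}},
    {"t_ms": 1040, "camera": "cam1", "visible_objects": {"car3"}},
    {"t_ms": 1040, "camera": "cam2", "visible_objects": {"car7"}},
]

def times_object_seen_by(records, camera: str, obj: str) -> list:
    """Return all time signals at which `camera` captured `obj`."""
    return [r["t_ms"] for r in records
            if r["camera"] == camera and obj in r["visible_objects"]]

# "At what times was vehicle car7 captured by camera cam1?"
hits = times_object_seen_by(records, "cam1", "car7")
```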
- the video signals are recorded using a single camera, the position and camera angle of which are changed during the event and recorded as camera parameters.
- volume information of a spectator area is recorded as metadata by means of several external microphones, a high volume indicating an important event, for example a shot on goal. If the special moments of an event (highlights) are requested by user input, those parts of the video signal in which the volume in the metadata is significantly increased are provided in response.
- the acquisition of the time signal, the video signals, the camera parameters and the metadata is synchronized in time, so that the assignment of the video signals, the camera parameters and the metadata to the time signal is simplified.
- camera parameters and meta data can be recorded whenever a single image of the video signals is recorded.
- the video signals of several cameras can be recorded in synchronized time.
- the time signal can be assigned to the video signals simultaneously with the recording of the video signals. Analogously, this can apply to the assignment of the time signal to the camera parameters and / or to the meta data.
- the time signal, the video signals and the camera parameters are recorded over the entire duration of the event in order to be able to access, during the automatic evaluation, the complete data record generated over the entire event.
- the meta data are preferably additionally recorded over the entire duration of the event.
- the various data sources preferably work with a global time source and thus a global time such as GPS time, NTP (Network Time Protocol) or PTP (Precision Time Protocol), so that the metadata can be temporally linked with image, video or audio signals centrally, without processing in an acquisition device.
- All data are provided with the common global time.
- the data from all data sources are thus tied to the common global time and accordingly to one another, i.e. locked relative to one another. This ensures the temporal assignment or linking of all data, namely metadata as well as video and audio signals.
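Because every sample carries the common global time, streams from different sources can be merged purely by time stamp. A minimal sketch of such time-locking, with an assumed pairing tolerance:

```python
def lock_streams(video_frames, meta_samples, tolerance_ms=20):
    """Pair each video frame with the metadata sample closest to it
    on the shared global time axis, if within `tolerance_ms`.

    video_frames: list of (t_ms, frame_id) tuples, sorted by time
    meta_samples: non-empty list of (t_ms, value) tuples, sorted by time
    """
    pairs = []
    for t_frame, frame_id in video_frames:
        # Nearest metadata sample in global time.
        t_meta, value = min(meta_samples, key=lambda s: abs(s[0] - t_frame))
        if abs(t_meta - t_frame) <= tolerance_ms:
            pairs.append((frame_id, value))
    return pairs

frames = [(0, "f0"), (20, "f1"), (40, "f2")]
meta = [(5, 0.7), (38, 0.9)]
paired = lock_streams(frames, meta)
```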
- the metadata can be acquired only if a parameter of the metadata falls below and/or exceeds a user-defined limit. This avoids the accumulation of too much unused data.
- for example, a local volume can be recorded only if the sound level is above a user-defined limit.
- a high volume may indicate a significant event, such as a foul or overtaking.
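The threshold-gated acquisition described above amounts to a simple filter at the source; the limit values below are illustrative user-defined assumptions:

```python
def gate_samples(samples, lower=None, upper=None):
    """Keep only samples whose value exceeds `upper` and/or falls
    below `lower`, discarding unremarkable data at acquisition time.

    samples: iterable of (t_ms, value) pairs
    """
    kept = []
    for t_ms, value in samples:
        if (upper is not None and value > upper) or \
           (lower is not None and value < lower):
            kept.append((t_ms, value))
    return kept

# Record the local volume only when the sound level exceeds an
# assumed user-defined limit of 80 dB.
loud = gate_samples([(0, 62.0), (100, 85.5), (200, 91.0)], upper=80.0)
```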
- the steps of evaluating and providing the part of the video signals take place during the event, so that the part of the video signals desired by the user can be made available during the event, in particular continuously.
- in the case of a user request for the highlights of the event, it is determined at which time signals an increased volume was detected in a fan area, which indicates a significant highlight. By determining the time signals of all highlights, the corresponding parts of the video signals can be made available to the user before the end of the event.
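Answering a highlight request then reduces to locating the time signals with significantly raised volume and deriving video windows around them. A sketch under assumed threshold and padding values:

```python
def highlight_windows(volume_samples, threshold, pad_ms=2000):
    """Return (start_ms, end_ms) windows around every time signal at
    which the measured volume exceeds `threshold`.

    volume_samples: list of (t_ms, volume) pairs, sorted by time
    """
    windows = []
    for t_ms, vol in volume_samples:
        if vol > threshold:
            start, end = t_ms - pad_ms, t_ms + pad_ms
            # Merge with the previous window if the two overlap.
            if windows and start <= windows[-1][1]:
                windows[-1] = (windows[-1][0], end)
            else:
                windows.append((start, end))
    return windows

samples = [(10_000, 60.0), (11_000, 95.0), (12_000, 96.0), (30_000, 94.0)]
clips = highlight_windows(samples, threshold=90.0)
```

The resulting windows would then be used to cut the corresponding parts out of the time-stamped video signals.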
- Further meta data are preferably generated when the video signals are evaluated.
- object or person-related statistics can be created or supplemented. This step can be carried out automatically, so that the newly created statistics can be available again as meta data when evaluating the video signals.
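Deriving new metadata from an evaluation can be as simple as accumulating per-person screen time from the evaluated frames; the names and the frame duration below are illustrative assumptions:

```python
from collections import Counter

def screen_time_ms(evaluated_frames, frame_ms=20):
    """Accumulate visible screen time per object across frames.

    evaluated_frames: iterable of sets of object names visible per frame
    frame_ms: duration represented by one frame (20 ms at 50 fps)
    """
    totals = Counter()
    for visible in evaluated_frames:
        for name in visible:
            totals[name] += frame_ms
    return dict(totals)

# The resulting statistic is itself metadata for later evaluations.
stats = screen_time_ms([{"runner1"}, {"runner1", "runner2"}, {"runner2"}])
```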
- the video signals, the camera parameters, the meta data and / or the part of the video signals provided can be stored on a data carrier, preferably in the form of a database, so that archiving and / or later evaluation is possible.
- the meta sensor can be provided spatially separated from the camera sensor.
- each camera is preferably assigned a camera sensor, which is integrated in particular with the camera assigned to it.
- at least one camera is arranged on a flying object, in particular on a drone, so that the camera can be moved quickly and easily.
- Fig. 1 The device according to the invention in a schematic representation
- Fig. 2 is a flowchart of the invention
- Fig. 1 shows a schematic sketch of a running route 10, for example a running route 10 for a middle-distance run, on which runners (not shown in Fig. 1) run along the running route 10 while being filmed by two cameras 11, 12 at the edge of the running route 10. Fig. 1 shows a first camera 11 and a second camera 12, which are arranged at different positions at the edge of the running route 10. During the run, the first camera 11 records a first video signal 15 and the second camera 12 records a second video signal 16, as outlined in the flowchart in Fig. 2.
- Both cameras 11, 12 are each provided with an integrated camera sensor 13, 14, the first camera sensor 13 being connected to the first camera 11 and the second camera sensor 14 to the second camera 12.
- the first camera sensor 13 detects local camera parameters of the first camera 11 during the run.
- local camera parameters are the geographical position of the camera, its orientation and its camera angle.
- the geographic position of the camera is measured with a GPS sensor, the orientation with an electronic compass and the camera angle with an electronic gyroscope in combination with a software interface to the camera.
- the GPS sensor, the electronic compass, the electronic gyroscope and the software interface are integrally formed as the first camera sensor 13, which outputs the recorded camera parameters via a further interface.
- the second camera sensor 14 detects the local camera parameters of the second camera 12 during the run.
- the first camera 11 has a first camera angle 17, which in particular captures a curve 18 of the running route 10 and is larger than a second camera angle 19 of the second camera 12.
- the second camera angle 19 of the second camera 12 is aligned with a target area 20 of the running route 10.
- the camera angle 17, 19 denotes the geographical area that is captured by the camera 11, 12. Since the second camera angle 19 is smaller than the first camera angle 17, an enlarged image is obtained in order to be able to better judge which of the runners is the first to cross the target area 20.
- the camera angles 17, 19 of the cameras 11, 12 can change over time and are continuously recorded as camera parameters by the camera sensors 13, 14 assigned to the cameras 11, 12.
- the runners, not shown in Fig. 1, each carry a meta sensor 21 in the form of a GPS sensor in order to record the geographic positions of the runners on the running route 10 at any time during the run.
- a GPS sensor 21 is arranged in the right-hand area of FIG. 1.
- a second meta sensor 22 in the form of a microphone is arranged on the left-hand side of FIG. 1 in order to measure the volume of the spectators during the run.
- the cameras 11, 12, the camera sensors 13, 14 and the meta sensors 21, 22 are each connected to a data processing device 23, the connections being represented in Fig. 1 by connecting lines 24; alternatively, they could also be wireless.
- Fig. 2 shows a schematic flow diagram.
- from the start signal, a continuous time signal from a global time system such as GPS time, NTP or PTP is recorded in every data acquisition device, such as every camera, in particular locally; the time signal preferably indicates the time in milliseconds and consequently serves as a uniform timestamp.
- the two cameras 11, 12 of Fig. 1 each record continuous video signals 15, 16 during the run.
- Each frame of the video signals 15, 16 is automatically assigned the respective time signal. This process step is marked on the upper left side of FIG. 2 as B.
- camera parameters such as, in particular, the positions, orientations and camera angles of the two cameras 11, 12 are recorded by the camera sensors 13, 14 and are automatically assigned the corresponding global time signal (section C).
- the GPS sensor 21 continuously detects the current position of the runner assigned to it and the microphone 22 detects the current volume of the audience. Both meta data are automatically assigned the current time signal when they are captured by the meta sensors 21, 22 (section D).
- in the next process step E, the video signals, camera parameters and metadata associated with the time signal are transmitted to the data processing device 23 during the event.
- the runner's trainer is interested in the performance during the run and therefore, in a next method step F, makes a user input to the data processing device 23 by requesting video signals in which this particular runner can be seen.
- this user input is registered in the data processing device 23, so that the recorded video signals 15, 16 are analyzed as to whether the runner can be seen in them.
- for the first camera 11, this is the case, for example, when the geographic position of the runner, which is continuously detected by the GPS sensor 21, is covered by the first camera angle 17. In this case, the data processing device 23 provides only the part of the first video signal 15 in which the runner can be seen. The automatic evaluation of the second video signals 16 from the second camera 12 proceeds analogously. The evaluation of the video signals 15, 16 takes place during the event and simultaneously for all video signals 15, 16. In a last method step H of Fig. 2, the data processing device 23 provides the user with the desired parts of the video signals 15, 16 in which the runner can be seen.
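The coverage test used in this example, namely whether the runner's GPS position falls within the first camera angle 17, can be approximated in two dimensions by comparing the bearing from camera to runner with the camera orientation. This is a geometric sketch on a local flat coordinate frame; all positions and angles are illustrative assumptions:

```python
import math

def in_camera_angle(cam_xy, cam_orientation_deg, cam_angle_deg, target_xy):
    """Return True if `target_xy` lies within the horizontal camera angle.

    cam_xy, target_xy: positions in a local metric coordinate frame
    cam_orientation_deg: compass direction the camera is facing (0 = north)
    cam_angle_deg: full horizontal opening angle of the camera
    """
    dx = target_xy[0] - cam_xy[0]
    dy = target_xy[1] - cam_xy[1]
    bearing = math.degrees(math.atan2(dx, dy)) % 360  # 0 deg = north
    # Smallest signed difference between bearing and orientation.
    diff = (bearing - cam_orientation_deg + 180) % 360 - 180
    return abs(diff) <= cam_angle_deg / 2

# Camera at the origin facing north with a 60-degree opening angle:
# a target almost straight ahead is covered, one far to the side is not.
ahead = in_camera_angle((0, 0), 0, 60, (10, 100))
aside = in_camera_angle((0, 0), 0, 60, (100, 10))
```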
- the user input is a request from a broadcaster for the highlights of the race.
- This user input is interpreted by the data processing device 23 in such a way that time signals are searched in which the microphone 22 at the edge of the running route 10 has detected significantly high volumes as meta data. This indicates a particularly significant event.
- after the data processing device 23 has determined the time signals at which high volumes were measured, the first video signals 15 of the first camera 11 assigned to these time signals are determined, since the first camera 11 is arranged closest to the microphone 22.
- the rest of the evaluation and the provision of the desired part of the video signals 15, 16 are carried out analogously to the previous example. In this way, the user is provided with the highlights of the event.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Databases & Information Systems (AREA)
- Remote Sensing (AREA)
- Library & Information Science (AREA)
- Data Mining & Analysis (AREA)
- General Engineering & Computer Science (AREA)
- Television Signal Processing For Recording (AREA)
- Studio Devices (AREA)
Abstract
Description
Claims
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE102018009571.2A DE102018009571A1 (en) | 2018-12-05 | 2018-12-05 | Method and device for the automatic evaluation and provision of video signals of an event |
PCT/EP2019/000332 WO2020114623A1 (en) | 2018-12-05 | 2019-12-04 | Method and device for automatically evaluating and providing video signals of an event |
Publications (1)
Publication Number | Publication Date |
---|---|
EP3891998A1 true EP3891998A1 (en) | 2021-10-13 |
Family
ID=68916468
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP19821249.0A Withdrawn EP3891998A1 (en) | 2018-12-05 | 2019-12-04 | Method and device for automatically evaluating and providing video signals of an event |
Country Status (4)
Country | Link |
---|---|
US (1) | US11689691B2 (en) |
EP (1) | EP3891998A1 (en) |
DE (1) | DE102018009571A1 (en) |
WO (1) | WO2020114623A1 (en) |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE102019216419B4 (en) | 2019-10-24 | 2024-06-20 | Carl Zeiss Industrielle Messtechnik Gmbh | Sensor arrangement for detecting workpieces and method for operating such a sensor arrangement |
US20220017095A1 (en) * | 2020-07-14 | 2022-01-20 | Ford Global Technologies, Llc | Vehicle-based data acquisition |
Family Cites Families (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3762149B2 (en) * | 1998-07-31 | 2006-04-05 | キヤノン株式会社 | Camera control system, camera server, camera server control method, camera control method, and computer-readable recording medium |
US6748158B1 (en) * | 1999-02-01 | 2004-06-08 | Grass Valley (U.S.) Inc. | Method for classifying and searching video databases based on 3-D camera motion |
GB0029893D0 (en) * | 2000-12-07 | 2001-01-24 | Sony Uk Ltd | Video information retrieval |
US7133070B2 (en) * | 2001-09-20 | 2006-11-07 | Eastman Kodak Company | System and method for deciding when to correct image-specific defects based on camera, scene, display and demographic data |
JP2004194159A (en) * | 2002-12-13 | 2004-07-08 | Canon Inc | Video communication system |
WO2008046243A1 (en) | 2006-10-16 | 2008-04-24 | Thomson Licensing | Method and device for encoding a data stream, method and device for decoding a data stream, video indexing system and image retrieval system |
US20100007730A1 (en) * | 2008-07-09 | 2010-01-14 | Lin Meng-Te | Surveillance Display Apparatus, Surveillance System, and Control Method Thereof |
KR20110132884A (en) | 2010-06-03 | 2011-12-09 | 한국전자통신연구원 | Apparatus for intelligent video information retrieval supporting multi channel video indexing and retrieval, and method thereof |
WO2015162548A1 (en) | 2014-04-22 | 2015-10-29 | Batchu Krishnaiahsetty Sumana | An electronic system and method for marking highlights in a multimedia file and manipulating the multimedia file using the highlights |
US10074013B2 (en) * | 2014-07-23 | 2018-09-11 | Gopro, Inc. | Scene and activity identification in video summary generation |
WO2016029170A1 (en) * | 2014-08-22 | 2016-02-25 | Cape Productions Inc. | Methods and apparatus for automatic editing of video recorded by an unmanned aerial vehicle |
US9313556B1 (en) * | 2015-09-14 | 2016-04-12 | Logitech Europe S.A. | User interface for video summaries |
CN108287924A (en) | 2018-02-28 | 2018-07-17 | 福建师范大学 | One kind can the acquisition of positioning video data and organizing search method |
-
2018
- 2018-12-05 DE DE102018009571.2A patent/DE102018009571A1/en active Pending
-
2019
- 2019-12-04 WO PCT/EP2019/000332 patent/WO2020114623A1/en unknown
- 2019-12-04 US US17/298,176 patent/US11689691B2/en active Active
- 2019-12-04 EP EP19821249.0A patent/EP3891998A1/en not_active Withdrawn
Also Published As
Publication number | Publication date |
---|---|
WO2020114623A1 (en) | 2020-06-11 |
US11689691B2 (en) | 2023-06-27 |
DE102018009571A1 (en) | 2020-06-10 |
US20220103779A1 (en) | 2022-03-31 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
DE60216693T2 (en) | Device for distributing video and device for receiving video | |
EP1864153B1 (en) | Object-tracking and situation-analysis system | |
DE60213913T2 (en) | System and method of content presentation | |
EP3891998A1 (en) | Method and device for automatically evaluating and providing video signals of an event | |
EP2044573A1 (en) | Monitoring camera, method for calibrating the monitoring camera, and use of the monitoring camera | |
DE102006006667A1 (en) | Sports competition e.g. marathon walk, result determining method, involves checking combination of characteristics based on preset criteria using cameras, when frames are plausible based on preset criteria | |
DE102009020997A1 (en) | Method for recording and processing journey data of vehicle, involves determining position and orientation of vehicle by satellite supported positioning system, where stereo camera is installed in vehicle for recording journey images | |
DE10029463A1 (en) | Position and/or movement detection device uses evaluation of signals provided by several transmitters detecting electromagnetic or sonar waves provided by transmitter attached to object | |
DE102014224120A1 (en) | Output audio contributions for a vehicle | |
DE60123786T2 (en) | Method and system for automatic production of video sequences | |
EP0973445B1 (en) | Lameness diagnosis | |
DE102019203614A1 (en) | Apparatus and method for displaying event information detected from video data | |
DE102008026657A1 (en) | Method for imaged representation of three dimensional acoustic objects as measuring object, involves bringing images in relation to acoustic reference image of measuring object immediately or at time point | |
DE102020213288A1 (en) | Display device for a video surveillance system, video surveillance system and method | |
DE102013103557A1 (en) | Media scene rendering system and method and their recording media | |
DE102017123068A1 (en) | System for synchronizing audio or video recordings | |
DE112019004282T5 (en) | Information processing apparatus, information processing method and program | |
CH708459B1 (en) | A method for recording and reproducing the movements of an athlete. | |
EP3843419B1 (en) | Method for controlling a microphone array and device for controlling a microphone array | |
EP1434184B1 (en) | Control of a multicamera system | |
WO2002030053A1 (en) | Method and system for transmitting information between a server and a mobile customer | |
DE102021110268A1 (en) | Method and system for scene-synchronous selection and playback of audio sequences for a motor vehicle | |
EP3389805A1 (en) | Method and system for live determining of a sports device | |
EP0583441B1 (en) | Device for measuring time, especially sporting times | |
DE102007054088A1 (en) | Method and device for image processing, in particular image measurement |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: UNKNOWN |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE |
|
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase |
Free format text: ORIGINAL CODE: 0009012 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE |
|
17P | Request for examination filed |
Effective date: 20210614 |
|
AK | Designated contracting states |
Kind code of ref document: A1 Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
DAV | Request for validation of the european patent (deleted) | ||
DAX | Request for extension of the european patent (deleted) | ||
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: EXAMINATION IS IN PROGRESS |
|
17Q | First examination report despatched |
Effective date: 20221114 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN |
|
18D | Application deemed to be withdrawn |
Effective date: 20230525 |