EP3891998A1 - Method and device for automatically evaluating and providing video signals of an event - Google Patents

Method and device for automatically evaluating and providing video signals of an event

Info

Publication number
EP3891998A1
Authority
EP
European Patent Office
Prior art keywords
camera
video signals
data
time
meta
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP19821249.0A
Other languages
German (de)
English (en)
Inventor
Philipp Lawo
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Lawo Holding AG
Original Assignee
Lawo Holding AG
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Lawo Holding AG filed Critical Lawo Holding AG
Publication of EP3891998A1

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/80Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/83Generation or processing of protective or descriptive data associated with content; Content structuring
    • H04N21/84Generation or processing of descriptive data, e.g. content descriptors
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/76Television signal recording
    • H04N5/765Interface circuits between an apparatus for recording and another apparatus
    • H04N5/77Interface circuits between an apparatus for recording and another apparatus between a recording apparatus and a television camera
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/70Information retrieval; Database structures therefor; File system structures therefor of video data
    • G06F16/78Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/10Terrestrial scenes
    • G06V20/17Terrestrial scenes taken from planes or by drones
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/40Scenes; Scene-specific elements in video content
    • G06V20/44Event detection
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/21Server components or server architectures
    • H04N21/218Source of audio or video content, e.g. local disk arrays
    • H04N21/21805Source of audio or video content, e.g. local disk arrays enabling multiple viewpoints, e.g. using a plurality of cameras
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/18Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N7/183Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a single remote source
    • H04N7/185Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a single remote source from a mobile camera, e.g. for remote control
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V2201/00Indexing scheme relating to image or video recognition or understanding
    • G06V2201/10Recognition assisted with metadata

Definitions

  • the invention relates to a method and a device for the automatic evaluation and provision of video signals of an event.
  • Sports events such as ball games, races, competitions or the like are known, in which cameras are used to record what happens at certain, possibly changing, locations, such as ball positions, shots on goal, overtaking manoeuvres, finishes or the like.
  • Several cameras are usually used to record what is happening at different locations, from different positions and from different angles. All video signals recorded by the cameras are recorded.
  • The object of the invention is achieved by a generic method with the following steps, wherein the video signals assigned to the time signal, the camera parameters, the meta data and / or the provided part of the video signals are stored on a data carrier.
  • The camera parameters are at least one parameter of the group: position of the camera, acceleration of the camera, orientation of the camera, camera angle, magnetic field, field of view, air pressure, volume, brightness, time, current power consumption. Meta data are acquired by means of at least one meta sensor, and the time signal(s) are automatically assigned to the meta data, the meta data being at least one parameter of the group: geographic data, object positions, broadcast data, object-specific data, statistics, databases, local volume, user-defined parameters.
  • Transmission of the video signals, camera parameters and meta data assigned to the time signals to a data processing device; automatic evaluation of the video signals assigned to the time signals as a function of the camera parameters assigned to the time signal, the meta data assigned to the time signal and a user input; and provision of at least part of the video signals depending on the evaluation.
  • The stated object is further achieved by a generic device with data acquisition devices such as a camera for recording video signals, at least one camera sensor for recording local camera parameters, at least one meta sensor for recording meta data and a data processing device for receiving the video signals, camera parameters and meta data assigned to a time signal, for evaluating the video signals and for providing at least some of the video signals, the camera, the camera sensor and the meta sensor being connected to the data processing device and the camera sensor being connected to the camera.
  • the invention is based on the basic consideration that, as an alternative to the known image processing algorithms, the video signals are evaluated as a function of the camera parameters and as a function of the meta data.
  • the video signals, the camera parameters and the meta data can be clearly assigned to one another and synchronized. This in particular enables automatic evaluation of the video signals associated with the time signal on the basis of the camera parameters and the meta data, so that the part of the video signals of interest can be made available to a user.
  • the time signal in the sense of the invention is, for example, a time indication, in particular the time at the location of the camera, measured in milliseconds.
  • the time signal can be designed as a stopwatch and have an incremental counter.
  • The time signal is preferably designed such that each frame of the video signal can be assigned a unique time value. In this respect, the time signal has the function of an unambiguous time stamp.
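The idea of a unique per-frame time value can be sketched as follows (a minimal Python sketch; the frame rate, the start time and all names are illustrative assumptions, not taken from the patent):

```python
def frame_timestamps(start_ms, frame_count, fps):
    """Assign each frame a unique global time value in milliseconds.

    start_ms: global time of the first frame; fps: frame rate.
    All names here are illustrative, not prescribed by the patent.
    """
    return [start_ms + round(i * 1000 / fps) for i in range(frame_count)]

# At 50 fps, consecutive frames lie 20 ms apart, so millisecond
# resolution is enough to give every frame a distinct time stamp.
stamps = frame_timestamps(start_ms=1_575_000_000_000, frame_count=3, fps=50)
```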
  • Camera parameters in the sense of the invention are parameters which characterize properties of the assigned camera, for example the currently set camera angle, the inclination and the position of the camera, the latter being measurable by means of a GPS sensor.
  • At least one camera sensor is provided, which can be equipped, for example, to record the camera angle as a gyroscope and / or to record the camera orientation as an (electronic) compass.
  • the camera parameters are preferably assigned to them simultaneously with the recording of the video signals.
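Capturing a camera-parameter record simultaneously with a frame might look like this (illustrative Python; the field selection follows the parameter group listed above, and all names are assumptions):

```python
from dataclasses import dataclass

@dataclass
class CameraSample:
    """One camera-parameter record, captured together with a frame
    and carrying the same time signal as that frame."""
    time_ms: int             # global time signal of the frame
    position: tuple          # (lat, lon) from a GPS sensor
    orientation_deg: float   # compass heading of the optical axis
    angle_deg: float         # camera (opening) angle

# A sample as it might be emitted by a camera sensor.
sample = CameraSample(time_ms=1000, position=(49.0, 8.4),
                      orientation_deg=90.0, angle_deg=60.0)
```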
  • meta data are, in particular, parameters of the event.
  • In a football game, meta data are, for example, the current position of the ball and / or the current positions of the players, which can be recorded using common tracking methods.
  • Current scores are also meta data.
  • In a race, meta data are, for example, the current positions of the drivers or vehicles.
  • the meta data is determined, for example, using common tracking methods, interfaces and / or using GPS sensors.
  • metadata can also be broadcast data that provide information as to whether a certain part of the video signal was broadcast, for example, as part of a television broadcast.
  • Meta data in the sense of the invention are also user-defined parameters such as player and / or vehicle names; individual statistics and other information about players and / or vehicles can be imported from databases, for example the goal shot rate of a football player.
  • The volume information measured by a microphone, in particular one spatially separated from the camera, is also meta data in the sense of the invention.
  • For example, the volume of a fan stand in a football stadium can be recorded as meta data in the sense of the invention.
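Because every meta-data record carries the same global time signal as the video frames, it can later be joined against them. A minimal sketch of such a record stream (Python; schema and values are illustrative assumptions):

```python
# Each meta-data record carries the same global time signal (ms) as
# the video frames, so it can later be joined against them.
meta_stream = [
    {"time_ms": 1000, "kind": "ball_position", "value": (23.5, 40.1)},
    {"time_ms": 1000, "kind": "crowd_volume_db", "value": 71.2},
    {"time_ms": 1020, "kind": "score", "value": (1, 0)},
]

def meta_at(stream, time_ms, kind):
    """Look up a meta-data value for a given time signal and kind."""
    for rec in stream:
        if rec["time_ms"] == time_ms and rec["kind"] == kind:
            return rec["value"]
    return None
```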
  • The invention includes that meta data can also be captured via devices or paths other than a camera, and that meta data, video and audio signals are processed by a central data processing device. Preferably, therefore, a central data processing device, independent of local recording devices such as cameras, processes the meta data.
  • The data of all data sources are thus tied to the common global time and accordingly locked to one another. This ensures the temporal assignment, or connection, of all data, namely meta data and video and audio signals.
  • the video signals, the camera parameters and the meta data can be transmitted to the data processing device by means of cables and / or wirelessly, in which case the transmission can be carried out, for example, using WLAN, Bluetooth and / or radio.
  • a user input for the evaluation of the video signals according to the invention is, for example, a query for a combination of several criteria, for example the query at what time a particular vehicle was captured by a particular camera.
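Such a combined query — at which times a particular vehicle was captured by a particular camera — could be answered over per-frame evaluation results roughly as follows (Python sketch; the record schema is an assumption, not prescribed by the patent):

```python
def query_times(records, camera_id, vehicle_id):
    """Return the time signals at which `vehicle_id` was captured by
    `camera_id`. `records` holds per-frame evaluation results; the
    schema is illustrative."""
    return [r["time_ms"] for r in records
            if r["camera"] == camera_id and vehicle_id in r["visible"]]

records = [
    {"time_ms": 0,  "camera": "cam1", "visible": {"car7"}},
    {"time_ms": 20, "camera": "cam1", "visible": set()},
    {"time_ms": 20, "camera": "cam2", "visible": {"car7"}},
]
```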
  • In one embodiment, the video signals are recorded using a single camera, the position and camera angle of which are changed during the event and recorded as camera parameters.
  • In one embodiment, the volume of a spectator area is recorded as meta data by means of several external microphones, a high volume indicating an important event, for example a shot on goal. If the special moments of an event - the highlights - are requested by user input, those parts of the video signals in which the volume recorded as meta data is significantly increased are provided in response.
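The volume-based highlight selection described above can be sketched as a simple threshold filter over the volume meta data (Python; the threshold value and the schema are illustrative assumptions):

```python
def highlight_times(volume_meta, threshold_db):
    """Return time signals whose measured crowd volume exceeds the
    threshold; these mark candidate highlights such as a shot on goal.
    `volume_meta` is a list of (time_ms, volume_db) pairs."""
    return [t for t, db in volume_meta if db > threshold_db]

# Volume meta data as it might arrive from an external microphone.
volume_meta = [(0, 62.0), (20, 88.5), (40, 61.0), (60, 90.2)]
```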
  • The acquisition of the time signal, the acquisition of the video signals, the acquisition of the camera parameters and the acquisition of the meta data are synchronized in time, so that the assignment of the video signals, the camera parameters and the meta data to the time signal is simplified.
  • camera parameters and meta data can be recorded whenever a single image of the video signals is recorded.
  • the video signals of several cameras can be recorded in synchronized time.
  • the time signal can be assigned to the video signals simultaneously with the recording of the video signals. Analogously, this can apply to the assignment of the time signal to the camera parameters and / or to the meta data.
  • Preferably, the time signal is acquired and the video signals and camera parameters are recorded over the entire duration of the event, in order to be able to access the complete data record generated during the entire event in the automatic evaluation.
  • the meta data are preferably additionally recorded over the entire duration of the event.
  • The various data sources preferably work with a global time source and thus a global time such as GPS time, NTP (Network Time Protocol) or PTP (Precision Time Protocol), so that the meta data can be connected in time with image, video or audio signals centrally, without processing in an acquisition device.
  • All data are provided with the common global time.
  • The data from all data sources are thus tied to the common global time and accordingly locked to one another. This ensures the temporal assignment, or connection, of all data, namely meta data and video and audio signals.
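Locking independently captured streams to one another via the common global time then amounts to joining them on the shared time stamps, roughly as follows (Python sketch; the streams are modelled as dicts keyed by global time in milliseconds, an assumption for illustration):

```python
def lock_streams(frames, camera_params, meta):
    """Join three independently captured streams on the shared global
    time signal. Each argument is a dict keyed by time in ms."""
    common = sorted(set(frames) & set(camera_params) & set(meta))
    return [(t, frames[t], camera_params[t], meta[t]) for t in common]

# Three streams stamped with the same global time base.
frames = {0: "frame0", 20: "frame1"}
params = {0: {"angle": 60}, 20: {"angle": 55}}
meta = {0: {"volume": 60.0}, 20: {"volume": 85.0}}
```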
  • the meta data can only be acquired if a parameter of the meta data falls below and / or exceeds a user-defined limit. This avoids the accumulation of too much unused data.
  • a local volume can only be recorded if the sound level is above a user-defined limit.
  • a high volume may indicate a significant event, such as a foul or overtaking.
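The limit-based acquisition described above — keeping a meta-data sample only when it falls below and/or exceeds a user-defined limit — can be sketched like this (Python; parameter names are assumptions):

```python
def gated_capture(samples, lower=None, upper=None):
    """Keep a meta-data sample only when it falls below `lower` and/or
    exceeds `upper` (the user-defined limits), discarding the rest to
    avoid accumulating unused data. `samples` is (time_ms, value)."""
    kept = []
    for t, value in samples:
        if (lower is not None and value < lower) or \
           (upper is not None and value > upper):
            kept.append((t, value))
    return kept
```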
  • the steps of evaluating and providing the part of the video signals take place during the event, so that the part of the video signals desired by the user can be made available during the event, in particular continuously.
  • If there is a user request for the highlights of the event, it is determined at which time signals an increased volume is detected in a fan area, which indicates a significant highlight. By determining the time signals of all highlights, the corresponding parts of the video signals can be made available to the user even before the end of the event.
  • Further meta data are preferably generated when the video signals are evaluated.
  • object or person-related statistics can be created or supplemented. This step can be carried out automatically, so that the newly created statistics can be available again as meta data when evaluating the video signals.
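Deriving new meta data from an evaluation pass, for example per-person visibility counts that feed back into later evaluations, might look like this (Python sketch; the schema is an illustrative assumption):

```python
def update_statistics(stats, evaluation_results):
    """Derive new meta data (per-person visibility counts) from an
    evaluation pass; the enriched statistics can then be used as meta
    data in the next evaluation."""
    for r in evaluation_results:
        for person in r["visible"]:
            stats[person] = stats.get(person, 0) + 1
    return stats
```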
  • the video signals, the camera parameters, the meta data and / or the part of the video signals provided can be stored on a data carrier, preferably in the form of a database, so that archiving and / or later evaluation is possible.
  • the meta sensor can be provided spatially separated from the camera sensor.
  • each camera is preferably assigned a camera sensor, which is integrated in particular with the camera assigned to it.
  • At least one camera is arranged on a flying object, in particular on a drone, so that the camera can be moved quickly and easily.
  • Fig. 1 The device according to the invention in a schematic representation
  • Fig. 2 is a flowchart of the invention
  • FIG. 1 shows a schematic sketch of a running route 10, for example a running route 10 for a middle-distance run, on which runners (not shown in FIG. 1) run along the running route 10 while they are filmed by two cameras 11, 12 at the edge of the running route 10. FIG. 1 shows a first camera 11 and a second camera 12, which are arranged at different positions at the edge of the running route 10. During the run, the first camera 11 records a first video signal 15 and the second camera 12 records a second video signal 16, as outlined in the flowchart in FIG. 2.
  • Both cameras 11, 12 are each provided with an integrated camera sensor 13, 14, the first camera sensor 13 being connected to the first camera 11 and the second camera sensor 14 to the second camera 12.
  • the first camera sensor 13 detects local camera parameters of the first camera 11 during the run.
  • local camera parameters are the geographical position of the camera, its orientation and its camera angle.
  • The geographic position of the camera is measured with a GPS sensor, the orientation with an electronic compass and the camera angle with an electronic gyroscope in combination with a software interface to the camera.
  • The GPS sensor, the electronic compass, the electronic gyroscope and the software interface are integrally formed as the first camera sensor 13, which outputs the recorded camera parameters via a further interface.
  • the second camera sensor 14 detects the local camera parameters of the second camera 12 during the run.
  • The first camera 11 has a first camera angle 17, which in particular captures a curve 18 of the running route 10 and is larger than a second camera angle 19 of the second camera 12.
  • The second camera angle 19 of the second camera 12 is aligned with a target area 20 of the running route 10.
  • the camera angle 17, 19 denotes the geographical area that is captured by the camera 11, 12. Since the second camera angle 19 is smaller than the first camera angle 17, an enlarged image is obtained in order to be able to better judge which of the runners is the first to cross the target area 20.
  • The camera angles 17, 19 of the cameras 11, 12 can change over time and are continuously recorded as camera parameters by the camera sensors 13, 14 assigned to the cameras 11, 12, respectively.
  • The runners, not shown in FIG. 1, carry a meta sensor 21 in the form of a GPS sensor in order to record the geographic positions of the runners on the running route 10 at any time during the run. The GPS sensor 21 is shown in the right-hand area of FIG. 1.
  • a second meta sensor 22 in the form of a microphone is arranged on the left-hand side of FIG. 1 in order to measure the volume of the spectators during the run.
  • The cameras 11, 12, the camera sensors 13, 14 and the meta sensors 21, 22 are each connected to a data processing device 23, the connections being represented in FIG. 1 by connecting lines 24, although they could alternatively also be wireless.
  • FIG. 2 shows a schematic flow diagram.
  • In every data acquisition device, such as every camera, a continuous time signal from a global time system such as GPS time, NTP or PTP is recorded, in particular locally, from the start signal; the time signal preferably indicates the time in milliseconds and consequently serves as a uniform time stamp.
  • The two cameras 11, 12 of FIG. 1 each record continuous video signals 15, 16 during the run.
  • Each frame of the video signals 15, 16 is automatically assigned the respective time signal. This process step is marked on the upper left side of FIG. 2 as B.
  • The positions, orientations and camera angles of the two cameras 11, 12 are recorded as camera parameters by the camera sensors 13, 14, and these are likewise automatically assigned the corresponding global time signal (section C).
  • the GPS sensor 21 continuously detects the current position of the runner assigned to it and the microphone 22 detects the current volume of the audience. Both meta data are automatically assigned the current time signal when they are captured by the meta sensors 21, 22 (section D).
  • In the next method step E, the video signals, camera parameters and meta data associated with the time signal are transmitted to the data processing device 23 during the event.
  • The runner's trainer is interested in the runner's performance during the run and therefore, in a next method step F, makes a user input to the data processing device 23 by requesting those video signals in which this particular runner can be seen.
  • This user input is registered in the data processing device 23, whereupon the recorded video signals 15, 16 are analyzed to determine whether the runner can be seen in them.
  • For the first camera 11, this is the case, for example, when the geographic position of the runner, which is continuously detected by the GPS sensor 21, is covered by the first camera angle 17. In this case, the data processing device 23 provides only the part of the first video signal 15 in which the runner can be seen. The automatic evaluation of the second video signals 16 from the second camera 12 proceeds analogously. The evaluation of the video signals 15, 16 takes place during the event and simultaneously for all video signals 15, 16. In a last method step H of FIG. 2, the data processing device 23 provides the user with the desired parts of the video signals 15, 16 in which the runner can be seen.
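The geometric test used here — whether the runner's position detected by the GPS sensor 21 is covered by the first camera angle 17 — can be sketched as follows (a flat-plane Python approximation with metre coordinates; a real system would work on geographic coordinates, and all names are assumptions):

```python
import math

def in_camera_view(cam_pos, cam_heading_deg, cam_angle_deg, target_pos):
    """Check whether a target position lies inside the camera's
    horizontal viewing sector. Flat-plane approximation with (x, y)
    coordinates in metres."""
    dx = target_pos[0] - cam_pos[0]
    dy = target_pos[1] - cam_pos[1]
    bearing = math.degrees(math.atan2(dy, dx)) % 360
    # Smallest signed angle between target bearing and optical axis.
    diff = (bearing - cam_heading_deg + 180) % 360 - 180
    return abs(diff) <= cam_angle_deg / 2
```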
  • the user input is a request from a broadcaster for the highlights of the race.
  • This user input is interpreted by the data processing device 23 in such a way that time signals are searched in which the microphone 22 at the edge of the running route 10 has detected significantly high volumes as meta data. This indicates a particularly significant event.
  • After the data processing device 23 has determined the time signals at which high volumes were measured, the first video signals 15 of the first camera 11 assigned to these time signals are determined, since the first camera 11 is arranged closest to the microphone 22.
  • The rest of the evaluation and the provision of the desired part of the video signals 15, 16 are carried out analogously to the previous example. In this way, the user is provided with the highlights of the event.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Databases & Information Systems (AREA)
  • Remote Sensing (AREA)
  • Library & Information Science (AREA)
  • Data Mining & Analysis (AREA)
  • General Engineering & Computer Science (AREA)
  • Television Signal Processing For Recording (AREA)
  • Studio Devices (AREA)

Abstract

The invention relates to a method and a device for evaluating and providing video signals of an event. A uniform time signal is assigned to the recorded video signals of the event and to the acquired camera parameters and meta data, after which an automatic evaluation of the video signals is carried out in a data processing device as a function of a user input. In this way, a desired part of the video signals can be provided to the user depending on the evaluation.
EP19821249.0A 2018-12-05 2019-12-04 Procédé et dispositif d'évaluation et de fourniture automatiques de signaux vidéo d'un événement Withdrawn EP3891998A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102018009571.2A DE102018009571A1 (de) 2018-12-05 2018-12-05 Verfahren und Vorrichtung zur automatischen Auswertung und Bereitstellung von Video-Signalen eines Ereignisses
PCT/EP2019/000332 WO2020114623A1 (fr) 2018-12-05 2019-12-04 Procédé et dispositif d'évaluation et de fourniture automatiques de signaux vidéo d'un événement

Publications (1)

Publication Number Publication Date
EP3891998A1 true EP3891998A1 (fr) 2021-10-13

Family

ID=68916468

Family Applications (1)

Application Number Title Priority Date Filing Date
EP19821249.0A Withdrawn EP3891998A1 (fr) 2018-12-05 2019-12-04 Procédé et dispositif d'évaluation et de fourniture automatiques de signaux vidéo d'un événement

Country Status (4)

Country Link
US (1) US11689691B2 (fr)
EP (1) EP3891998A1 (fr)
DE (1) DE102018009571A1 (fr)
WO (1) WO2020114623A1 (fr)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102019216419B4 (de) 2019-10-24 2024-06-20 Carl Zeiss Industrielle Messtechnik Gmbh Sensoranordnung zur Erfassung von Werkstücken und Verfahren zum Betreiben einer derartigen Sensoranordnung
US20220017095A1 (en) * 2020-07-14 2022-01-20 Ford Global Technologies, Llc Vehicle-based data acquisition

Family Cites Families (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3762149B2 (ja) * 1998-07-31 2006-04-05 キヤノン株式会社 カメラ制御システム、カメラサーバ、カメラサーバの制御方法、カメラ制御方法、及びコンピュータ読み取り可能な記録媒体
US6748158B1 (en) * 1999-02-01 2004-06-08 Grass Valley (U.S.) Inc. Method for classifying and searching video databases based on 3-D camera motion
GB0029893D0 (en) * 2000-12-07 2001-01-24 Sony Uk Ltd Video information retrieval
US7133070B2 (en) * 2001-09-20 2006-11-07 Eastman Kodak Company System and method for deciding when to correct image-specific defects based on camera, scene, display and demographic data
JP2004194159A (ja) * 2002-12-13 2004-07-08 Canon Inc 映像通信システム
WO2008046243A1 (fr) 2006-10-16 2008-04-24 Thomson Licensing Procédé et dispositif de codage d'un flux de données, procédé et dispositif de décodage d'un flux de données, système d'indexage video et système d'extraction d'images
US20100007730A1 (en) * 2008-07-09 2010-01-14 Lin Meng-Te Surveillance Display Apparatus, Surveillance System, and Control Method Thereof
KR20110132884A (ko) 2010-06-03 2011-12-09 한국전자통신연구원 다중 동영상 색인 및 검색이 가능한 지능형 영상 정보 검색 장치 및 방법
WO2015162548A1 (fr) 2014-04-22 2015-10-29 Batchu Krishnaiahsetty Sumana Système électronique et procédé de marquage d'extraits mis en évidence dans un fichier multimédia et de manipulation du fichier à l'aide des extraits mis en évidence
US10074013B2 (en) * 2014-07-23 2018-09-11 Gopro, Inc. Scene and activity identification in video summary generation
WO2016029170A1 (fr) * 2014-08-22 2016-02-25 Cape Productions Inc. Procédés et appareil pour le montage automatique d'une vidéo enregistrée par un véhicule aérien sans pilote
US9313556B1 (en) * 2015-09-14 2016-04-12 Logitech Europe S.A. User interface for video summaries
CN108287924A (zh) 2018-02-28 2018-07-17 福建师范大学 一种可定位视频数据采集与组织检索方法

Also Published As

Publication number Publication date
WO2020114623A1 (fr) 2020-06-11
US11689691B2 (en) 2023-06-27
DE102018009571A1 (de) 2020-06-10
US20220103779A1 (en) 2022-03-31

Similar Documents

Publication Publication Date Title
DE60216693T2 (de) Gerät zum Verteilen von Video und Gerät zum Empfangen von Video
EP1864153B1 (fr) Systeme de poursuite d'objet et d'analyse de situation
DE60213913T2 (de) System und Verfahren zur Inhaltsdarstellung
EP3891998A1 (fr) Procédé et dispositif d'évaluation et de fourniture automatiques de signaux vidéo d'un événement
EP2044573A1 (fr) Caméra de surveillance, procédé d'étalonnage et d'utilisation de la caméra de surveillance
DE102006006667A1 (de) Verfahren und Vorrichtung zur automatischen Ermittlung der Ergebnise sportlicher Wettbewerbe
DE102009020997A1 (de) Verfahren zum Aufzeichnen und Verarbeiten von Fahrtdaten eines Kraftfahrzeugs
DE10029463A1 (de) Auswerteeinheit und Verfahren zur Auswertung von statischen Zuständen und/oder Bewegungsabläufen
DE102014224120A1 (de) Ausgeben von Audiobeiträgen für ein Fahrzeug
DE60123786T2 (de) Verfahren und System zur automatischen Produktion von Videosequenzen
EP0973445B1 (fr) Diagnostic de la boiterie
DE102019203614A1 (de) Einrichtung und Verfahren zur Anzeige von Ereignisinformation, die aus Videodaten detektiert wird
DE102008026657A1 (de) Verfahren und Vorrichtung zur bildgebenden Darstellung von akustischen Objekten
DE102020213288A1 (de) Anzeigevorrichtung für ein Videoüberwachungssystem, Videoüberwachungssystem sowie Verfahren
DE102013103557A1 (de) Medienszenenwiedergabesystem und -verfahren sowie deren Aufzeichnungsmedien
DE102017123068A1 (de) System zum Synchronisieren von Ton- oder Videoaufnahmen
DE112019004282T5 (de) Informationsverarbeitungsvorrichtung, Informationsverarbeitungsverfahren und Programm
CH708459B1 (de) Verfahren zur Aufzeichnung und Wiedergabe von Bewegungsvorgängen eines Sportlers.
EP3843419B1 (fr) Procédé de commande d'un réseau de microphones et dispositif de commande d'un réseau de microphone
EP1434184B1 (fr) Commande d'un système multicaméra
WO2002030053A1 (fr) Procede et systeme pour transmettre des informations entre un serveur et un client mobile
DE102021110268A1 (de) Verfahren und System zur szenensynchronen Auswahl und Wiedergabe von Audiosequenzen für ein Kraftfahrzeug
EP3389805A1 (fr) Procédé et système de détermination en temps réel d'un équipement sportif
EP0583441B1 (fr) Dispositif pour mesurer des temps, notamment dans le domaine sportif
DE102007054088A1 (de) Verfahren und Vorrichtung zur Bildverarbeitung insbesondere Bildmessung

Legal Events

Date Code Title Description
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: UNKNOWN

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE

PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20210614

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

DAV Request for validation of the european patent (deleted)
DAX Request for extension of the european patent (deleted)
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

17Q First examination report despatched

Effective date: 20221114

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20230525