EP3891998A1 - Method and device for automatically evaluating and providing video signals of an event - Google Patents

Method and device for automatically evaluating and providing video signals of an event

Info

Publication number
EP3891998A1
Authority
EP
European Patent Office
Prior art keywords
camera
video signals
data
time
meta
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP19821249.0A
Other languages
German (de)
French (fr)
Inventor
Philipp Lawo
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Lawo Holding AG
Original Assignee
Lawo Holding AG
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Lawo Holding AG filed Critical Lawo Holding AG
Publication of EP3891998A1

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/80Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/83Generation or processing of protective or descriptive data associated with content; Content structuring
    • H04N21/84Generation or processing of descriptive data, e.g. content descriptors
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/76Television signal recording
    • H04N5/765Interface circuits between an apparatus for recording and another apparatus
    • H04N5/77Interface circuits between an apparatus for recording and another apparatus between a recording apparatus and a television camera
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/70Information retrieval; Database structures therefor; File system structures therefor of video data
    • G06F16/78Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/10Terrestrial scenes
    • G06V20/17Terrestrial scenes taken from planes or by drones
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/40Scenes; Scene-specific elements in video content
    • G06V20/44Event detection
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/21Server components or server architectures
    • H04N21/218Source of audio or video content, e.g. local disk arrays
    • H04N21/21805Source of audio or video content, e.g. local disk arrays enabling multiple viewpoints, e.g. using a plurality of cameras
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/18Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N7/183Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a single remote source
    • H04N7/185Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a single remote source from a mobile camera, e.g. for remote control
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V2201/00Indexing scheme relating to image or video recognition or understanding
    • G06V2201/10Recognition assisted with metadata

Definitions

  • The invention relates to a method and a device for the automatic evaluation and provision of video signals of an event.
  • Events in the sense of the invention are, for example, sports events such as ball games, running competitions and races, in which cameras are used to record occurrences at specific, possibly changing locations, such as ball positions, shots on goal, overtaking maneuvers or finish-line crossings.
  • Several cameras are usually used to record what is happening at different locations, from different positions and from different angles. All video signals recorded by the cameras are usually sent to a control room or a broadcast vehicle, where a responsible person selects the video signal to be transmitted to a television station from the multitude of individual camera signals.
  • The object of the invention is achieved by a generic method with the following steps: the video signals, camera parameters and metadata assigned to the time signal and/or the provided part of the video signals are stored on a data carrier.
  • The camera parameters are at least one parameter of the group: position of the camera, acceleration of the camera, orientation of the camera, camera angle, magnetic field, field of view, air pressure, volume, brightness, time, current power consumption. Metadata are acquired by means of at least one meta sensor, and the time signal(s) are automatically assigned to the metadata, the metadata being at least one parameter of the group: geographic data, object positions, broadcast data, object-specific data, statistics, databases, local volume, user-defined parameters.
  • Transmission of the video signals, camera parameters and metadata assigned to the time signal(s) to a data processing device; automatic evaluation of the video signals assigned to the time signal(s) as a function of the camera parameters assigned to the time signal, of the metadata assigned to the time signal and of a user input; and provision of at least part of the video signals as a function of the evaluation.
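  • The method steps above suggest a small data model in which every captured record, be it a video frame, a set of camera parameters or a metadata reading, carries the same global time signal so that records from all sources can be grouped. The following Python sketch is illustrative only; all class and field names are assumptions, not taken from the patent:

```python
from dataclasses import dataclass

@dataclass
class TimedRecord:
    t_ms: int  # global time signal in milliseconds (unique time stamp)

@dataclass
class VideoFrame(TimedRecord):
    camera_id: str = ""
    frame_index: int = 0

@dataclass
class CameraParams(TimedRecord):
    camera_id: str = ""
    position: tuple = (0.0, 0.0)   # e.g. GPS latitude/longitude
    orientation_deg: float = 0.0   # compass heading
    angle_deg: float = 0.0         # camera (view) angle

@dataclass
class MetaReading(TimedRecord):
    sensor_id: str = ""
    kind: str = ""                 # e.g. "object_position", "volume"
    value: object = None

def group_by_time(records, bucket_ms=40):
    """Bucket records into bins of one frame duration (25 fps -> 40 ms),
    so that frames, camera parameters and metadata sharing a time
    signal can be evaluated together."""
    buckets = {}
    for r in records:
        buckets.setdefault(r.t_ms // bucket_ms, []).append(r)
    return buckets
```

Because every record type shares the same `t_ms` field, grouping by time bucket is all that is needed to line up a frame with the sensor readings captured at the same moment.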
  • The stated object is further achieved by a generic device, with data acquisition devices such as a camera for recording video signals, at least one camera sensor for detecting local camera parameters, at least one meta sensor for detecting metadata, and a data processing device for receiving the video signals, camera parameters and metadata assigned to a time signal, for evaluating the video signals and for providing at least part of the video signals, the camera, the camera sensor and the meta sensor being connected to the data processing device and the camera sensor being connected to the camera.
  • The invention is based on the basic consideration that, as an alternative to the known image-processing algorithms, the video signals are evaluated as a function of the camera parameters and of the metadata.
  • By means of the common time signals, the video signals, the camera parameters and the metadata can be unambiguously assigned to one another and synchronized. This in particular enables automatic evaluation of the video signals assigned to the time signal on the basis of the camera parameters and the metadata, so that the part of the video signals of interest can be made available to a user.
  • The time signal in the sense of the invention is, for example, a time-of-day indication, in particular the time at the location of the camera, measured in milliseconds.
  • Alternatively or additionally, the time signal can be designed as a stopwatch and have an incremental counter.
  • The time signal is preferably designed such that each frame of the video signal can be assigned a unique time value. In this respect, the time signal has the function of an unambiguous time stamp.
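  • The requirement that every frame receive a unique time value can be met by deriving each frame's stamp from a shared start time and the frame rate. A minimal sketch, assuming a constant frame rate (the function name and defaults are illustrative, not from the patent):

```python
def frame_timestamps_ms(start_ms, frame_count, fps=50):
    """Assign each video frame a unique global time stamp in milliseconds.

    start_ms: global time of the first frame (e.g. GPS/PTP time).
    Frame k is stamped start_ms + k * (1000 / fps); at 50 fps the
    spacing is 20 ms, so every frame receives a distinct value and the
    stamp can serve as an unambiguous per-frame time signal.
    """
    step = 1000.0 / fps
    return [round(start_ms + k * step) for k in range(frame_count)]
```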
  • Camera parameters in the sense of the invention are parameters which characterize properties of the assigned camera, for example the currently set camera angle, the inclination and the position of the camera, the latter being measurable by means of a GPS sensor.
  • At least one camera sensor is provided, which can be equipped, for example, with a gyroscope to record the camera angle and/or with an (electronic) compass to record the camera orientation.
  • The camera parameters are preferably assigned to the video signals simultaneously with the recording of the video signals.
  • Metadata in the sense of the invention are, in particular, parameters of the event.
  • In a football game, metadata are, for example, the current position of the ball and/or the current positions of the players, which can be recorded using common tracking methods.
  • Current scores are also metadata.
  • In a race, metadata are, for example, the current positions of the drivers or vehicles.
  • The metadata are determined, for example, using common tracking methods, interfaces and/or GPS sensors.
  • Metadata can also be broadcast data that indicate whether a certain part of the video signal was broadcast, for example as part of a television transmission.
  • Metadata in the sense of the invention are also user-defined parameters such as player and/or vehicle names. Individual statistics and other information about players and/or vehicles can be imported from databases, for example the goal rate of a football player.
  • Volume information measured by a microphone, in particular one spatially separated from the camera, is also metadata in the sense of the invention.
  • For example, the volume of a fan section in a football stadium can be recorded as metadata in the sense of the invention.
  • The invention includes that metadata can also be captured via devices or paths other than a camera, and that metadata, video and audio signals are processed by a central data processing device. It is therefore preferably a central data processing device, independent of local recording devices such as cameras, that processes the metadata.
  • The data from all data sources are thus tied to the common global time and accordingly to one another (locked). This ensures the temporal assignment or linkage of all data, namely metadata as well as video and audio signals.
  • The video signals, the camera parameters and the metadata can be transmitted to the data processing device by cable and/or wirelessly, for example via WLAN, Bluetooth and/or radio.
  • A user input for the evaluation of the video signals according to the invention is, for example, a query for a combination of several criteria, for example the query at what time a particular vehicle was captured by a particular camera.
  • For example, the video signals are recorded using a single camera, whose position and camera angle are changed during the event and recorded as camera parameters.
  • For example, the volume of a spectator area is recorded as metadata by means of several external microphones, a high volume indicating an important occurrence, for example a shot on goal. If the special moments of an event, the highlights, are requested by user input, those parts of the video signal in which the volume in the metadata is significantly increased are provided in response.
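  • The highlight query described above can be sketched as a filter over time-stamped volume readings: find the times at which the loudness exceeds a threshold and cut out the surrounding video intervals. The threshold and padding values below are illustrative assumptions:

```python
def highlight_intervals(volume_by_t, threshold_db=90.0, pad_ms=5000):
    """Return merged (start_ms, end_ms) intervals around loud moments.

    volume_by_t: list of (t_ms, volume_db) metadata readings.
    Each reading at or above the threshold marks a highlight; intervals
    are padded and overlapping ones merged, so a sustained cheer yields
    one clip rather than many.
    """
    loud = sorted(t for t, v in volume_by_t if v >= threshold_db)
    intervals = []
    for t in loud:
        start, end = t - pad_ms, t + pad_ms
        if intervals and start <= intervals[-1][1]:
            intervals[-1] = (intervals[-1][0], end)   # merge overlap
        else:
            intervals.append((start, end))
    return intervals
```

The returned intervals can then be used to look up the corresponding frames of the video signals via their shared time signal.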
  • Preferably, the acquisition of the time signal, the video signals, the camera parameters and the metadata is synchronized in time, so that the assignment of the video signals, the camera parameters and the metadata to the time signal is simplified.
  • For example, camera parameters and metadata can be recorded whenever a single frame of the video signals is recorded.
  • Likewise, the video signals of several cameras can be recorded time-synchronously.
  • The time signal can be assigned to the video signals simultaneously with the recording of the video signals. This can apply analogously to the assignment of the time signal to the camera parameters and/or to the metadata.
  • Preferably, the time signal is acquired and the video signals and camera parameters are recorded over the entire duration of the event, in order to be able to access the complete data record generated during the event in the automatic evaluation.
  • The metadata are preferably also recorded over the entire duration of the event.
  • The various data sources preferably work with a global time source and thus a global time such as GPS time, NTP (Network Time Protocol) or PTP (Precision Time Protocol), so that the metadata can be temporally linked with image, video or audio signals centrally, without processing in a capture device.
  • All data are provided with the common global time.
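  • Once all sources stamp their data with one global time, locking them together reduces to looking up, for any frame time, the sensor reading closest in time. A sketch under that assumption (function and variable names are illustrative):

```python
import bisect

def nearest_reading(times, values, t_query):
    """Return the value whose global time stamp is closest to t_query.

    Because all data sources share one global time (e.g. GPS, NTP or
    PTP time), a video frame's time stamp can be matched directly
    against sensor time stamps.

    times must be sorted ascending; values is parallel to times.
    """
    i = bisect.bisect_left(times, t_query)
    candidates = [j for j in (i - 1, i) if 0 <= j < len(times)]
    best = min(candidates, key=lambda j: abs(times[j] - t_query))
    return values[best]
```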
  • The metadata can be acquired only if a parameter of the metadata falls below and/or exceeds a user-defined limit. This avoids accumulating too much unused data.
  • For example, a local volume can be recorded only if the sound level is above a user-defined limit.
  • A high volume may indicate a significant occurrence, such as a foul or an overtaking maneuver.
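  • Capturing a metadata parameter only when it crosses a user-defined limit, as described above, can be sketched as a simple gate in front of the recorder. The band limits and data shapes are assumptions for illustration:

```python
def gated_capture(samples, lower=None, upper=None):
    """Keep only (t_ms, value) samples outside the user-defined band.

    A sample is recorded when it exceeds `upper` and/or falls below
    `lower`; everything in between is discarded, avoiding the
    accumulation of unused data.
    """
    kept = []
    for t, v in samples:
        if (upper is not None and v > upper) or (lower is not None and v < lower):
            kept.append((t, v))
    return kept
```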
  • Preferably, the steps of evaluating and providing the part of the video signals take place during the event, so that the part of the video signals desired by the user can be made available, in particular continuously, while the event is still running.
  • If there is a user request for the highlights of the event, it is determined at which time signals an increased volume is detected in a fan area, which indicates a significant highlight. By determining the time signals of all highlights, the corresponding parts of the video signals can be made available to the user before the end of the event.
  • Further metadata are preferably generated when the video signals are evaluated.
  • For example, object- or person-related statistics can be created or supplemented. This step can be carried out automatically, so that the newly created statistics are again available as metadata when evaluating the video signals.
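  • The feedback loop described above, where evaluation results become new metadata, can be sketched as a per-person appearance counter. The identifiers and the shape of the detection records are illustrative assumptions:

```python
from collections import Counter

def update_appearance_stats(existing_stats, detections):
    """Supplement per-person statistics from one evaluation pass.

    detections: list of (t_ms, person_id) results of the evaluation,
    e.g. times at which a runner was determined to be in view.
    The returned Counter can be stored back as metadata and used as
    an input to the next evaluation.
    """
    stats = Counter(existing_stats)
    stats.update(person for _, person in detections)
    return stats
```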
  • The video signals, the camera parameters, the metadata and/or the provided part of the video signals can be stored on a data carrier, preferably in the form of a database, so that archiving and/or later evaluation is possible.
  • The meta sensor can be provided spatially separated from the camera sensor.
  • Each camera is preferably assigned a camera sensor, which is in particular integrated into the camera assigned to it.
  • In one embodiment, at least one camera is arranged on a flying object, in particular on a drone, so that the camera can be moved quickly and easily.
  • Fig. 1 shows the device according to the invention in a schematic representation.
  • Fig. 2 shows a flowchart of the method according to the invention.
  • Fig. 1 shows a schematic sketch of a running track 10, for example a track for a middle-distance run, on which runners (not shown in Fig. 1) run along the track 10 while being filmed by two cameras 11, 12 at the edge of the track 10. Fig. 1 shows a first camera 11 and a second camera 12, which are arranged at different positions at the edge of the running track 10. During the run, the first camera 11 records a first video signal 15 and the second camera 12 a second video signal 16, as outlined in the flowchart of Fig. 2.
  • Both cameras 11, 12 are each provided with an integrated camera sensor 13, 14, the first camera sensor 13 being connected to the first camera 11 and the second camera sensor 14 to the second camera 12.
  • During the run, the first camera sensor 13 detects local camera parameters of the first camera 11.
  • Local camera parameters are the geographic position of the camera, its orientation and its camera angle.
  • The geographic position of the camera is measured with a GPS sensor, the orientation with an electronic compass and the camera angle with an electronic gyroscope in combination with a software interface to the camera.
  • The GPS sensor, the electronic compass, the electronic gyroscope and the software interface are integrally formed as the first camera sensor 13, which outputs the acquired camera parameters via a further interface.
  • Analogously, the second camera sensor 14 detects the local camera parameters of the second camera 12 during the run.
  • The first camera 11 has a first camera angle 17, which in particular captures a curve 18 of the running track 10 and is larger than a second camera angle 19 of the second camera 12.
  • The second camera angle 19 of the second camera 12 is aligned with a finish area 20 of the running track 10.
  • The camera angle 17, 19 denotes the geographic area that is captured by the camera 11, 12. Since the second camera angle 19 is smaller than the first camera angle 17, an enlarged image is obtained, making it easier to judge which of the runners crosses the finish area 20 first.
  • The camera angles 17, 19 of the cameras 11, 12 can change over time and are continuously recorded as camera parameters by the camera sensors 13, 14 assigned to the cameras 11, 12.
  • One of the runners (not shown in Fig. 1) carries a meta sensor 21 in the form of a GPS sensor in order to record the geographic position of the runner on the running track 10 at any time during the run.
  • The GPS sensor 21 is shown in the right-hand area of Fig. 1.
  • A second meta sensor 22 in the form of a microphone is arranged on the left-hand side of Fig. 1 in order to measure the volume of the spectators during the run.
  • The cameras 11, 12, the camera sensors 13, 14 and the meta sensors 21, 22 are each connected to a data processing device 23; the connections are represented in Fig. 1 by connecting lines 24, but could alternatively also be wireless.
  • Fig. 2 shows a schematic flow diagram of the method.
  • From the start signal onward, a continuous time signal from a global time system such as GPS time, NTP or PTP is acquired in every data acquisition device, such as every camera, in particular locally; it preferably indicates the time in milliseconds and consequently serves as a uniform time stamp.
  • During the run, the two cameras 11, 12 of Fig. 1 each record continuous video signals 15, 16.
  • Each frame of the video signals 15, 16 is automatically assigned the respective time signal. This method step is marked B on the upper left side of Fig. 2.
  • In parallel, the positions, orientations and camera angles of the two cameras 11, 12 are acquired as camera parameters by the camera sensors 13, 14, and these are likewise automatically assigned the corresponding global time signal (section C).
  • Likewise, the GPS sensor 21 continuously detects the current position of the runner assigned to it, and the microphone 22 detects the current volume of the audience. Both metadata are automatically assigned the current time signal when they are captured by the meta sensors 21, 22 (section D).
  • In the next method step E, the video signals, camera parameters and metadata associated with the time signal are transmitted to the data processing device 23 during the event.
  • The runner's trainer is interested in the runner's performance during the run and therefore, in a next method step F, makes a user input to the data processing device 23 by requesting video signals in which this particular runner can be seen.
  • This user input is registered in the data processing device 23, so that the recorded video signals 15, 16 are analyzed as to whether the runner can be seen in them.
  • For the first camera 11, this is the case, for example, when the geographic position of the runner, which is continuously detected by the GPS sensor 21, is covered by the first camera angle 17. In this case, the data processing device 23 provides only the part of the first video signal 15 in which the runner can be seen. The automatic evaluation of the second video signal 16 from the second camera 12 proceeds analogously. The evaluation of the video signals 15, 16 takes place during the event and simultaneously for all video signals 15, 16. In a last method step H of Fig. 2, the data processing device 23 provides the user with the desired parts of the video signals 15, 16 in which the runner can be seen.
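  • The coverage test described above, checking whether the runner's position lies within the first camera angle 17, can be sketched as a planar field-of-view check. The projection of GPS coordinates into a local metric frame is omitted here, and all parameter names are illustrative assumptions:

```python
import math

def in_camera_view(cam_pos, cam_heading_deg, cam_angle_deg, obj_pos):
    """Check whether an object's position lies inside a camera's
    horizontal field of view.

    cam_pos, obj_pos: (x, y) positions in a local metric frame
    (GPS coordinates would first be projected, omitted here).
    cam_heading_deg: compass-style heading of the optical axis.
    cam_angle_deg: full opening angle of the camera.
    """
    dx, dy = obj_pos[0] - cam_pos[0], obj_pos[1] - cam_pos[1]
    bearing = math.degrees(math.atan2(dx, dy)) % 360  # 0 deg = +y axis
    diff = (bearing - cam_heading_deg + 180) % 360 - 180  # signed angle
    return abs(diff) <= cam_angle_deg / 2
```

Running this test for every time signal against the continuously recorded camera orientation and angle yields exactly the intervals in which the runner can be seen by that camera, without any image analysis.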
  • In a further example, the user input is a request from a broadcaster for the highlights of the race.
  • This user input is interpreted by the data processing device 23 in such a way that time signals are searched for at which the microphone 22 at the edge of the running track 10 detected significantly high volumes as metadata. These indicate a particularly significant occurrence.
  • After the data processing device 23 has determined the time signals at which high volumes were measured, the first video signals 15 of the first camera 11 assigned to these time signals are determined, since the first camera 11 is arranged closest to the microphone 22.
  • The rest of the evaluation and the provision of the desired part of the video signals 15, 16 are carried out analogously to the previous example. In this way, the user is provided with the highlights of the event.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Databases & Information Systems (AREA)
  • Remote Sensing (AREA)
  • Library & Information Science (AREA)
  • Data Mining & Analysis (AREA)
  • General Engineering & Computer Science (AREA)
  • Television Signal Processing For Recording (AREA)
  • Studio Devices (AREA)

Abstract

The invention relates to a method and a device for evaluating and providing video signals of an event. A uniform time signal is assigned to recorded video signals of the event, to acquired camera parameters and to metadata, whereupon an automatic evaluation of the video signals is carried out as a function of a user input in a data processing device. In this way, a desired part of the video signals can be provided to the user as a function of the evaluation.

Description

Method and device for the automatic evaluation and provision of video signals of an event

The invention relates to a method and a device for the automatic evaluation and provision of video signals of an event.
Events in the sense of the invention are, for example, sports events such as ball games, running competitions and races, in which cameras are used to record occurrences at specific, possibly changing locations, such as ball positions, shots on goal, overtaking maneuvers or finish-line crossings. Several cameras are usually used to record what is happening at different locations, from different positions and from different angles. All video signals recorded by the cameras are usually sent to a control room or a broadcast vehicle, in which a responsible person selects the video signal to be transmitted to a television station from the multitude of individual video signals from the cameras. The video signals that are not broadcast are usually lost.
Interest in the video signals that were not broadcast has been growing for some time. For example, clubs can use these video signals, which show their players or their opponents, to assess their individual performance. In running and racing events, trainers or racing teams are likewise interested in all recordings of their runners, cyclists, racing cars, etc. Therefore, every recorded video signal, especially at sports events, can be relevant for a corresponding group of users. In order to filter out the relevant part of a video signal for various user groups, it is possible to check and cut all video signals manually, which, however, requires immense effort. In addition to purely manual selection of the video signals, there are also efforts to develop technical methods for their evaluation. These methods analyze the video signals using image-processing algorithms to detect objects of the event, for example vehicles and/or players. However, these methods are complex, slow and expensive, so that they are currently only of limited economic viability.
It is therefore the object of the invention to provide, while avoiding the disadvantages of the prior art, an improved possibility for providing individual video material of an event.
The object of the invention is achieved by a generic method with the following steps: the video signals, camera parameters and metadata assigned to the time signal and/or the provided part of the video signals are stored on a data carrier.
Detecting at least one time signal;

Recording video signals by means of at least one camera and automatically assigning the time signal(s) to the video signals;

Detecting local camera parameters at the location of the camera and automatically assigning the time signal(s) to the camera parameters, the camera parameters being at least one parameter of the group: position of the camera, acceleration of the camera, orientation of the camera, camera angle, magnetic field, field of view, air pressure, volume, brightness, time, current power consumption; detecting metadata by means of at least one meta sensor and automatically assigning the time signal(s) to the metadata, the metadata being at least one parameter of the group: geographic data, object positions, broadcast data, object-specific data, statistics, databases, local volume, user-defined parameters;

Transmitting the video signals, camera parameters and metadata assigned to the time signal(s) to a data processing device; automatically evaluating the video signals assigned to the time signal(s) as a function of the camera parameters assigned to the time signal, of the metadata assigned to the time signal and of a user input; and providing at least part of the video signals as a function of the evaluation.
The stated object is further achieved by a device according to the gat, with data acquisition devices such as a camera for recording video signals, at least one camera sensor for recording local camera parameters, at least one meta sensor for recording meta data and a data processing device for receiving the video signals, camera parameters and meta data assigned to a time signal, for evaluating the video signals and for providing at least some of the video signals, the camera, the camera sensor and the meta sensor is connected to the data processing device and the camera sensor is connected to the camera.
The invention is based on the fundamental idea that, as an alternative to the known image-processing algorithms, the video signals are evaluated as a function of the camera parameters and of the meta data. By means of the common time signals, the video signals, the camera parameters and the meta data can be unambiguously assigned to one another and synchronized. This in particular makes possible the automatic evaluation of the video signals assigned to the time signal on the basis of the camera parameters and the meta data, so that the part of the video signals of interest can be made available to a user.
A time signal within the meaning of the invention is, for example, an indication of the time of day, in particular the time at the location of the camera, measured in milliseconds. Alternatively or additionally, the time signal can be designed as a stopwatch with an incremental counter. The time signal is preferably designed such that each frame of the video signal can be assigned a unique time value. In this respect, the time signal has the function of a unique time stamp.
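The per-frame time stamping described above can be illustrated with a short sketch. This is not the patented implementation, only a minimal model under the assumption of a fixed frame rate; the names `TimedFrame` and `stamp_frames` are illustrative.

```python
from dataclasses import dataclass

@dataclass
class TimedFrame:
    """A single video frame paired with its unique time value."""
    frame_index: int
    timestamp_ms: int  # global time in milliseconds, as in the description

def stamp_frames(start_ms: int, frame_count: int, fps: int = 50) -> list[TimedFrame]:
    """Assign each frame a unique millisecond time value (a unique time stamp)."""
    frame_duration_ms = 1000 // fps  # 20 ms per frame at 50 fps
    return [
        TimedFrame(i, start_ms + i * frame_duration_ms)
        for i in range(frame_count)
    ]

frames = stamp_frames(start_ms=0, frame_count=3, fps=50)
print([f.timestamp_ms for f in frames])  # → [0, 20, 40]
```

Because every frame carries its own time value, camera parameters and meta data captured at the same instant can later be looked up by that value alone.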
Camera parameters within the meaning of the invention are parameters that characterize properties of the associated camera, for example the currently set camera angle, the inclination and the position of the camera, the latter being measurable by means of a GPS sensor. At least one camera sensor is provided for detecting the camera parameters; it can be equipped, for example, with a gyroscope for detecting the camera angle and/or an (electronic) compass for detecting the camera orientation. The camera parameters are preferably assigned to the video signals at the same time as the video signals are recorded.
Meta data within the meaning of the invention are in particular parameters of the event. In a football match, these are, for example, the current position of the ball and/or the current positions of the players, which can be recorded using common tracking methods. Current scores are also meta data. In a race, meta data are, for example, the current positions of the drivers or vehicles and/or their current placings. The meta data are determined, for example, via common tracking methods, interfaces and/or GPS sensors. Meta data can, however, also be broadcast data that indicate whether a certain part of the video signal was transmitted, for example as part of a television broadcast. Meta data within the meaning of the invention also include user-defined parameters such as player and/or vehicle names, individual statistics and other information on players and/or vehicles that can be imported from databases, for example the shots-on-goal rate of a football player. Volume information measured by a microphone, in particular one set up spatially separated from the camera, preferably also constitutes meta data within the meaning of the invention. In this way, for example, the volume of a fan section in a football stadium can be recorded as meta data.
The invention includes the possibility that meta data can also be captured via devices or channels other than a camera, and that meta data, video signals and audio signals are processed by a central data processing device. A central data processing device, independent of local acquisition devices such as cameras, is therefore preferably provided for processing the meta data. The data of all data sources are thus tied to the common global time and are accordingly linked to one another, i.e. locked relative to one another. This ensures the temporal assignment or linkage of all data, namely the meta data and the video and audio signals.
The video signals, the camera parameters and the meta data can be transmitted to the data processing device by cable and/or wirelessly, in the latter case for example via WLAN, Bluetooth and/or radio.
A user input for the evaluation of the video signals according to the invention is, for example, a query combining several criteria, for example the query at what time a particular vehicle was captured by a particular camera.
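Such a multi-criteria query can be viewed as a filter over the time-stamped records. The following sketch is an illustration only; the record layout (`time_ms`, `camera`, `visible_objects`) is a hypothetical structure, not one defined in the application.

```python
def query_times(records, camera_id, object_id):
    """Return all time stamps at which `object_id` was captured by
    `camera_id`.  Each record is a dict with the hypothetical keys
    'time_ms', 'camera' and 'visible_objects'."""
    return [
        r["time_ms"]
        for r in records
        if r["camera"] == camera_id and object_id in r["visible_objects"]
    ]

records = [
    {"time_ms": 0,  "camera": "cam1", "visible_objects": {"car7"}},
    {"time_ms": 20, "camera": "cam1", "visible_objects": set()},
    {"time_ms": 40, "camera": "cam1", "visible_objects": {"car7"}},
]
print(query_times(records, "cam1", "car7"))  # → [0, 40]
```

The result of such a query is a list of time signals, from which the corresponding parts of the video signals can then be provided.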
In one embodiment of the invention, the video signals are recorded by a single camera whose position and camera angle are changed during the event and recorded as camera data. As meta data, for example, volume information from a spectator area is recorded by means of several external microphones, a high volume possibly indicating an important incident, for example a shot on goal. If the special moments of an event (the highlights) are requested by a user input, those parts of the video signals in which the volume in the meta data is significantly increased are provided in response.
Preferably, the detection of the time signal, the recording of the video signals, the detection of the camera parameters and the detection of the meta data are synchronized in time, so that the assignment of the video signals, the camera parameters and the meta data to the time signal is simplified. For example, camera parameters and meta data can be recorded whenever a single frame of the video signals is recorded. Alternatively or additionally, the video signals of several cameras can be recorded in temporal synchronization. In particular, the time signal can be assigned to the video signals at the same time as the video signals are recorded. The same can apply analogously to the assignment of the time signal to the camera parameters and/or to the meta data.
In a development of the invention, the detection of the time signal, the recording of the video signals and the detection of the camera parameters take place over the entire duration of the event, so that the automatic evaluation can draw on the complete data set generated during the entire event. Preferably, the meta data are additionally recorded over the entire duration of the event.
The various data sources preferably work with a global time source and thus with a global time such as GPS time, NTP (Network Time Protocol) or PTP (Precision Time Protocol), so that the meta data can be centrally linked in time with image, video or audio signals without any processing taking place in an acquisition device. All data are provided with the common global time. The data of all data sources are thus tied to the common global time and are accordingly linked to one another, i.e. locked relative to one another. This ensures the temporal assignment or linkage of all data, namely the meta data and the video and audio signals.
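The "locking" of all data sources to the common global time can be sketched as a merge keyed by time stamp. This is a simplified model, not the claimed device: the tuple layout `(timestamp_ms, source_name, value)` and the function name `lock_streams` are assumptions for illustration.

```python
from collections import defaultdict

def lock_streams(*streams):
    """Merge several time-stamped data streams into one record per time
    value, so that video, camera parameters and meta data are linked
    ('locked') via the common global time.  Each stream yields
    (timestamp_ms, source_name, value) tuples (hypothetical layout)."""
    merged = defaultdict(dict)
    for stream in streams:
        for ts, source, value in stream:
            merged[ts][source] = value
    return dict(sorted(merged.items()))

video = [(0, "video", "frame0"), (20, "video", "frame1")]
meta  = [(0, "volume_db", 62), (20, "volume_db", 88)]
locked = lock_streams(video, meta)
print(locked[20])  # → {'video': 'frame1', 'volume_db': 88}
```

Because every source carries the same global time, no processing inside the acquisition devices is needed to establish the association.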
However, the meta data can also be captured only when a parameter of the meta data falls below and/or exceeds a user-defined limit value. This avoids the accumulation of an excessively large, unused amount of data. For example, a local volume can be recorded only when the sound level is above a user-defined limit value. A high volume can indicate a significant incident, for example a foul or an overtaking manoeuvre.

In a development of the invention, the steps of evaluating and providing the part of the video signals take place during the event, so that the part of the video signals desired by the user can be provided while the event is still in progress, in particular continuously. In one example, there is a user request for the highlights of the event. The continuous evaluation during the event determines at which time signals an increased volume is detected in a fan area, which indicates a significant highlight. By determining the time signals of all highlights, the corresponding parts of the video signals can be made available to the user before the end of the event.
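The threshold-based highlight detection described above can be sketched in a few lines. This is an illustrative simplification, assuming the crowd volume is available as a list of time-stamped samples; the function name and the dB values are hypothetical.

```python
def highlight_times(volume_samples, threshold_db):
    """Return the time stamps at which the crowd volume exceeds a
    user-defined limit value -- these mark candidate highlights.
    `volume_samples` is a list of (timestamp_ms, volume_db) pairs."""
    return [ts for ts, db in volume_samples if db > threshold_db]

samples = [(0, 60), (1000, 95), (2000, 63), (3000, 97)]
print(highlight_times(samples, threshold_db=90))  # → [1000, 3000]
```

The returned time signals then select the corresponding parts of the video signals via the common time stamp.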
Preferably, further meta data are generated when the video signals are evaluated. In this way, object-related or person-related statistics can be created or supplemented. This step can be automated, so that the newly created statistics are again available as meta data when the video signals are evaluated. The video signals, the camera parameters, the meta data and/or the provided part of the video signals can be stored on a data carrier, preferably in the form of a database, so that archiving and/or later evaluation is possible.
In the device according to the invention, the meta sensor can be provided spatially separated from the camera sensor. Furthermore, each camera is preferably assigned a camera sensor, which is in particular integrated with the camera assigned to it. In a development of the invention, at least one camera is arranged on an aerial vehicle, in particular on a drone, so that the camera can be moved quickly and easily.
Further advantages and features of the invention emerge from the claims and from the following description, in which an exemplary embodiment of the invention is explained in detail with reference to the drawing, in which:
Fig. 1 shows the device according to the invention in a schematic representation, and

Fig. 2 shows a flowchart of the method according to the invention.
Fig. 1 shows a schematic sketch of a running track 10, for example a running track 10 for a middle-distance run, in which runners (not shown in Fig. 1) race along the running track 10 while being filmed by two cameras 11, 12 at the edge of the running track 10. By way of example, Fig. 1 shows a first camera 11 and a second camera 12, which are arranged at different positions at the edge of the running track 10. During the run, the first camera 11 records a first video signal 15 and the second camera 12 records a second video signal 16, as outlined in the flowchart of Fig. 2.
Both cameras 11, 12 are each provided with an integrated camera sensor 13, 14, the first camera sensor 13 being connected to the first camera 11 and the second camera sensor 14 to the second camera 12. The first camera sensor 13 detects local camera parameters of the first camera 11 during the run. In the example of Fig. 1, local camera parameters are the geographical position of the camera, its orientation and its camera angle. The geographical position of the camera is measured with a GPS sensor, the orientation with an electronic compass and the camera angle with an electronic gyroscope in combination with a software interface to the camera. The GPS sensor, the electronic compass, the electronic gyroscope and the software interface are integrally formed as the first camera sensor 13, which outputs the detected camera parameters via a further interface. Like the first camera sensor 13, the second camera sensor 14 detects the local camera parameters of the second camera 12 during the run.
For example, in Fig. 1 the first camera 11 has a first camera angle 17, which captures in particular a bend 18 of the track 10 and which is larger than a second camera angle 19 of the second camera 12. The second camera angle 19 of the second camera 12 is aimed at a finish area 20 of the running track 10. In the example shown, the camera angle 17, 19 denotes the geographical area captured by the camera 11, 12. Since the second camera angle 19 is smaller than the first camera angle 17, an enlarged image is obtained in order to be able to judge better which of the runners is the first to cross the finish area 20. The camera angles 17, 19 of the cameras 11, 12 can change over time and are continuously detected as camera parameters by the camera sensors 13, 14 assigned to the respective cameras 11, 12. The runners, not shown in Fig. 1, are each provided with a meta sensor 21 in the form of a GPS sensor in order to record the geographical positions of the runners on the running track 10 at any time during the run. By way of example, a GPS sensor 21 is arranged in the right-hand area of Fig. 1. In addition, a second meta sensor 22 in the form of a microphone is arranged on the left-hand side of Fig. 1 in order to measure the volume of the spectators during the run.
The cameras 11, 12, the camera sensors 13, 14 and the meta sensors 21, 22 are each connected to a data processing device 23; the connections are shown in Fig. 1 as connecting lines 24, but can alternatively be wireless.
The method according to the invention will be explained with reference to Figs. 1 and 2, Fig. 2 showing a schematic flowchart.
During the run, from the start signal onwards, a continuous time signal of a global time system such as GPS time, NTP or PTP is detected in each data acquisition device, such as each camera, i.e. in particular locally; it preferably indicates the time in milliseconds and consequently serves as a uniform time stamp. This method step is designated A in Fig. 2.
In addition, the two cameras 11, 12 of Fig. 1 each record continuous video signals 15, 16 during the run. Each frame of the video signals 15, 16 is automatically assigned the respective time signal. This method step is marked B in the upper left of Fig. 2. Simultaneously with the recording of the video signals 15, 16, the positions, orientations and camera angles of the two cameras 11, 12 in particular are detected as camera parameters by the camera sensors 13, 14, and the corresponding global time signal is likewise automatically assigned to them (step C).
At the same time, the GPS sensor 21 continuously detects the current position of the runner assigned to it, and the microphone 22 detects the current volume of the audience. Both kinds of meta data are automatically assigned the current time signal as they are captured by the meta sensors 21, 22 (step D).
While the event is still in progress, in a next method step E, the video signals, camera data and meta data assigned to the time signal are transmitted to the data processing device 23. In the example shown in Figs. 1 and 2, the runner's coach is interested in the performance during the run and therefore, in a next method step F, makes a user input to the data processing device 23 by requesting those video signals in which this particular runner can be seen. This user input is registered in the data processing device 23, and the recorded video signals 15, 16 are analysed as to whether the runner can be seen in them; see method step G of Fig. 2.
For the first camera 11, this is the case, for example, when the geographical position of the runner, which is continuously detected by the GPS sensor 21, is covered by the first camera angle 17. In this case, the data processing device 23 provides only that part of the first video signal 15 in which the runner can be seen. The automatic evaluation of the second video signals 16 of the second camera 12 proceeds analogously. The evaluation of the video signals 15, 16 takes place during the event and simultaneously for all video signals 15, 16. In a last method step H of Fig. 2, the data processing device 23 provides the user with the desired parts of the video signals 15, 16 in which the runner can be seen.
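The test of whether the runner's position is covered by a camera angle can be sketched as a simple 2-D geometric check. This is a simplified illustration under assumed conventions (planar coordinates in metres, heading and opening angle in degrees); the function and parameter names are not taken from the application.

```python
import math

def in_field_of_view(cam_pos, cam_heading_deg, cam_angle_deg, target_pos):
    """Rough test whether `target_pos` lies inside the angular sector
    covered by a camera at `cam_pos` pointing at `cam_heading_deg`
    with an opening angle of `cam_angle_deg` (all illustrative names)."""
    dx = target_pos[0] - cam_pos[0]
    dy = target_pos[1] - cam_pos[1]
    bearing = math.degrees(math.atan2(dy, dx))
    # smallest signed angle between the bearing to the target and the heading
    diff = (bearing - cam_heading_deg + 180) % 360 - 180
    return abs(diff) <= cam_angle_deg / 2

print(in_field_of_view((0, 0), 0, 60, (10, 2)))  # → True  (about 11° off axis)
print(in_field_of_view((0, 0), 0, 60, (0, 10)))  # → False (90° off axis)
```

Applied per time stamp, such a check yields exactly those time signals for which a given camera's video signal shows the runner.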
In a second example, the user input is a request from a broadcaster for the highlights of the race. This user input is interpreted by the data processing device 23 such that a search is made for those time signals at which the microphone 22 at the edge of the running track 10 detected significantly high volumes as meta data. This indicates a particularly significant incident. After the data processing device 23 has determined the time signals at which high volumes were measured, the first video signals 15 of the first camera 11 assigned to those time signals are determined, since the first camera 11 is arranged closest to the microphone 22. The remaining evaluation and provision of the desired part of the video signals 15, 16 proceed analogously to the previous example. In this way, the highlights of the event are provided to the user.

Claims

Patentansprüche Claims
1. Verfahren zur automatischen Auswertung und Bereitstellung von Video-Signalen (15, 16) eines Ereignisses, mit den folgenden Schritten:
- A: Erfassen mindestens eines Zeit-Signals;
- B: Aufnehmen von Video-Signalen (15, 16) mittels mindestens einer Kamera (11, 12) und automatische Zuordnung des oder der Zeit-Signale zu den Video-Signalen (15, 16);
- C: Erfassen von lokalen Kamera-Parametern (17, 19) am Ort der Kamera (11, 12) und automatische Zuordnung des oder der Zeit-Signale zu den Kamera-Parametern (17, 19), wobei die Kamera-Parameter (17, 19) mindestens ein Parameter der Gruppe sind: Position der Kamera, Beschleunigung der Kamera, Orientierung der Kamera, Kamerawinkel, Magnetfeld, Sichtfeld, Luftdruck, Lautstärke, Helligkeit, Zeit, aktueller Stromverbrauch;
- D: Erfassen von Meta-Daten mittels mindestens eines Meta-Sensors (21, 22) und automatische Zuordnung des oder der Zeit-Signale zu den Meta-Daten (28), wobei die Meta-Daten zumindest ein Parameter der Gruppe sind: Geographische Daten, Objektpositionen, Sendungsdaten, objektspezifische Daten, Statistiken, Datenbanken, lokale Lautstärke, benutzerdefinierte Parameter;
- E: Übermittlung der dem oder den Zeit-Signalen zugeordneten Video-Signale (15, 16), Kamera-Parameter (17, 19) und Meta-Daten zu einer Datenverarbeitungseinrichtung (23);
- F: Automatisches Auswerten der dem oder den Zeit-Signalen zugeordneten Video-Signale (15, 16) in Abhängigkeit von den dem Zeit-Signal zugeordneten Kamera-Parametern (17, 19), von den dem Zeit-Signal zugeordneten Meta-Daten und von einer Benutzereingabe und
- G: Bereitstellung von mindestens einem Teil der Video-Signale (15, 16) in Abhängigkeit von der Auswertung.
1. Method for the automatic evaluation and provision of video signals (15, 16) of an event, comprising the following steps:
- A: capturing at least one time signal;
- B: recording video signals (15, 16) by means of at least one camera (11, 12) and automatically assigning the time signal(s) to the video signals (15, 16);
- C: capturing local camera parameters (17, 19) at the location of the camera (11, 12) and automatically assigning the time signal(s) to the camera parameters (17, 19), the camera parameters (17, 19) being at least one parameter of the group: position of the camera, acceleration of the camera, orientation of the camera, camera angle, magnetic field, field of view, air pressure, volume, brightness, time, current power consumption;
- D: capturing meta-data by means of at least one meta-sensor (21, 22) and automatically assigning the time signal(s) to the meta-data (28), the meta-data being at least one parameter of the group: geographic data, object positions, broadcast data, object-specific data, statistics, databases, local volume, user-defined parameters;
- E: transmitting the video signals (15, 16), camera parameters (17, 19) and meta-data assigned to the time signal(s) to a data processing device (23);
- F: automatically evaluating the video signals (15, 16) assigned to the time signal(s) as a function of the camera parameters (17, 19) assigned to the time signal, of the meta-data assigned to the time signal and of a user input; and
- G: providing at least part of the video signals (15, 16) as a function of the evaluation.
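Read as a data model, steps A–G of claim 1 amount to records that share a common time signal and a selection over them. The following sketch uses hypothetical names and is an illustration only, not part of the claim text:

```python
from dataclasses import dataclass

@dataclass
class TimedRecord:
    timestamp: float     # step A: shared time signal
    frame_id: str        # step B: video-signal sample assigned to the time signal
    camera_params: dict  # step C: e.g. position, orientation, field of view
    meta: dict           # step D: e.g. object positions, local volume

def select_frames(records, predicate):
    """Steps F/G: evaluate the records against a user-derived predicate on the
    meta-data and provide the matching part of the video signals."""
    return [r.frame_id for r in records if predicate(r.meta)]
```

A user input such as "highlights of the race" would be translated into a concrete predicate (e.g. a volume threshold) before the selection runs.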
2. Verfahren nach Anspruch 1, dadurch gekennzeichnet, dass solche Daten, wie Meta-, Video-, Bild- und Audio-Daten, mit einem Zeitsignal eines globalen Zeitsystems, wie GPS-Zeit, NTP oder PTP, versehen werden. 2. Method according to claim 1, characterized in that such data, such as meta-, video-, image- and audio-data, are provided with a time signal of a global time system, such as GPS time, NTP or PTP.
3. Verfahren nach Anspruch 1 oder 2, dadurch gekennzeichnet, dass das Erfassen des Zeit-Signals (25), das Aufnehmen der Video-Signale (26), das Erfassen der Kamera-Parameter (17, 19) (27) und das Erfassen der Meta-Daten (28) zeitlich synchronisiert erfolgen. 3. Method according to claim 1 or 2, characterized in that the capture of the time signal (25), the recording of the video signals (26), the capture of the camera parameters (17, 19) (27) and the capture of the meta-data (28) take place synchronized in time.
4. Verfahren nach einem der Ansprüche 1 bis 3, dadurch gekennzeichnet, dass das Erfassen des Zeit-Signals (A) und das Aufnehmen der Video-Signale (B) und/oder das Erfassen der Kamera-Parameter (17, 19) (C) über die gesamte Dauer des Ereignisses erfolgen. 4. Method according to one of claims 1 to 3, characterized in that the capture of the time signal (A) and the recording of the video signals (B) and/or the capture of the camera parameters (17, 19) (C) take place over the entire duration of the event.
5. Verfahren nach Anspruch 4, dadurch gekennzeichnet, dass das Erfassen der Meta-Daten (D) über die gesamte Dauer des Ereignisses erfolgt. 5. Method according to claim 4, characterized in that the capture of the meta-data (D) takes place over the entire duration of the event.
6. Verfahren nach einem der Ansprüche 1 bis 4, dadurch gekennzeichnet, dass das Erfassen der Meta-Daten (D) nur dann erfolgt, wenn ein Parameter der Meta-Daten einen nutzerdefinierten Grenzwert unterschreitet und/oder übersteigt. 6. Method according to one of claims 1 to 4, characterized in that the capture of the meta-data (D) takes place only when a parameter of the meta-data falls below and/or exceeds a user-defined limit value.
7. Verfahren nach einem der Ansprüche 1 bis 6, dadurch gekennzeichnet, dass die Schritte des Auswertens (F) und der Bereitstellung (G) des Teils der Video-Signale (15, 16) während des Ereignisses erfolgen. 7. The method according to any one of claims 1 to 6, characterized in that the steps of evaluating (F) and providing (G) the part of the video signals (15, 16) take place during the event.
8. Verfahren nach einem der Ansprüche 1 bis 7, dadurch gekennzeichnet, dass bei dem Auswerten (F) der Video-Signale (15, 16) zusätzliche Meta-Daten generiert werden. 8. Method according to one of claims 1 to 7, characterized in that additional meta-data are generated during the evaluation (F) of the video signals (15, 16).
9. Verfahren nach einem der Ansprüche 1 bis 8, dadurch gekennzeichnet, dass die dem Zeit-Signal zugeordneten Video-Signale (15, 16), Kamera-Parameter (17, 19), Meta-Daten und/oder der bereitgestellte Teil der Video-Signale (15, 16) auf einem Datenträger gespeichert werden. 9. Method according to one of claims 1 to 8, characterized in that the video signals (15, 16), camera parameters (17, 19) and meta-data assigned to the time signal, and/or the provided part of the video signals (15, 16), are stored on a data carrier.
10. Vorrichtung, insbesondere zur Durchführung eines Verfahrens nach den Ansprüchen 1 bis 9, mit Datenerfassungsgeräten, wie mindestens einer Kamera (11, 12) zum Aufnehmen von Video-Signalen (15, 16), mindestens einem Kamera-Sensor (13, 14) zum Erfassen von lokalen Kamera-Parametern (17, 19), mindestens einem Meta-Sensor (21, 22) zum Erfassen von Meta-Daten und mit einer Datenverarbeitungseinrichtung (23) zum Empfangen der einem Zeit-Signal zugeordneten Video-Signale (15, 16), Kamera-Parameter (17, 19) und Meta-Daten, zum Auswerten der Video-Signale (15, 16) und zum Bereitstellen von mindestens einem Teil der Video-Signale (15, 16), wobei die Kamera (11, 12), der Kamera-Sensor (13, 14) und der Meta-Sensor (21, 22) mit der Datenverarbeitungseinrichtung (23) verbunden sind und wobei der Kamera-Sensor (13, 14) mit der Kamera (11, 12) verbunden ist. 10. Device, in particular for carrying out a method according to claims 1 to 9, with data acquisition devices, such as at least one camera (11, 12) for recording video signals (15, 16), at least one camera sensor (13, 14) for capturing local camera parameters (17, 19), at least one meta-sensor (21, 22) for capturing meta-data, and with a data processing device (23) for receiving the video signals (15, 16), camera parameters (17, 19) and meta-data assigned to a time signal, for evaluating the video signals (15, 16) and for providing at least part of the video signals (15, 16), wherein the camera (11, 12), the camera sensor (13, 14) and the meta-sensor (21, 22) are connected to the data processing device (23), and wherein the camera sensor (13, 14) is connected to the camera (11, 12).
11. Vorrichtung nach Anspruch 1, dadurch gekennzeichnet, dass die Datenerfassungsgeräte, wie Kameras, Zeiterfassungsmodule zur Erfassung einer einheitlichen globalen Zeit wie GPS-Zeit, NTP oder PTP aufweisen. 11. Device according to claim 1, characterized in that the data acquisition devices, such as cameras, have time-capture modules for capturing a uniform global time such as GPS time, NTP or PTP.
12. Vorrichtung nach Anspruch 10, dadurch gekennzeichnet, dass mindestens eine Kamera (11, 12) an einem Flugkörper angeordnet ist. 12. Device according to claim 10, characterized in that at least one camera (11, 12) is arranged on a flying object (Flugkörper).
EP19821249.0A 2018-12-05 2019-12-04 Method and device for automatically evaluating and providing video signals of an event Withdrawn EP3891998A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102018009571.2A DE102018009571A1 (en) 2018-12-05 2018-12-05 Method and device for the automatic evaluation and provision of video signals of an event
PCT/EP2019/000332 WO2020114623A1 (en) 2018-12-05 2019-12-04 Method and device for automatically evaluating and providing video signals of an event

Publications (1)

Publication Number Publication Date
EP3891998A1 true EP3891998A1 (en) 2021-10-13

Family

ID=68916468

Family Applications (1)

Application Number Title Priority Date Filing Date
EP19821249.0A Withdrawn EP3891998A1 (en) 2018-12-05 2019-12-04 Method and device for automatically evaluating and providing video signals of an event

Country Status (4)

Country Link
US (1) US11689691B2 (en)
EP (1) EP3891998A1 (en)
DE (1) DE102018009571A1 (en)
WO (1) WO2020114623A1 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102019216419B4 (en) 2019-10-24 2024-06-20 Carl Zeiss Industrielle Messtechnik Gmbh Sensor arrangement for detecting workpieces and method for operating such a sensor arrangement
US20220017095A1 (en) * 2020-07-14 2022-01-20 Ford Global Technologies, Llc Vehicle-based data acquisition

Family Cites Families (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3762149B2 (en) * 1998-07-31 2006-04-05 キヤノン株式会社 Camera control system, camera server, camera server control method, camera control method, and computer-readable recording medium
US6748158B1 (en) * 1999-02-01 2004-06-08 Grass Valley (U.S.) Inc. Method for classifying and searching video databases based on 3-D camera motion
GB0029893D0 (en) * 2000-12-07 2001-01-24 Sony Uk Ltd Video information retrieval
US7133070B2 (en) * 2001-09-20 2006-11-07 Eastman Kodak Company System and method for deciding when to correct image-specific defects based on camera, scene, display and demographic data
JP2004194159A (en) * 2002-12-13 2004-07-08 Canon Inc Video communication system
WO2008046243A1 (en) 2006-10-16 2008-04-24 Thomson Licensing Method and device for encoding a data stream, method and device for decoding a data stream, video indexing system and image retrieval system
US20100007730A1 (en) * 2008-07-09 2010-01-14 Lin Meng-Te Surveillance Display Apparatus, Surveillance System, and Control Method Thereof
KR20110132884A (en) 2010-06-03 2011-12-09 한국전자통신연구원 Apparatus for intelligent video information retrieval supporting multi channel video indexing and retrieval, and method thereof
WO2015162548A1 (en) 2014-04-22 2015-10-29 Batchu Krishnaiahsetty Sumana An electronic system and method for marking highlights in a multimedia file and manipulating the multimedia file using the highlights
US10074013B2 (en) * 2014-07-23 2018-09-11 Gopro, Inc. Scene and activity identification in video summary generation
WO2016029170A1 (en) * 2014-08-22 2016-02-25 Cape Productions Inc. Methods and apparatus for automatic editing of video recorded by an unmanned aerial vehicle
US9313556B1 (en) * 2015-09-14 2016-04-12 Logitech Europe S.A. User interface for video summaries
CN108287924A (en) 2018-02-28 2018-07-17 福建师范大学 One kind can the acquisition of positioning video data and organizing search method

Also Published As

Publication number Publication date
WO2020114623A1 (en) 2020-06-11
US11689691B2 (en) 2023-06-27
DE102018009571A1 (en) 2020-06-10
US20220103779A1 (en) 2022-03-31

Similar Documents

Publication Publication Date Title
DE60216693T2 (en) Device for distributing video and device for receiving video
EP1864153B1 (en) Object-tracking and situation-analysis system
DE60213913T2 (en) System and method of content presentation
EP3891998A1 (en) Method and device for automatically evaluating and providing video signals of an event
EP2044573A1 (en) Monitoring camera, method for calibrating the monitoring camera, and use of the monitoring camera
DE102006006667A1 (en) Sports competition e.g. marathon walk, result determining method, involves checking combination of characteristics based on preset criteria using cameras, when frames are plausible based on preset criteria
DE102009020997A1 (en) Method for recording and processing journey data of vehicle, involves determining position and orientation of vehicle by satellite supported positioning system, where stereo camera is installed in vehicle for recording journey images
DE10029463A1 (en) Position and/or movement detection device uses evaluation of signals provided by several transmitters detecting electromagnetic or sonar waves provided by transmitter attached to object
DE102014224120A1 (en) Output audio contributions for a vehicle
DE60123786T2 (en) Method and system for automatic production of video sequences
EP0973445B1 (en) Lameness diagnosis
DE102019203614A1 (en) Apparatus and method for displaying event information detected from video data
DE102008026657A1 (en) Method for imaged representation of three dimensional acoustic objects as measuring object, involves bringing images in relation to acoustic reference image of measuring object immediately or at time point
DE102020213288A1 (en) Display device for a video surveillance system, video surveillance system and method
DE102013103557A1 (en) Media scene rendering system and method and their recording media
DE102017123068A1 (en) System for synchronizing audio or video recordings
DE112019004282T5 (en) Information processing apparatus, information processing method and program
CH708459B1 (en) A method for recording and reproducing the movements of an athlete.
EP3843419B1 (en) Method for controlling a microphone array and device for controlling a microphone array
EP1434184B1 (en) Control of a multicamera system
WO2002030053A1 (en) Method and system for transmitting information between a server and a mobile customer
DE102021110268A1 (en) Method and system for scene-synchronous selection and playback of audio sequences for a motor vehicle
EP3389805A1 (en) Method and system for live determining of a sports device
EP0583441B1 (en) Device for measuring time, especially sporting times
DE102007054088A1 (en) Method and device for image processing, in particular image measurement

Legal Events

Date Code Title Description
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: UNKNOWN

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE

PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20210614

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

DAV Request for validation of the european patent (deleted)
DAX Request for extension of the european patent (deleted)
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

17Q First examination report despatched

Effective date: 20221114

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20230525