EP3984236B1 - Method of providing video content to a production studio by selecting an input video stream from a plurality of wireless video cameras, as well as a corresponding stream controller - Google Patents

Method of providing video content to a production studio by selecting an input video stream from a plurality of wireless video cameras, as well as a corresponding stream controller

Info

Publication number
EP3984236B1
EP3984236B1 (application EP19733419.6A)
Authority
EP
European Patent Office
Prior art keywords
stream controller
cameras
wireless
video
live camera
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
EP19733419.6A
Other languages
English (en)
French (fr)
Other versions
EP3984236A1 (de)
EP3984236C0 (de)
Inventor
Ali El Essaili
Mohamed Ibrahim
Thorsten Lohmar
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Telefonaktiebolaget LM Ericsson AB
Original Assignee
Telefonaktiebolaget LM Ericsson AB
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Telefonaktiebolaget LM Ericsson AB filed Critical Telefonaktiebolaget LM Ericsson AB
Publication of EP3984236A1
Application granted
Publication of EP3984236B1
Publication of EP3984236C0
Legal status: Active
Anticipated expiration


Classifications

    • H04N: Pictorial communication, e.g. television (Section H: Electricity; Class H04: Electric communication technique; H04N 21/00: Selective content distribution, e.g. interactive television or video on demand [VOD])
    • H04N 21/21805: Source of audio or video content, e.g. local disk arrays, enabling multiple viewpoints, e.g. using a plurality of cameras
    • H04N 21/2187: Live feed
    • H04N 21/251: Learning process for intelligent management, e.g. learning user preferences for recommending movies
    • H04N 21/252: Processing of multiple end-users' preferences to derive collaborative data
    • H04N 21/4223: Cameras
    • H04N 21/43637: Adapting the video stream to a specific local network involving a wireless protocol, e.g. Bluetooth, RF or wireless LAN [IEEE 802.11]
    • H04N 21/6543: Transmission by server directed to the client for forcing some client operations, e.g. recording
    • H04N 21/6587: Control parameters, e.g. trick play commands, viewpoint selection

Definitions

  • The present disclosure generally relates to the field of video content broadcasting and more specifically to methods and devices for enabling transmission of video content to a production studio from a plurality of wireless video cameras.
  • Live events are among the major broadcasts in television, TV, schedules nowadays.
  • TV producers invest a lot of effort to produce live shows and bring them to consumers in the highest possible production quality.
  • The basic tools used today for producing live shows and pre-recorded shows are not very different.
  • The producer needs to ensure that the various important events happening in the show are delivered to the consumer in a clear sequence.
  • US 2009/085740 A1 discloses a method for controlling the wireless transmission of video streams from wireless video cameras and for controlling said wireless video cameras using control messages sent to corresponding wireless video camera controllers based on triggers.
  • Figure 1 depicts how the production process of a live event occurs. In this disclosure, it is assumed that an event producer is interested in covering an outdoor event in a stadium.
  • The production team distributes their cameras around the stadium and collects the feeds from the different cameras in an on-site studio.
  • The on-site studio could be a room in the stadium or a van with the necessary equipment. Alternatively, the on-site studio could also be cloud-based.
  • Once the media content is processed, it is sent as a single stream to the production studio. From there, the stream could be distributed using a satellite link or any other available network.
  • The above system assumes that all the cameras are connected via wires to an Outside Broadcast Van, OBV, and that all cameras send high quality video. The high quality video is stored in a storage in the studio, where it is possible to store large amounts of data for later viewing, such as for replays.
  • OBV: Outside Broadcast Van
  • There is provided a method of providing video content to a production studio by selecting an input video stream from a plurality of wireless video cameras over a wireless telecommunication network, wherein each of said wireless video cameras is arranged for providing at least a High Quality, HQ, and a Low Quality, LQ, video output.
  • The method comprises the steps of: selecting, by a stream controller, one out of said plurality of wireless cameras as a live camera, thereby instructing the selected live camera to provide HQ video output such that the video content provided to the production studio is the HQ video output of the selected live camera; estimating, by the stream controller, a likelihood of a further one of said plurality of wireless cameras being a next live camera following the selected live camera; and requesting, by the stream controller, the further one of said plurality of wireless cameras to provide HQ video output based on the estimated likelihood, such that HQ video output is already available should the stream controller decide to switch to that further camera.
  • The wireless telecommunication network in accordance with the present disclosure may be any wireless telecommunication network, such as a 4th Generation or 5th Generation wireless telecommunication network.
  • The presented method is especially useful for 5th Generation wireless telecommunication networks given the bandwidths involved in wirelessly transmitting video content.
  • The radio bandwidth may be limited.
  • A producer is limited to the total amount of radio bandwidth that is available. If the system uses many wireless video cameras, then it might not be possible for each of the wireless video cameras to transmit HQ video content at the same time. This would lead to congestion of the radio bandwidth.
  • The inventors have found that, typically, only a single wireless camera out of the plurality of wireless cameras is used as a live camera. That is, the feed from a single camera is used for broadcasting. In such a way, at least the live camera should provide HQ content, while the remainder of the plurality of wireless cameras may suffice with providing LQ content.
  • The LQ content may then be used by the operator to determine whether to switch from the live camera to a different camera. If this is the case, then the stream controller could request the different camera to provide HQ content.
  • The inventors have found an improved method.
  • One of the disadvantages of switching from one camera to the next camera is related to latency.
  • The stream controller may request the next camera to provide HQ content. Before the HQ content is received at the production studio, a certain latency will be encountered.
  • The present disclosure is directed to a concept for reducing that particular latency.
  • The inventors consider it advantageous that the present technique schedules the transmission of the high-quality stream, allows the network to utilize its resources and sends the stream without exhausting the radio resources or affecting the main video source.
  • A live camera may alternatively be referred to as the on-air camera.
  • The stream controller may assign one camera as being the live camera. This may be based on an instruction received by the stream controller from a production studio, or, for example, based on the position of a camera that ensures good coverage of the event being covered.
  • A live camera provides video content in a High Quality, for example High Definition, HD, Full HD, Ultra High Definition, UHD, 4K UHD, 8K UHD, etc.
  • The exact resolution with which the camera transmits may be determined by the production studio based on the available radio resources and/or a desired resolution of broadcast.
  • The other cameras, i.e. replay or off-line cameras, are instructed by the stream controller to stream at a lower quality.
  • The skilled person understands that the replay cameras only have to stream at a quality lower than that being streamed by the on-air camera. This ensures an optimal usage of the available radio resources.
  • The present disclosure may thus form an exception to the above provided general rule, wherein one or more off-line cameras may still be requested to provide HQ video content when it is determined that it is likely that these cameras may be chosen as the next live camera.
  • The instruction sent by the stream controller may be in a JavaScript Object Notation, JSON, format specifying an identifier for a camera and further specifying start and end times of the requested content. Alternatively, instead of specifying start and end times, a start time and a duration may be specified.
  • The instruction may also comprise an indication of the time at which the requested content should go on air.
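  • Purely as an illustration, the following Python sketch builds such a JSON instruction; the field names (camera_id, quality, start, end, duration_s, on_air_at) and the quality labels are hypothetical choices for this example and are not prescribed by the disclosure.

```python
import json
from datetime import datetime, timedelta, timezone


def build_quality_instruction(camera_id, quality, start, end=None, duration=None, on_air_at=None):
    """Build a JSON instruction asking a camera to stream at a given quality.

    Either an end time or a duration may be given, mirroring the two options
    described above. All field names are illustrative.
    """
    if (end is None) == (duration is None):
        raise ValueError("specify exactly one of 'end' or 'duration'")
    message = {
        "camera_id": camera_id,
        "quality": quality,                                   # e.g. "HQ", "MQ" or "LQ"
        "start": start.isoformat(),
        "end": end.isoformat() if end else None,
        "duration_s": duration.total_seconds() if duration else None,
        "on_air_at": on_air_at.isoformat() if on_air_at else None,
    }
    # Drop the unused optional fields before serialising.
    return json.dumps({k: v for k, v in message.items() if v is not None})


# Example: ask camera 7 to provide HQ content for the next 30 seconds.
now = datetime.now(timezone.utc)
print(build_quality_instruction("cam-07", "HQ", start=now, duration=timedelta(seconds=30)))
```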
  • The stream controller may be distributed over multiple entities.
  • The step of estimating may be performed by a different entity than the step of requesting.
  • In an example, the step of estimating comprises determining, by the stream controller, a distance between the further one of the plurality of wireless cameras and the live camera, and estimating, by the stream controller, the likelihood based on the determined distance.
  • One of the parameters which may be used as an input for determining the likelihood is the distance.
  • The distance between the live camera and the further camera may be used as an input to determine whether it is likely that the further camera is the next live camera.
  • In an example, the likelihood is estimated as being inversely proportional to said determined distance.
  • The presented method may especially be suitable in, for example, football stadiums, Formula 1 races, ice hockey stadiums or the like.
  • The plurality of wireless video cameras are distributed over a particular area. This improves the coverage area of the whole system.
  • The angle of view of each of the cameras may still partly overlap. In any case, it may be possible to use a single camera to get a good picture of the location where the action takes place, i.e. where the ball is located or where a particular car races, or the like.
  • The inventors have found that it is likely that a camera that is physically close to the live camera is to be the next live camera.
  • In a Formula 1 race, for example, subsequently placed wireless video cameras are selected to serve as the live camera to ensure that the car can be tracked over the circuit.
  • The inventors have thus found that the distance between each of the cameras and the live camera may be used as an input to determine the likelihood.
  • Each of the wireless cameras may be equipped with a GPS sensor for determining its GPS location, and its GPS location may be distributed to the stream controller and/or to any of the wireless video cameras.
  • Another option is that, during the installation of the video cameras, installation personnel manually input the location of each of the wireless video cameras into a registry at the stream controller.
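  • A minimal sketch of such a registry at the stream controller is given below, assuming positions are reported in a common event coordinate frame; the class and method names are illustrative only and not taken from the disclosure.

```python
from dataclasses import dataclass


@dataclass
class Position:
    x: float  # metres, in some common event coordinate frame
    y: float
    z: float


class CameraLocationRegistry:
    """Keeps the most recent known position of each wireless camera.

    Positions may come from periodic GPS reports or from manual entry
    during installation, as described above.
    """

    def __init__(self):
        self._positions = {}

    def update(self, camera_id: str, position: Position) -> None:
        self._positions[camera_id] = position

    def get(self, camera_id: str) -> Position:
        return self._positions[camera_id]


# Manual entry during installation:
registry = CameraLocationRegistry()
registry.update("cam-01", Position(0.0, 0.0, 5.0))
registry.update("cam-02", Position(40.0, 0.0, 5.0))
print(registry.get("cam-02"))
```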
  • In a further example, the step of estimating further comprises determining, by the stream controller, a rate of change in an angle of view of the live camera and of the further one of the plurality of wireless cameras, and estimating, by the stream controller, the likelihood based on the determined rate of change in the angle of view.
  • Yet another option that could be used for determining whether it is likely that a particular video camera is the next live camera is related to the rate of change in the angle of view of the live camera and of that particular video camera. When it is likely that the field of view of the particular video camera is going to overlap with the field of view of the live camera, it may be determined that it is likely that that particular video camera is going to be the next live video camera.
  • The likelihood may be estimated proportional to the predicted overlap in angle of view of said live camera and of said further one of said plurality of wireless cameras.
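  • As a rough sketch of this idea, the snippet below linearly extrapolates the pan angles of the live camera and a candidate camera from their reported rates of change and computes the predicted overlap of their angular fields of view; the extrapolation horizon and all parameter names are assumptions made for illustration.

```python
def predicted_overlap_deg(center_live, rate_live, fov_live,
                          center_cand, rate_cand, fov_cand,
                          horizon_s=2.0):
    """Predict the angular overlap (in degrees) between the live camera's field
    of view and a candidate camera's field of view after horizon_s seconds,
    by linearly extrapolating the pan angles from their rates of change."""
    future_live = center_live + rate_live * horizon_s
    future_cand = center_cand + rate_cand * horizon_s
    live_lo, live_hi = future_live - fov_live / 2, future_live + fov_live / 2
    cand_lo, cand_hi = future_cand - fov_cand / 2, future_cand + fov_cand / 2
    # Overlap of the two angular intervals, clamped at zero.
    return max(0.0, min(live_hi, cand_hi) - max(live_lo, cand_lo))


# The likelihood may then be taken as proportional to this predicted overlap.
overlap = predicted_overlap_deg(center_live=10.0, rate_live=15.0, fov_live=60.0,
                                center_cand=70.0, rate_cand=0.0, fov_cand=60.0)
print(overlap)  # degrees of predicted overlap after 2 s
```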
  • In yet a further example, the step of estimating comprises estimating, by the stream controller, a likelihood of each of the plurality of wireless cameras being the next live camera following the selected live camera, and requesting, by the stream controller, each of the plurality of wireless cameras separately to provide HQ video output based on that estimate.
  • In another example, the step of requesting comprises determining, by the stream controller, that the estimated likelihood exceeds a predetermined threshold, and requesting, by the stream controller, any of the plurality of wireless cameras for which the associated estimated likelihood is determined to exceed the predetermined threshold to provide HQ video output.
  • In a second aspect, there is provided a stream controller for operating in a wireless telecommunication network and arranged for controlling the provision of video content from a plurality of wireless video cameras to a production studio over said wireless telecommunication network, wherein each of said wireless video cameras is arranged for providing at least a High Quality, HQ, and a Low Quality, LQ, video output, said stream controller further comprising a processor arranged for performing the selecting, estimating and requesting described above.
  • The stream controller is arranged to provide video content to a production studio by selecting an input video stream from a plurality of wireless video cameras.
  • The stream controller may be located in the Radio Access Network, RAN, in the 5G core network or on an external application server.
  • In an example, the processor is further arranged for determining a distance between the further one of the plurality of wireless cameras and the live camera, and for estimating the likelihood based on the determined distance.
  • The likelihood may be estimated as being inversely proportional to said determined distance.
  • In a further example, the processor is further arranged for said estimating by determining a rate of change in an angle of view of the live camera and of the further one of the plurality of wireless cameras, and estimating the likelihood based on the determined rate of change in the angle of view.
  • The likelihood may be estimated proportional to the predicted overlap in angle of view of said live camera and of said further one of said plurality of wireless cameras.
  • In yet a further example, the processor is further arranged for estimating a likelihood of each of the plurality of wireless cameras being the next live camera and for requesting each of the plurality of wireless cameras separately to provide HQ video output based on the estimated likelihood.
  • In another example, the processor is further arranged for determining that the estimated likelihood exceeds a predetermined threshold and for requesting any of the plurality of wireless cameras for which the associated estimated likelihood is determined to exceed the predetermined threshold to provide HQ video output.
  • In a third aspect, there is provided a computer program product comprising a computer readable medium having instructions stored thereon which, when executed by a stream controller, cause said stream controller to implement a method in accordance with any of the method examples provided above.
  • Fig. 1 schematically illustrates an outdoor production system 1.
  • Production system 1 typically comprises a plurality of audio capture equipment 2 and video capture equipment 3. These provide the generated content to an Outside Broadcast Van, OBV 4.
  • An architecture of an OBV 4 is shown in more detail in Fig. 2.
  • The OBV 4 is responsible for receiving multiple audio/video streams from the plurality of audio and video capture equipment 2, 3 and for selecting one stream from the available plurality of streams, thereby providing a single stream of audio-visual content to the production studio 5.
  • The production studio 5 may add further content such as commentary and/or logos before broadcasting.
  • Fig. 2 schematically illustrates an architecture for an outdoor production system 10 according to the prior art.
  • The system 10 shown in Fig. 2 assumes a wired setup wherein all the cameras 11 - 14 are wired and therefore can transmit HQ video content without constraints on radio resources. The outputs of all cameras 11, 12, 13, 14 are sent in high quality to the on-site studio 4.
  • A production team views all inputs on multiple screens in a multi-viewer 15 and, according to the content on the screens, chooses which camera should be considered the main input to the output stream, which goes to the production studio 21.
  • The producer can choose multiple input streams and mix them into one output using the live mixer 16 or the recording mixer 17.
  • The output stream is afterwards sent for further processing 18, such as adding overlays, and then sent for encoding 19.
  • The encoded stream is sent over the network to the production studio 21.
  • The studio 21 prepares the final stream for distribution by, for example, adding commentary or channel logos.
  • The streams from all cameras 11 - 14 are stored in the storage 19 available at the on-site studio 4 in the highest quality. These streams are analyzed by the production team and used for viewing later on, either after the event or during replays. The production team prepares the replays while the main stream is on air; once a replay is ready, it goes on air and the live event is put on hold.
  • Fig. 3 schematically illustrates the concept of the present disclosure.
  • Each camera transmits a single video feed at a time to reduce the bitrate requirements on the wireless network.
  • One issue to solve is how quickly to switch between an LQ video content stream and an HQ video content stream when a different camera feed goes on air, i.e. goes live.
  • Typical encoders should be able to switch within one frame.
  • Cameras with two encoders are available, so that a camera can generate both an LQ video content feed and an HQ video content feed at the same time.
  • The presented solution therefore addresses the wireless transmission link, i.e. the round-trip time from the instruction from the stream controller to switch from LQ video content to HQ video content until an HQ video content feed is received at the production studio.
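  • To illustrate how the camera side can hide part of this round trip, the sketch below assumes a camera that runs both encoders continuously and merely changes which feed it transmits when instructed; the encoder class and message handling shown here are stand-ins, not an actual camera API from the disclosure.

```python
class StubEncoder:
    """Stand-in for a real video encoder; returns labelled dummy frames."""

    def __init__(self, label):
        self.label = label
        self.count = 0

    def encode_next_frame(self):
        self.count += 1
        return f"{self.label}-frame-{self.count}"


class DualEncoderCamera:
    """Both an HQ and an LQ encoder run continuously; an instruction from the
    stream controller only changes which encoded feed is sent uplink, so the
    switch itself can happen within a frame."""

    def __init__(self, camera_id, encoders):
        self.camera_id = camera_id
        self.encoders = encoders          # e.g. {"HQ": ..., "LQ": ...}
        self.active_quality = "LQ"        # off-line cameras default to low quality

    def handle_instruction(self, instruction):
        # Instruction format mirrors the JSON message sketched earlier.
        if instruction.get("camera_id") != self.camera_id:
            return
        requested = instruction.get("quality", "LQ")
        if requested in self.encoders:
            self.active_quality = requested

    def next_uplink_frame(self):
        # Only the selected quality is transmitted, keeping the radio load low.
        return self.encoders[self.active_quality].encode_next_frame()


cam = DualEncoderCamera("cam-07", {"HQ": StubEncoder("HQ"), "LQ": StubEncoder("LQ")})
print(cam.next_uplink_frame())   # 'LQ-frame-1' before any instruction
cam.handle_instruction({"camera_id": "cam-07", "quality": "HQ"})
print(cam.next_uplink_frame())   # 'HQ-frame-1' after the switch instruction
```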
  • The proposed method aims at predicting that a switch to a different camera will happen.
  • At a typical event where multiple cameras are deployed, e.g. a stadium, a ski race or a Formula 1 race, the cameras are distributed across the event field.
  • Figure 3 depicts 101 a sketch of an event with ten cameras, where one camera 105 is currently on air, i.e. live. The other cameras can be divided into different regions with different likelihoods that a switch to them will happen, as indicated with reference numerals 102, 103 and 104.
  • The production studio may use several pieces of information about the cameras to construct the switching likelihoods.
  • Such information includes static information about the event, e.g. the number of cameras, geometry, camera capabilities and available encoding bitrates, and dynamic information about the event, e.g. camera location.
  • This information may be exchanged between the cameras and the production studio, in such a way that 1) static information is exchanged between the cameras and the production studio, 2) uplink video streaming between the cameras and the production studio takes place, 3) capabilities and location information updates take place, and 4) instructions to switch between different qualities are given by the production studio to the cameras.
  • The production studio creates a mapping which associates the different qualities with the likelihood that a switch will occur.
  • Other mappings can apply, and the number of likelihoods and qualities can be determined based on the exchanged camera capabilities and the geometry of the event.
  • The switching likelihoods may be updated dynamically based on the updated location information and other event-related parameters.
  • The likelihood can be determined as a function of the distance.
  • Let $P_0(x_0, y_0, z_0)$ be the current position of the on-air camera and $P_i(x_i, y_i, z_i)$ the position of camera $i$.
  • Let $d_i(P_i, P_0)$ be the distance between camera $i$ and the on-air camera.
  • The likelihood can then be expressed as inversely proportional to the distance: $L_i \propto \frac{1}{d_i}$.
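  • As a minimal sketch of this inverse-distance relation, the following Python snippet computes normalised switching likelihoods from camera positions; the use of the Euclidean distance and the small epsilon guard against a zero distance are implementation choices made here, not requirements of the disclosure.

```python
import math


def switching_likelihoods(on_air_pos, camera_positions, eps=1e-6):
    """Estimate, for each off-line camera, a likelihood proportional to the
    inverse of its Euclidean distance to the on-air camera (L_i ~ 1/d_i),
    normalised so that the likelihoods sum to one."""
    raw = {}
    for cam_id, (x, y, z) in camera_positions.items():
        d = math.dist(on_air_pos, (x, y, z))
        raw[cam_id] = 1.0 / max(d, eps)
    total = sum(raw.values())
    return {cam_id: value / total for cam_id, value in raw.items()}


# Example with three candidate cameras around an on-air camera at the origin.
likelihoods = switching_likelihoods(
    on_air_pos=(0.0, 0.0, 5.0),
    camera_positions={"cam-02": (20.0, 0.0, 5.0),
                      "cam-03": (40.0, 10.0, 5.0),
                      "cam-04": (80.0, 30.0, 5.0)},
)
print(likelihoods)
```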
  • In Fig. 3, different regions are indicated with reference numerals 102, 103 and 104.
  • The cameras present in the region having reference numeral 102 are likely to be the next live camera. As such, these cameras are requested to already start providing the HQ video content stream. It is less likely that the cameras located in the region with reference numeral 103 will be the next live camera. As such, the cameras located in this region 103 may be requested to start providing a Medium Quality, MQ, video content stream. Finally, it is unlikely that the cameras located in the region having reference numeral 104 will be the next live camera, such that it may be acceptable that these cameras provide only an LQ video content stream.
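  • As an illustration of this mapping, the short sketch below assigns a requested quality tier based on the estimated likelihood; the threshold values are assumptions and would in practice be derived from the exchanged camera capabilities and the geometry of the event.

```python
def quality_for_likelihood(likelihood, hq_threshold=0.5, mq_threshold=0.2):
    """Map an estimated switching likelihood onto a requested stream quality,
    mirroring the HQ / MQ / LQ regions 102, 103 and 104 of Fig. 3."""
    if likelihood >= hq_threshold:
        return "HQ"   # likely next live camera: already provide high quality
    if likelihood >= mq_threshold:
        return "MQ"   # possible next live camera: medium quality suffices
    return "LQ"       # unlikely next live camera: low quality is acceptable


requested = {cam: quality_for_likelihood(p)
             for cam, p in {"cam-02": 0.62, "cam-03": 0.27, "cam-04": 0.11}.items()}
print(requested)  # {'cam-02': 'HQ', 'cam-03': 'MQ', 'cam-04': 'LQ'}
```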
  • Figures 4 and 5 schematically illustrate an example 201, 301 in which the field of view is used as an input parameter for determining the likelihood that a camera is to be the next live camera.
  • Each camera has a preferred angle of view, as shown in Figures 4 and 5, where cam2 202 covers the objects within its angle of view and cam1 203 covers the objects in the area within its own angle of view.
  • Cam2 202 is streaming a high quality, HQ, stream to the studio, while it is also generating a medium quality, MQ, video that it does not transmit.
  • The camera movement, following the target 204, is recorded via an accelerometer embedded in the camera; this data can be translated into a value representing the change in the camera's angle of view.
  • The accelerometer data is sent periodically to the studio, and the studio can compare this value over time to understand the movement of the target.
  • Cam1 203 then starts preparing the HQ stream and expects to be on air soon. It is further possible that the network is notified about this possible change and prepares the necessary resources for cam1 203.
  • Cam1 203 may then start sending the HQ stream over the telecommunication network, for example including the radio and core network, to the production site, while cam2 202 switches from HQ to MQ video content.
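  • Purely as an illustration of this controller-side sequence, the sketch below reacts to a growing predicted angle-of-view overlap by requesting HQ from cam1 and, once the switch is decided, demoting cam2 to MQ; the messaging function, the state dictionary and the overlap threshold are placeholders assumed for this example.

```python
def handle_overlap_update(controller_state, send_instruction, cam_live, cam_candidate,
                          predicted_overlap, overlap_threshold=30.0):
    """Illustrative controller reaction once the predicted angle-of-view overlap
    between the candidate camera and the live camera becomes large enough:
    the candidate is asked for HQ, and when the switch is decided the old
    live camera is demoted to MQ."""
    if predicted_overlap >= overlap_threshold and not controller_state.get("hq_requested"):
        send_instruction({"camera_id": cam_candidate, "quality": "HQ"})
        controller_state["hq_requested"] = True      # HQ is now being prepared

    if controller_state.get("switch_decided"):
        controller_state["live_camera"] = cam_candidate
        send_instruction({"camera_id": cam_live, "quality": "MQ"})


# Example usage with a print-based stand-in for the real signalling path.
state = {"live_camera": "cam2", "switch_decided": True}
handle_overlap_update(state, print, "cam2", "cam1", predicted_overlap=42.0)
```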
  • Fig. 6 schematically illustrates a flow chart 401 of a method in accordance with the present disclosure.
  • The wireless camera may comprise other components, the most obvious being video capture equipment. This is neither illustrated nor further described in the figures as it is considered to be well known.
  • The camera comprises a transmitter arranged for wirelessly transmitting the captured video content.
  • A single processor or other unit may fulfil the functions of several items recited in the claims.
  • The mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used to advantage.
  • A computer program may be stored/distributed on a suitable medium, such as an optical storage medium or a solid state medium supplied together with or as part of other hardware, but may also be distributed in other forms, such as via the internet or wired or wireless telecommunication systems. Any reference signs in the claims should not be construed as limiting the scope thereof.


Claims (15)

  1. A method of providing video content to a production studio (5) by selecting an input video stream from a plurality of wireless video cameras (3) over a wireless telecommunication network, wherein each of said wireless video cameras is arranged for providing at least a High Quality, HQ, and a Low Quality, LQ, video output, said method comprising the steps of:
    - selecting, by a stream controller, one out of said plurality of wireless cameras as a live camera, thereby instructing said selected live camera to provide HQ video output such that video content provided to said production studio is said HQ video output of said selected live camera; characterized by
    - estimating, by said stream controller, a likelihood of a further one of said plurality of wireless cameras being a next live camera following said selected live camera;
    - requesting, by said stream controller, said further one of said plurality of wireless cameras to provide HQ video output based on said estimated likelihood such that HQ video output is already available should said stream controller decide to switch to said further one of said plurality of wireless cameras.
  2. The method in accordance with claim 1, wherein said step of estimating comprises:
    - determining, by said stream controller, a distance between said further one of said plurality of wireless cameras and said live camera, and
    - estimating, by said stream controller, said likelihood based on said determined distance.
  3. The method in accordance with claim 2, wherein said likelihood is estimated as being inversely proportional to said determined distance.
  4. The method in accordance with any of the previous claims, wherein said step of estimating further comprises:
    - determining, by said stream controller, a rate of change in an angle of view of said live camera and of said further one of said plurality of wireless cameras;
    - estimating, by said stream controller, said likelihood based on said determined rate of change in said angle of view.
  5. The method in accordance with claim 4, wherein said likelihood is estimated proportional to a predicted overlap in angle of view of said live camera and of said further one of said plurality of wireless cameras.
  6. The method in accordance with any of the previous claims, wherein said step of estimating further comprises:
    - estimating, by said stream controller, a likelihood of each of said plurality of wireless cameras being a next live camera following said selected live camera;
    - requesting, by said stream controller, each of said plurality of wireless cameras separately to provide HQ video output based on said estimate such that HQ video output is made available should said stream controller decide to switch to said further one of said plurality of wireless cameras.
  7. The method in accordance with any of the previous claims, wherein said step of requesting comprises:
    - determining, by said stream controller, that said estimated likelihood exceeds a predetermined threshold, and
    wherein said step of requesting comprises:
    - requesting, by said stream controller, any of said plurality of wireless cameras, for which it is determined that the associated estimated likelihood exceeds said predetermined threshold, to provide HQ video output.
  8. A stream controller for operating in a wireless telecommunication network and arranged for controlling the provision of video content from a plurality of wireless video cameras (3) to a production studio over said wireless telecommunication network, wherein each of said wireless video cameras is arranged for providing at least a High Quality, HQ, and a Low Quality, LQ, video output, said stream controller further comprising a network interface and a processor arranged for:
    - selecting one out of said plurality of wireless cameras as a live camera, thereby instructing said selected live camera to provide HQ video output such that video content provided to said production studio is said HQ video output of said selected live camera; characterized by
    - estimating a likelihood of a further one of said plurality of wireless cameras being a next live camera following said selected live camera;
    - requesting said further one of said plurality of wireless cameras to provide HQ video output based on said estimated likelihood such that HQ video output is already available should said stream controller decide to switch to said further one of said plurality of wireless cameras.
  9. The stream controller in accordance with claim 8, wherein said processor is further arranged for:
    - determining a distance between said further one of said plurality of wireless cameras and said live camera, and
    - estimating said likelihood based on said determined distance.
  10. The stream controller in accordance with any of claims 8-9, wherein said likelihood is estimated as being inversely proportional to said determined distance.
  11. The stream controller in accordance with any of claims 8-10, wherein said processor is further arranged for said estimating by:
    - determining a rate of change in an angle of view of said live camera and of said further one of said plurality of wireless cameras;
    - estimating said likelihood based on said determined rate of change in said angle of view.
  12. The stream controller in accordance with claim 11, wherein said likelihood is estimated proportional to a predicted overlap in angle of view of said live camera and of said further one of said plurality of wireless cameras.
  13. The stream controller in accordance with any of claims 8-12, wherein said processor is further arranged for said estimating by:
    - estimating a likelihood of each of said plurality of wireless cameras being a next live camera following said selected live camera;
    - requesting each of said plurality of wireless cameras separately to provide HQ video output based on said estimated likelihood such that HQ video output is already available should said stream controller decide to switch to said further one of said plurality of wireless cameras.
  14. The stream controller in accordance with any of claims 8-13, wherein said processor is further arranged for:
    - determining that said estimated likelihood exceeds a predetermined threshold, and
    - requesting, by said stream controller, any of said plurality of wireless cameras, for which it is determined that the associated estimated likelihood exceeds said predetermined threshold, to provide HQ video output.
  15. A computer program product comprising a computer readable medium having instructions stored thereon which, when executed by a stream controller in accordance with claim 8, cause said stream controller to implement a method in accordance with any of claims 1-7.
EP19733419.6A 2019-06-14 2019-06-14 Method of providing video content to a production studio by selecting an input video stream from a plurality of wireless video cameras, as well as a corresponding stream controller Active EP3984236B1 (de)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/EP2019/065719 WO2020249233A1 (en) 2019-06-14 2019-06-14 A method of providing video content to a production studio by selecting an input video stream from a plurality of wireless video cameras, as well as a corresponding stream controller

Publications (3)

Publication Number | Publication Date
EP3984236A1 (de) | 2022-04-20
EP3984236B1 (de) | 2024-04-10
EP3984236C0 (de) | 2024-04-10

Family

ID=67060372

Family Applications (1)

Application Number Title Priority Date Filing Date
EP19733419.6A Active EP3984236B1 (de) 2019-06-14 Method of providing video content to a production studio by selecting an input video stream from a plurality of wireless video cameras, as well as a corresponding stream controller

Country Status (5)

Country Link
US (1) US12010354B2 (de)
EP (1) EP3984236B1 (de)
CN (1) CN114073096A (de)
MX (1) MX2021015071A (de)
WO (1) WO2020249233A1 (de)

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2378339A (en) * 2001-07-31 2003-02-05 Hewlett Packard Co Predictive control of multiple image capture devices.
US8233042B2 (en) * 2005-10-31 2012-07-31 The Invention Science Fund I, Llc Preservation and/or degradation of a video/audio data stream
US8199196B2 (en) * 2007-09-27 2012-06-12 Alcatel Lucent Method and apparatus for controlling video streams
US8693848B1 (en) * 2012-11-29 2014-04-08 Kangaroo Media Inc. Mobile device with smart buffering
US10205889B2 (en) * 2013-03-08 2019-02-12 Digitarena Sa Method of replacing objects in a video stream and computer program
WO2014145925A1 (en) 2013-03-15 2014-09-18 Moontunes, Inc. Systems and methods for controlling cameras at live events
EP3133819A1 (de) * 2014-04-14 2017-02-22 Panasonic Intellectual Property Management Co., Ltd. Bildlieferungsverfahren, bildempfangsverfahren, server, endgerätevorrichtung und bildlieferungssystem
CN105828206A (zh) * 2016-03-22 2016-08-03 乐视网信息技术(北京)股份有限公司 多路视频点播方法和装置
US10834305B2 (en) * 2016-04-11 2020-11-10 Spiideo Ab System and method for providing virtual pan-tilt-zoom, PTZ, video functionality to a plurality of users over a data network
CN107241611B (zh) 2017-05-27 2019-09-24 蜜蜂四叶草动漫制作(北京)有限公司 一种直播联动装置及直播联动系统

Also Published As

Publication number Publication date
US12010354B2 (en) 2024-06-11
MX2021015071A (es) 2022-01-18
WO2020249233A1 (en) 2020-12-17
CN114073096A (zh) 2022-02-18
EP3984236A1 (de) 2022-04-20
EP3984236C0 (de) 2024-04-10
US20220264157A1 (en) 2022-08-18


Legal Events

Date Code Title Description
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: UNKNOWN

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE

PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20211217

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

DAV Request for validation of the european patent (deleted)
DAX Request for extension of the european patent (deleted)
GRAP Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOSNIGR1

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: GRANT OF PATENT IS INTENDED

INTG Intention to grant announced

Effective date: 20240104

GRAS Grant fee paid

Free format text: ORIGINAL CODE: EPIDOSNIGR3

GRAA (expected) grant

Free format text: ORIGINAL CODE: 0009210

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE PATENT HAS BEEN GRANTED

AK Designated contracting states

Kind code of ref document: B1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

REG Reference to a national code

Ref country code: GB

Ref legal event code: FG4D

REG Reference to a national code

Ref country code: CH

Ref legal event code: EP

REG Reference to a national code

Ref country code: DE

Ref legal event code: R096

Ref document number: 602019049991

Country of ref document: DE

REG Reference to a national code

Ref country code: IE

Ref legal event code: FG4D

U01 Request for unitary effect filed

Effective date: 20240410

U07 Unitary effect registered

Designated state(s): AT BE BG DE DK EE FI FR IT LT LU LV MT NL PT SE SI

Effective date: 20240418

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: GB

Payment date: 20240627

Year of fee payment: 6