WO2022002766A1 - Method, transmitting device and receiving device for streaming events - Google Patents

Method, transmitting device and receiving device for streaming events

Info

Publication number
WO2022002766A1
Authority
WO
WIPO (PCT)
Prior art keywords
audio
video
control data
data
ports
Prior art date
2020-07-03
Application number
PCT/EP2021/067430
Other languages
English (en)
Inventor
Bart Swinnen
Hans VANDERMAESEN
Stijn SONTROP
Walter BREBELS
Stijn GEYSEN
Original Assignee
Luminex Lighting Control Equipment Nv
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
2020-07-03
Filing date
2021-06-24
Publication date
Application filed by Luminex Lighting Control Equipment Nv filed Critical Luminex Lighting Control Equipment Nv
Priority to EP21737383.6A (EP4176586A1)
Publication of WO2022002766A1

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20 Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23 Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/235 Processing of additional data, e.g. scrambling of additional data or processing content descriptors
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/60 Network structure or processes for video distribution between server and client or between remote clients; Control signalling between clients, server and network components; Transmission of management data between server and client, e.g. sending from server to client commands for recording incoming content stream; Communication details between server and client
    • H04N21/63 Control signaling related to video distribution between client, server and network components; Network processes for video distribution between server and clients or between remote clients, e.g. transmitting basic layer and enhancement layers over different transmission paths, setting up a peer-to-peer communication via Internet between remote STB's; Communication protocols; Addressing
    • H04N21/643 Communication protocols
    • H04N21/6437 Real-time Transport Protocol [RTP]
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/80 Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/85 Assembly of content; Generation of multimedia applications
    • H04N21/854 Content authoring
    • H04N21/85406 Content authoring involving a specific file format, e.g. MP4 format
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/80 Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/85 Assembly of content; Generation of multimedia applications
    • H04N21/854 Content authoring
    • H04N21/8547 Content authoring involving timestamps for synchronizing content

Definitions

  • the present invention relates to a method for streaming events.
  • the method relates to streaming live events in sync, including show elements such as lighting fixtures.
  • the present invention also relates to a transmitting device and a receiving device for streaming events.
  • Live streaming audio and video is well known in the art. It has a wide range of uses, including education and entertainment. Streaming live events is also increasingly common. For example, sports and entertainment performances may be streamed and viewed across the world.
  • the present invention and embodiments thereof serve to provide a solution to one or more of above-mentioned disadvantages.
  • the present invention relates to a method according to claim 1.
  • the present invention relates to a sending device according to claim 2.
  • the present invention relates to a receiving device according to claim 3.
  • Figure 1 schematically shows an example of a live event streaming setup according to the present invention.
  • the present invention concerns an audio/video (AV) system capable of casting and receiving control data in sync with audio and video data.
  • AV audio/video
  • a compartment refers to one or more than one compartment.
  • the terms "one or more" or "at least one", such as one or more or at least one member(s) of a group of members, are clear per se; by means of further exemplification, the term encompasses inter alia a reference to any one of said members, or to any two or more of said members, such as, e.g., any >3, >4, >5, >6 or >7 etc. of said members, and up to all said members.
  • protocol stack refers to one or more protocols that work together according to a layered model, such as ArtNet over Ethernet.
  • the concept of 'protocol stack' should by no means be interpreted as necessarily involving more than one protocol; it may also relate to a single protocol, which in certain implementation forms may coincide with the complete protocol stack, such as for example DMX.
  • operation instruction refers to instructions intended for AV devices in an AV network at the time when the AV network is operational, and which relate to audio and/or video and/or lighting and/or theatre-related effects.
  • event data refers to the combination of audio, video and control data, in order to distinguish it from traditional streaming, wherein only audio and video data are transmitted.
  • AV network is understood to be the internal network at an event, through which the event data is configured and transmitted.
  • An AV network thus transmits event data, that is to say audio data, video data and control data. This is markedly different from traditional streaming where only audio and video data is transmitted.
  • the invention relates to a computer-implemented method for streaming an event; said method utilizing a sending system comprising a series of ports, wherein at least one of said ports can be connected to an audio-visual (AV) network including control data, and wherein at least one of said ports can be connected to the internet, the method comprising the steps of: receiving audio data, video data and control data; synchronizing said audio, video and control data; time-sampling said audio, video and control data; and casting said audio, video and control data to the internet.
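  • Purely as an illustration of these four steps, the sketch below shows a minimal sender loop in Python; the names EventSender and StreamPacket and the per-stream delay values are hypothetical and are not taken from the claims.

```python
import time
from dataclasses import dataclass

@dataclass
class StreamPacket:
    kind: str            # "audio", "video" or "control"
    capture_time: float  # local monotonic clock value at capture, in seconds
    payload: bytes

class EventSender:
    """Minimal sketch of the claimed steps: receive, synchronize, time-sample, cast."""

    def __init__(self, per_stream_delay):
        # per_stream_delay maps "audio"/"video"/"control" to an offset in seconds,
        # chosen so that the three streams line up on one common timeline.
        self.per_stream_delay = per_stream_delay

    def synchronize(self, pkt: StreamPacket) -> float:
        # Step 2: place the packet on the shared timeline by adding the
        # stream-specific delay.
        return pkt.capture_time + self.per_stream_delay[pkt.kind]

    def time_sample(self, pkt: StreamPacket, ts: float) -> tuple:
        # Step 3: attach the synchronized timestamp so a receiver can replay
        # audio, video and control data in the original relative order.
        return (ts, pkt.kind, pkt.payload)

    def cast(self, sample: tuple) -> None:
        # Step 4: hand the timestamped sample to the internet-facing transport
        # (RTP over a VPN in the embodiment of Fig. 1); just printed here.
        print("casting %s sample at t=%.3f" % (sample[1], sample[0]))

def stream_event(sender: EventSender, packets) -> None:
    # Step 1: receive audio, video and control data from the AV network ports.
    for pkt in packets:
        ts = sender.synchronize(pkt)
        sender.cast(sender.time_sample(pkt, ts))

if __name__ == "__main__":
    now = time.monotonic()
    demo = [StreamPacket("control", now, b"\x00\xff"),
            StreamPacket("video", now, b"frame"),
            StreamPacket("audio", now, b"pcm")]
    stream_event(EventSender({"audio": 0.0, "video": 0.0, "control": 0.040}), demo)
```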
  • the invention, in a second aspect, relates to a transmitter or sending system, said sending system comprising a series of ports, wherein at least one of said ports can be connected to an audio-visual (AV) network including control data, and wherein at least one of said ports can be connected to the internet, wherein the sending system can be configured to synchronize audio, video and control data and cast said audio, video and control data to the internet.
  • AV audio-visual
  • the invention, in a third aspect, relates to a receiving system or receiver, said receiving system comprising a series of ports, wherein at least one of said ports can be connected to an audio-visual (AV) network including control data, and wherein at least one of said ports can be connected to the internet, wherein the receiving system receives audio, video and control data from the internet and transmits said audio, video and control data to said AV network.
  • AV audio-visual
  • Control data refers to data for all event-related peripherals or show elements. In particular this includes all data required to set up lighting fixtures. Further examples include but are not limited to control of the intensity of a light source, color of a light source or display, position and angle of a light or sound source, choice of an audio or video channel, flow rate of a fog machine or smoke machine, pyrotechnics, actuators and controllers of decors and so forth.
  • This allows the streaming device to not only capture the audio and video of an event, but also include the event peripherals and effects. As a result, the full event experience can be shared and streamed live. An additional advantage is that the full experience, including show elements, can be saved and reused at a later date.
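  • As a non-limiting illustration of what such control data can carry, the hypothetical container below groups the show-element parameters listed above (intensity, color, position, fog rate, pyrotechnic cues); the field names are illustrative only.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class ShowControlMessage:
    """Hypothetical container for the kinds of control data listed above."""
    fixture_id: int                                   # which fixture or peripheral
    intensity: Optional[float] = None                 # dimmer level, 0.0 to 1.0
    color_rgb: Optional[Tuple[int, int, int]] = None  # light or display color
    pan_tilt: Optional[Tuple[float, float]] = None    # position/angle of a source
    fog_rate: Optional[float] = None                  # flow rate of a fog/smoke machine
    pyro_cue: Optional[int] = None                    # pyrotechnics cue number

# Example cue: a red spot at half intensity, plus a burst of fog.
cue = [
    ShowControlMessage(fixture_id=12, intensity=0.5, color_rgb=(255, 0, 0)),
    ShowControlMessage(fixture_id=40, fog_rate=0.8),
]
```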
  • streaming includes both "live streaming" (or "livestreaming") and streaming with a significant delay. It is clear that an event data stream can also be recorded, and thus be played at any later moment and at any number of venues, both sequentially and simultaneously.
  • live streaming refers to streaming event data with minimal delay, preferably a delay smaller than 10 minutes, more preferably a delay smaller than 5 minutes, more preferably a delay smaller than 2 minutes, more preferably a delay smaller than 1 minute, more preferably a delay smaller than 50 seconds, more preferably a delay smaller than 40 seconds, more preferably a delay smaller than 30 seconds, more preferably a delay smaller than 20 seconds, more preferably a delay smaller than 10 seconds, more preferably a delay smaller than 5 seconds, more preferably a delay smaller than 4 seconds, more preferably a delay smaller than 3 seconds, more preferably a delay smaller than 2 seconds, more preferably a delay smaller than 1 second, more preferably a delay smaller than 900 ms, more preferably a delay smaller than 800 ms, more preferably a delay smaller than 600 ms, more preferably a delay smaller than 500 ms, most preferably a delay smaller than 400 ms.
  • the streaming method, sending device and receiving device can be used for streaming, including livestreaming with minimal delay.
  • the casting of said audio, video and control data to the internet is done over a virtual private network (VPN).
  • VPN virtual private network
  • a secure VPN tunnel improves the security of the cast. Additionally, it allows for plug-and-play sending and receiving systems.
  • said audio, video and control data are cast to a cloud service.
  • the cloud is well suited to distribute said streaming data to multiple receivers. This enables the sending system to cast said audio, video and control data to the cloud only once, regardless of the number of receivers. This reduces the bandwidth requirements.
  • the cloud service can optimize the experience for each receiver individually.
  • the streaming data can be adapted for certain receivers to adhere to their bandwidth limitations. Packets can be buffered to ensure a smooth streaming experience. Packets that were lost can be retransmitted. Forward error correction can be used to anticipate lost packets. All of this is possible without increasing the bandwidth or computational resources on the sender side, which drastically improves the scalability of the receiver side.
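  • The text does not specify which forward error correction scheme the cloud service uses; the sketch below shows one generic possibility, a single XOR parity packet per group of packets, which lets a receiver rebuild one lost packet locally instead of requesting a retransmission.

```python
def xor_parity(packets):
    """Build one parity packet over a group of packets (padded to equal length)."""
    size = max(len(p) for p in packets)
    parity = bytearray(size)
    for pkt in packets:
        for i, byte in enumerate(pkt):
            parity[i] ^= byte
    return bytes(parity)

def recover_missing(survivors, parity):
    """Rebuild the single missing packet of a group from its parity packet."""
    rebuilt = bytearray(parity)
    for pkt in survivors:
        for i, byte in enumerate(pkt):
            rebuilt[i] ^= byte
    return bytes(rebuilt)

group = [b"pkt-0", b"pkt-1", b"pkt-2", b"pkt-3"]
parity = xor_parity(group)

# Pretend packet 2 was lost on the way to one receiver: it can be rebuilt
# locally from the parity packet instead of being retransmitted by the sender.
survivors = [group[0], group[1], group[3]]
assert recover_missing(survivors, parity) == b"pkt-2"
```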
  • the sender and receiver devices can be plug-and-play: no additional configuration is needed in the routers and/or firewalls at the receiver location or throughout the internet.
  • the cloud service comprises a central on-boarding or registration system, in which an administrator can control which receivers can receive data from which senders at each point in time.
  • an administrator can create an event at the onboarding service.
  • the administrator can then assign sender system(s) to said event.
  • When assigning multiple sending systems to a single event, these sending systems can advantageously communicate securely. This allows synchronization of the data from each of said sending systems and optimization of the data stream, and thus of the experience.
  • the administrator can further assign receiving system(s) to said event.
  • the receiving system(s) of said event can receive the transmitted data.
  • the sending systems and receiving systems can only send and receive data to and from events they're assigned to.
  • the sending system and receiving system are not authorized to send or receive data from other events.
  • the data which each receiving system receives can be limited by the administrator. This allows selectively casting event data to a wide range of receivers simultaneously. It also allows tailoring the event data to each receiving system. This further improves bandwidth utilization.
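  • A minimal sketch of such an onboarding registry is shown below, assuming a hypothetical OnboardingService class and stream names; it only illustrates the idea that an administrator assigns senders and receivers to an event and limits which streams each receiver may obtain.

```python
from dataclasses import dataclass, field

@dataclass
class Event:
    name: str
    senders: set = field(default_factory=set)      # device ids allowed to send
    receivers: dict = field(default_factory=dict)  # device id -> allowed streams

class OnboardingService:
    """Hypothetical registry: per event, which receivers may receive which
    streams from which senders."""

    def __init__(self):
        self.events = {}

    def create_event(self, name: str) -> Event:
        self.events[name] = Event(name)
        return self.events[name]

    def assign_sender(self, event: str, device_id: str) -> None:
        self.events[event].senders.add(device_id)

    def assign_receiver(self, event: str, device_id: str, streams: set) -> None:
        # 'streams' limits what this receiver gets, e.g. {"audio", "video"} for
        # a home viewer or {"audio", "video", "control"} for a second venue.
        self.events[event].receivers[device_id] = streams

    def may_forward(self, event: str, sender: str, receiver: str, stream: str) -> bool:
        ev = self.events.get(event)
        return (ev is not None
                and sender in ev.senders
                and stream in ev.receivers.get(receiver, set()))

svc = OnboardingService()
svc.create_event("summer-festival")
svc.assign_sender("summer-festival", "sender-01")
svc.assign_receiver("summer-festival", "venue-02", {"audio", "video", "control"})
svc.assign_receiver("summer-festival", "home-77", {"audio", "video"})

assert svc.may_forward("summer-festival", "sender-01", "venue-02", "control")
assert not svc.may_forward("summer-festival", "sender-01", "home-77", "control")
```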
  • a live event with audio, video and show elements including lighting effects, fog generator and pyrotechnics can be transmitted.
  • This live event can be streamed to another live event utilizing a similar setup with lighting fixtures, fog and pyrotechnics.
  • the live event can be cast as a purely audio-visual stream, for example to home computers.
  • the receivers which do not utilize the control data do not need to be sent said control data, thus reducing bandwidth.
  • the administrator can control the onboarding service of an event through a graphical user interface (GUI).
  • GUI graphical user interface
  • said GUI is a web portal.
  • said GUI is an app, preferably a smartphone app.
  • sending and receiving systems can be signed up and/or connected to an event by scanning a QR code. This advantageously allows providing QR codes on each sending and receiving device. This allows for easier configuration of an event and an improved plug-and-play experience.
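  • As an illustration only, the snippet below encodes a hypothetical registration payload into a QR code using the third-party Python qrcode package; the payload layout and registration URL are assumptions, since the text only states that devices can be signed up by scanning a QR code.

```python
import json

import qrcode  # third-party package: pip install "qrcode[pil]"

def device_onboarding_qr(device_id: str, role: str, path: str) -> None:
    """Write a QR code image that an onboarding app could scan to claim a device.

    The payload layout (device id, role, registration URL) is illustrative only.
    """
    payload = json.dumps({
        "device": device_id,
        "role": role,                                    # "sender" or "receiver"
        "register": "https://onboarding.example/claim",  # hypothetical endpoint
    })
    qrcode.make(payload).save(path)

device_onboarding_qr("sender-01", "sender", "sender-01-qr.png")
```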
  • the administrators and technicians can configure, monitor and optimize streams and devices using a monitoring and configuration service.
  • This monitoring and configuration service allows the connection for each sending system and receiving system to be monitored and adjusted.
  • the monitoring and configuration service monitors the connection quality for each sending system and receiving system, preferably including traffic telemetries.
  • this allows configuration of input filtering, codecs, data encryption and delay values on the transmitter or sending system side.
  • this allows configuration of correction values, for example delay correction values, buffering time and adjusted audio, video and control data quality, on the receiving system side.
  • Most preferably, configurations can be set individually for each sending system and receiving system within the monitoring and configuration service.
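  • The records below sketch what such per-device configurations could look like; the field names and default values are illustrative and not prescribed by the text.

```python
from dataclasses import dataclass

@dataclass
class SenderConfig:
    # Knobs the text mentions on the transmitter side (names are illustrative).
    input_filter: str = "all"        # which AV-network traffic to pick up
    video_codec: str = "VP9"
    encrypt: bool = True
    added_delay_ms: int = 40         # delay used to line up the three streams

@dataclass
class ReceiverConfig:
    # Knobs the text mentions on the receiver side (names are illustrative).
    correction_delay_ms: int = 0
    buffer_ms: int = 200
    video_quality: str = "1080p50"
    control_data: bool = True        # receivers without show elements can drop it

# The monitoring and configuration service would hold one record per device:
configs = {
    "sender-01": SenderConfig(),
    "venue-02": ReceiverConfig(buffer_ms=300),
    "home-77": ReceiverConfig(control_data=False, video_quality="720p50"),
}
```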
  • the monitoring and configuration service may be linked to the onboarding service.
  • an administrator of the onboarding service may provide limited access to the monitoring and configuration service, for example to technicians.
  • different event peripherals or show elements at each receiver location can be monitored and configured from a remote location. More preferably, different event peripherals or show elements at each receiver location can be discovered, monitored and configured from a remote location. Discovery, monitoring and configuration can be achieved using industry-standard discovery and configuration protocols such as RDM and RDMNet. A receiver set-up can consequently automatically be patched to match the configuration that is expected from a transmitter perspective. Alternatively, the data sent to said receiver location can automatically be patched to match the configuration that is available at said receiving location.
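  • The toy function below illustrates the automatic patching idea: discovered fixtures (as RDM/RDMNet discovery would report them) are mapped onto the configuration expected by the transmitter, and anything without a counterpart is reported. Real RDM discovery returns far richer device information; the dictionaries here are simplified placeholders.

```python
def auto_patch(expected, discovered):
    """Map the transmitter's expected fixtures onto what was found on site.

    Both arguments map a fixture model name to a list of DMX start addresses;
    RDM/RDMNet discovery would normally supply the 'discovered' side.  Fixtures
    present on both sides are patched in address order; the rest is reported.
    """
    patch, unpatched = {}, []
    for model, wanted_addrs in expected.items():
        found_addrs = sorted(discovered.get(model, []))
        for i, src_addr in enumerate(sorted(wanted_addrs)):
            if i < len(found_addrs):
                patch[(model, src_addr)] = found_addrs[i]  # remap to local address
            else:
                unpatched.append((model, src_addr))
    return {"patch": patch, "unpatched": unpatched}

expected = {"wash-light": [1, 17], "spot": [33]}
discovered = {"wash-light": [101, 117], "strobe": [201]}
print(auto_patch(expected, discovered))
# {'patch': {('wash-light', 1): 101, ('wash-light', 17): 117},
#  'unpatched': [('spot', 33)]}
```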
  • RDM Remote Device Management
  • Data formats which are suitable for said control data include but are not limited to: ArtNet, sACN, KiNet, RTTrPL, MVR, RDM/RDMNet or DMX. These data formats are standard entertainment industry control protocols well known within the art.
  • Protocols which are suitable for discovery of a set-up over a network include but are not limited to: mDNS, LLDP, ArtNet, RDM/RDMNet or NMOS. These are standard discovery protocols well known within the art.
  • Video input formats include but are not limited to NDI, Streaming MPEG, SDI and HDMI.
  • Video transmission formats include but are not limited to SD video, 1080p50/59.94 video, 720p50/59.94 video, UHDp30 video, 1080i50/59.94 video, UHDp60 video and others.
  • Suitable video codecs include H.264, VP8 and VP9.
  • Audio input formats include but are not limited to: AES67, Dante, AVB, analog, and digital AES3.
  • Audio transmission formats include but are not limited to 48 kHz, 96 kHz, 192 kHz, 24-bit and so forth.
  • the sending and the receiving device each have at least one port which can be connected to the internet.
  • This port does not need to be a physical port suited for a wire.
  • Any suitable connection to the internet is sufficient, including wireless ports.
  • ports connecting to Bluetooth, 4G, 5G or satellite internet are also suitable as ports which can be connected to the internet.
  • the connection with the internet is sufficiently fast, for example allowing at least 1 Gigabit/s downlink and uplink. It is clear that for events in remote locations, such as outdoor events including many festivals, sending and receiving devices which do not require an existing wired internet connection, such as fiber, are particularly advantageous.
  • the sending device comprises multiple additional Ethernet ports for multiple-purpose connecting, such as a local Ethernet network, NDI video input or output, network audio and so forth.
  • all internet and Ethernet ports allow at least 1 Gigabit/s downlink and uplink in full duplex.
  • the internet and Ethernet ports are RJ45 ports. More preferably, the Ethernet ports are characterized by compatibility with multiple protocols. This allows for multiple-purpose connecting.
  • the protocol stack includes at least one of the following protocols: Dante, RAVENNA/AES67, EtherSound, Q-LAN, sACN, ArtNet, RTTrPL (BlackTraX), IEEE 802.
  • Dante refers to Digital Audio Network Through Ethernet.
  • Q-LAN refers to an audio-over-IP networking technology.
  • sACN refers to streaming ACN, a DMX-over-Ethernet technology.
  • ArtNet or Art-Net is a communication protocol for sending DMX and RDM over UDP (User Datagram Protocol), generally UDP/IP.
  • RTTrPL refers to real-time tracking protocol.
  • PoE refers to Power over Ethernet.
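  • To make the Art-Net control data concrete, the sketch below builds a minimal ArtDmx packet (per the publicly documented Art-Net layout: 8-byte ID, little-endian opcode, protocol version 14, sequence, physical port, 15-bit universe, big-endian data length) and sends one DMX frame over UDP. The node address used in the example is hypothetical.

```python
import socket
import struct

ARTNET_PORT = 6454   # standard Art-Net UDP port
OP_DMX = 0x5000      # ArtDmx opcode

def art_dmx_packet(universe: int, dmx: bytes, sequence: int = 0) -> bytes:
    """Build a minimal ArtDmx packet carrying one DMX frame."""
    if not 2 <= len(dmx) <= 512 or len(dmx) % 2:
        raise ValueError("DMX payload must be an even length of 2..512 bytes")
    pkt = bytearray()
    pkt += b"Art-Net\x00"                    # 8-byte packet ID
    pkt += struct.pack("<H", OP_DMX)         # opcode, little-endian
    pkt += struct.pack(">H", 14)             # protocol version hi/lo (0, 14)
    pkt += bytes([sequence & 0xFF, 0])       # sequence number, physical port
    pkt += bytes([universe & 0xFF,           # SubUni: low 8 bits of the universe
                  (universe >> 8) & 0x7F])   # Net: high 7 bits of the universe
    pkt += struct.pack(">H", len(dmx))       # DMX data length, big-endian
    pkt += dmx
    return bytes(pkt)

def send_dmx(node_ip: str, universe: int, channels: bytes) -> None:
    # One UDP datagram per DMX frame, unicast to a node (or broadcast to the net).
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(art_dmx_packet(universe, channels), (node_ip, ARTNET_PORT))

# Channel 1 full, channel 2 at half, on universe 0 (full 512-channel frame).
frame = bytes([255, 128]) + bytes(510)
send_dmx("192.168.1.50", 0, frame)  # the node address here is hypothetical
```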
  • the sending device comprises at least one HDMI input port. In another preferred embodiment, the sending device comprises at least one HDMI output port. In the most preferred embodiment, the sending device comprises at least one HDMI input and at least one HDMI output. In a preferred embodiment, the sending device comprises a left and a right (L+R) XLR 3-pin female connector for audio input. In a preferred embodiment, the sending device comprises at least a left and a right (L+R) XLR 3-pin male connector for audio output. In the most preferred embodiment, the sending device comprises at least a left and a right (L+R) XLR 3-pin female connector for audio input and at least a left and a right (L+R) XLR 3-pin male connector for audio output.
  • the receiving device comprises at least one HDMI input port. In another preferred embodiment, the receiving device comprises at least one HDMI output port. In the most preferred embodiment, the receiving device comprises at least one HDMI input and at least one HDMI output. In a preferred embodiment, the receiving device comprises at least one (L+R) audio input connector (XLR 3-pin female). In a preferred embodiment, the receiving device comprises at least one (L+R) audio output connector (XLR 3-pin male). In the most preferred embodiment, the receiving device comprises at least one (L+R) XLR 3-pin female connector and at least one (L+R) XLR 3-pin male connector.
  • a single streaming device can be utilized as both sending and receiving device.
  • the streaming device can then be configured as sending device, that is to say receiving event data from the AV network (including control data), sampling, synchronizing and casting said event data to the cloud service.
  • the streaming device can also be configured as receiving device, that is to say receiving data from the cloud service and transmitting said event data to the respective ports for audio, video and control data.
  • Fig. 1 schematically presents an example of a live event streaming setup according to the present invention.
  • Fig. 1 displays a sending device 1, securely connected to the cloud service 2.
  • the cloud service 2 then distributes the stream to receiving devices 3 and 4.
  • Sending device 1 receives control data (ArtNet) 11 and audio and video data (NDI) 12 live from an event. This data is synchronized by adding a corresponding time delay. The synchronized data is then sampled using an RTP-bin implementation.
  • ArtNet control data
  • NDI audio and video data
  • Sending device 1 forms a secure VPN connection 5 with cloud service 2.
  • Sending device 1 sends event data 121, 122 and 123 through the VPN connection 5 to the cloud service 2.
  • the event data comprises control data 121 (ArtNet, multicast RTP), video data 122 (VP9, multicast RTP) and audio data 123 (Opus, multicast RTP).
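  • For illustration, the snippet below wraps one synchronized sample in a hand-built RTP header (RFC 3550); the figure suggests an existing RTP-bin implementation is used in practice, and the dynamic payload type 96 chosen here is an assumption.

```python
import struct

def rtp_packet(payload: bytes, seq: int, timestamp: int, ssrc: int,
               payload_type: int = 96, marker: bool = False) -> bytes:
    """Wrap one synchronized sample in a minimal RTP header (RFC 3550).

    The shared timestamp is what keeps audio, video and control data in sync:
    all three streams are stamped against the same clock before being cast.
    """
    first = 2 << 6                          # version 2, no padding/extension, CC=0
    second = (int(marker) << 7) | (payload_type & 0x7F)
    header = struct.pack("!BBHII", first, second,
                         seq & 0xFFFF, timestamp & 0xFFFFFFFF, ssrc & 0xFFFFFFFF)
    return header + payload

# Example: one Art-Net frame (control data) stamped on a 90 kHz media clock.
artnet_frame = b"Art-Net\x00..."            # placeholder payload
pkt = rtp_packet(artnet_frame, seq=1, timestamp=90_000, ssrc=0x1234ABCD)
assert len(pkt) == 12 + len(artnet_frame)   # 12-byte fixed RTP header
```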
  • the cloud service 2 sends receiver feedback 124 (unicast RTCP) through the VPN connection 5 to the sending device 1.
  • Cloud service 2 distributes said event data 231, 232, 233, 241, 242, 243 to receivers 3 and 4 through a secure VPN connection 5.
  • the event data comprises control data 231, 241 (ArtNet, multicast RTP), video data 232, 242 (VP9, multicast RTP) and audio data 233, 243 (Opus, multicast RTP).
  • the receiver devices send back receiver feedback 234, 244 (unicast RTCP) through the VPN connection 5 to the cloud service 2.
  • the receiver devices 3, 4 provide a control signal 31, 41 (ArtNet), a video signal 32, 42 (HDMI) and an audio signal 33, 43 (analog).
  • the receiver device can add an additional delay to the respective audio, video and control data streams if needed for synchronization.
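  • A minimal sketch of such a receiver-side delay is shown below: each incoming packet is held in a small playout buffer until its synchronized timestamp plus a configurable extra delay, so audio, video and control data are released in step. Class and parameter names are illustrative.

```python
import heapq
import time

class PlayoutBuffer:
    """Hold each packet until its synchronized timestamp plus an extra delay,
    so audio, video and control data leave the buffer in step."""

    def __init__(self, extra_delay_s: float):
        self.extra_delay_s = extra_delay_s
        self._heap = []                      # (release_time, kind, payload)

    def push(self, kind: str, sync_ts: float, payload: bytes) -> None:
        heapq.heappush(self._heap, (sync_ts + self.extra_delay_s, kind, payload))

    def pop_due(self, now: float) -> list:
        """Return every packet whose release time has passed, oldest first."""
        due = []
        while self._heap and self._heap[0][0] <= now:
            due.append(heapq.heappop(self._heap))
        return due

buf = PlayoutBuffer(extra_delay_s=0.200)     # 200 ms of receiver-side delay
t0 = time.monotonic()
buf.push("control", t0, b"\x00\xff")
buf.push("video", t0, b"frame")
assert buf.pop_due(t0 + 0.100) == []         # still being held back
assert len(buf.pop_due(t0 + 0.250)) == 2     # released together, in sync
```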

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computer Security & Cryptography (AREA)
  • Data Exchanges In Wide-Area Networks (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)

Abstract

The present invention concerns a computer-implemented method for streaming an event; said method utilizing a sending system comprising a series of ports, at least one of said ports being connectable to an audio-visual (AV) network including control data, and at least one of said ports being connectable to the internet, the method comprising the steps of: receiving audio data, video data and control data, synchronizing said audio, video and control data, time-sampling said audio, video and control data, and casting said audio, video and control data to the internet. The invention further concerns a transmitting device and a receiving device for livestreaming an event.
PCT/EP2021/067430 2020-07-03 2021-06-24 Method, transmitting device and receiving device for streaming events WO2022002766A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
EP21737383.6A EP4176586A1 (fr) 2020-07-03 2021-06-24 Procédé, dispositif de transmission et dispositif de réception pour diffusion en continu d'événements

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
BE202005500 2020-07-03
BEBE2020/5500 2020-07-03

Publications (1)

Publication Number Publication Date
WO2022002766A1 true WO2022002766A1 (fr) 2022-01-06

Family

ID=76765133

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2021/067430 WO2022002766A1 (fr) 2020-07-03 2021-06-24 Procédé, dispositif de transmission et dispositif de réception pour diffusion en continu d'événements

Country Status (2)

Country Link
EP (1) EP4176586A1 (fr)
WO (1) WO2022002766A1 (fr)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100050222A1 (en) * 2007-02-02 2010-02-25 Yvon Legallais System and method for transporting interactive marks
US20110063502A1 (en) * 2008-05-19 2011-03-17 Thomson Licensing Device and method for synchronizing an interactive mark to streaming content
US20140003792A1 (en) * 2012-06-29 2014-01-02 Kourosh Soroushian Systems, methods, and media for synchronizing and merging subtitles and media content


Also Published As

Publication number Publication date
EP4176586A1 (fr) 2023-05-10

Similar Documents

Publication Publication Date Title
US10142387B2 (en) Distributed coordination of network elements for packet encapsulation
US9021134B1 (en) Media stream transport conversion within an intermediate network device
JP5808499B2 (ja) Wide area mirroring router
WO2008018065A3 (fr) Videoconferencing over IP networks
CN110768817B (zh) Method and device for upgrading a video networking terminal
US9497103B2 (en) Isochronous local media network for performing discovery
CN103797769A (zh) Flow interceptor based on service-controlled sessions
CN101540652B (zh) Terminal-heterogeneity self-matching transmission method for multi-view video bitstreams
KR20100084659A (ko) Method and system for synchronizing the outputs of terminals
WO2008002785A3 (fr) Systems and methods for configuring a layer 2 switch for multicast filtering
WO2004082221A2 (fr) Application of multicast protocols and virtual private network tunneling techniques to ensure high quality of service for real-time media transport across internet networks
JP2010098761A5 (fr)
US20100161824A1 (en) Method for transmitting of a multi-channel data stream on a multi-transport tunnel, corresponding computer-readable storage means and tunnel end-points
CN109862307B (zh) Method and device for initiating a video conference
CN109618120A (zh) Video conference processing method and device
CN109660816A (zh) Information processing method and device
WO2012068940A1 (fr) Method for monitoring a terminal via an IP network and an MCU
CN110392225B (zh) Control method and video networking video conference system
CN109963108B (zh) Method and device for one-to-many intercom
WO2022002766A1 (fr) Method, transmitting device and receiving device for streaming events
TWI536815B (zh) Method for transmitting and receiving an information signal via a network, transmitter and receiver applying the method, and splitter unit for use within the network
US20140040966A1 (en) Multi-Channel Multi-Stream Video Transmission System
CN109361891B (zh) Data synchronization method and system in a hierarchical conference
WO2015043170A1 (fr) Endpoint information interaction processing method and apparatus, and telepresence endpoint
Masuda et al. Application-network collaborative bandwidth on-demand for uncompressed HDTV transmission in IP-optical networks

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21737383

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 2021737383

Country of ref document: EP

NENP Non-entry into the national phase

Ref country code: DE