EP3507987A1 - Method for transmitting real-time-based digital video signals in networks (Verfahren zur Übertragung von echtzeitbasierten digitalen Videosignalen in Netzwerken) - Google Patents
Method for transmitting real-time-based digital video signals in networks
- Publication number
- EP3507987A1 (application EP17761866.7A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- video
- format
- packet
- data
- packets
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/20—Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
- H04N21/23—Processing of content or additional data; Elementary server operations; Server middleware
- H04N21/236—Assembling of a multiplex stream, e.g. transport stream, by combining a video stream with other content or additional data, e.g. inserting a URL [Uniform Resource Locator] into a video stream, multiplexing software data into a video stream; Remultiplexing of multiplex streams; Insertion of stuffing bits into the multiplex stream, e.g. to obtain a constant bit-rate; Assembling of a packetised elementary stream
- H04N21/23605—Creation or processing of packetized elementary streams [PES]
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/20—Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
- H04N21/21—Server components or server architectures
- H04N21/218—Source of audio or video content, e.g. local disk arrays
- H04N21/2187—Live feed
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/20—Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
- H04N21/23—Processing of content or additional data; Elementary server operations; Server middleware
- H04N21/234—Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs
- H04N21/2343—Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs involving reformatting operations of video signals for distribution or compliance with end-user requests or end-user device requirements
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/438—Interfacing the downstream path of the transmission network originating from a server, e.g. retrieving encoded video stream packets from an IP network
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/80—Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
- H04N21/83—Generation or processing of protective or descriptive data associated with content; Content structuring
- H04N21/845—Structuring of content, e.g. decomposing content into time segments
- H04N21/8456—Structuring of content, e.g. decomposing content into time segments by decomposing the content in the time domain, e.g. in time segments
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/80—Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
- H04N21/85—Assembly of content; Generation of multimedia applications
- H04N21/854—Content authoring
- H04N21/85406—Content authoring involving a specific file format, e.g. MP4 format
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/80—Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
- H04N21/85—Assembly of content; Generation of multimedia applications
- H04N21/854—Content authoring
- H04N21/8547—Content authoring involving timestamps for synchronizing content
Definitions
- the invention relates to the transmission of digital video and audio signals from a video signal source via a server to a display device, on which the data provided by the video signal source are displayed.
- The transmission of real-time video data, also called live streaming, generally takes place in networks via several intermediate stations: from the camera, via a processing or coding unit at the camera, to a server which forwards the data to the receiver or playback device.
- The distances in the networks can lead to delays in the transmission. Additional delays arise from processing steps such as compression and buffering, described below.
- The quality of the transmission results on the one hand from the image quality itself, but also from the smoothness of the reproduction.
- The image quality is related to the available bandwidth in the network, the smoothness to the frame rate measured in frames per second or Hz. It is not possible with previous methods to transmit video data in all generally available networks without loss of quality.
- The video data must be compressed (encoded) to match the available bandwidth.
- A group of frames starting with a key frame is referred to as a "group of pictures" (GOP); the GOP length often ranges from 1 to 10 seconds or more. The longer the GOP, the better the average video quality achieved, but the longer the buffering that becomes necessary.
- Buffer memories are used, which are intended to compensate for these fluctuations over certain time periods.
- Long intervals between "key frames" (I-frames) are selected during compression, which are also in the range of 2 to 10 seconds.
- Because of these buffer times, latencies of more than 30 seconds often occur.
- A method of constructing a multimedia streaming file format is known, as well as a device for multimedia streaming using this format. It generates fragments of data; in each fragment, a plurality of multimedia data boxes are arranged one behind the other, and at the end of the fragment a box is arranged which contains metadata relating to the multimedia data (US 2011/0276662 A1).
- Packing / multiplexing: the signal output by the video signal source as a stream is divided into packets.
- a packet may contain video data, audio data, or both (multiplexing).
- "Packet" in this context refers to the aggregation (multiplexing) of video and optionally audio data in the output format, not to the packets of the network transport, which takes place subsequently on a lower system level.
- the consideration of the network layer is not the subject of this device.
- Streaming usually differs in concept from video communication applications in the number of viewers; streaming should be possible regardless of the number of viewers.
- Sources are usually live cameras, either as a separate device or built into a device (mobile device, laptop, surveillance camera, action cam, stationary or attached to mobile devices, etc.).
- the source can also be an artificially generated signal that does not have to come from a camera, e.g. for presentations or games.
- RTMP: Real-Time Messaging Protocol
- Flash plugin: additional software module
- HLS was invented by Apple and is based on a buffered transmission of portions of the livestream.
- DASH is an ISO standard based on the same principle. The pieces of the livestream require a certain minimum length.
- HLS and DASH are part of the "HTML5" standard used by most browsers.
- With HTML5 it is possible to embed video data directly in a web page and present it in multimedia environments. HTML5 only allows certain video formats and protocols for embedding video data.
- The state of the art in HTML5 is the ISO MPEG-4 video compression method ITU-T H.264 with the MP4 file format, and the proprietary, less widely used VP8 or VP9 with the WebM file format.
- the file-based formats like mp4 are not designed for real-time playback.
- For real-time communication, the web technology WebRTC is used. Real-time communication, in contrast to streaming, is not standardized in terms of international standards and is available only on a few devices, so that end devices such as TVs and mobile devices do not have a uniform interface for it.
- Video communication applications are designed for the transmission of point-to-point connections similar to telephony (one-to-one).
- the video communication and chat protocols are not compatible with streaming standards (HTML5, HLS, DASH).
- an HTML 5 video element is capable of playing either a complete file or a video fragment or segment, which may be provided in the form of a file or as part of a data stream.
- DASH and HLS segments are used, which in turn are divided into fragments.
- the state of the art is the ISO MP4 file format in the variant fMP4.
- A segment length corresponds to at least one GOP length, i.e. 1 to 10 seconds per segment.
- The additionally introduced latency is the length of one segment.
- a segment may contain one or more complete GOPs. The minimum latency thus corresponds to the GOP length. Multiple buffering creates a threefold latency in existing devices.
- the fragmentation can be carried out, for example, on the basis of the ISO standard "fMP4".
- An fMP4 format packet is synonymous with an MP4 fragment.
- the temporal size of a fragment corresponds to several video images in the prior art. According to the prior art, a fragment contains at least the number of video pictures of a GOP length.
- The fragments consist of boxes with different type designations ("atoms").
- The packet fragments are divided into headers and payloads, so that there is a header between the individual payloads of the packet fragments.
- the transmission usually takes place via the IP protocol TCP, so that a disturbance on the transmission link at this protocol level is excluded. If the connection is lost, it is necessary and possible to re-connect to the live stream to continue a real-time transmission.
- The coding unit, the server and the playback device all have buffer memories.
- each packet is provided with a time stamp.
- Timestamps are a common means of synchronizing A/V packets. For each recording instant of a live source there is a timestamp that can be synchronized with, for example, real time. The playback side can then determine how late or early the packet is relative to real time and to other packets.
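As a minimal illustration of this timestamp comparison (a hedged sketch; the millisecond unit, the wall-clock synchronization and the function name are assumptions, not taken from the patent):

```python
import time

def packet_lateness_ms(packet_timestamp_ms: int) -> int:
    """Return how many milliseconds the packet lags behind real time.

    Assumes the recording side stamped the packet with a wall-clock timestamp
    in milliseconds that is synchronized with the player's clock.
    """
    now_ms = int(time.time() * 1000)
    return now_ms - packet_timestamp_ms

# Example: a packet stamped 120 ms ago yields 120, i.e. it is 120 ms "late".
```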
- A data stream according to the fMP4 format consists of introductory data structures "ftyp" and "moov", followed in this example by 5 packet fragments.
- Each packet fragment consists of 2 parts, namely a "moof" part whose information includes the number of video and audio frames in the packet, the timing or duration of the video and audio frames, the byte size of the video and audio frames, and the byte position of the video and audio frames.
- This "moof" atom is then followed by an "mdat" atom, which contains the actual video and audio data.
- the individual parts of this exemplary stream are immediately adjacent to each other.
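To make this box layout concrete, the following sketch (an illustration under stated assumptions, not the patent's implementation) walks a byte buffer and lists the top-level box types; it relies only on the standard ISO-BMFF box header of a 32-bit big-endian size followed by a four-character type:

```python
import struct

def list_top_level_boxes(data: bytes):
    """Yield (box_type, size) for each top-level ISO-BMFF box in the buffer."""
    offset = 0
    while offset + 8 <= len(data):
        size, = struct.unpack_from(">I", data, offset)        # 32-bit big-endian size
        box_type = data[offset + 4:offset + 8].decode("ascii", "replace")
        if size < 8:   # size 0 (to end of file) or 1 (64-bit size) not handled in this sketch
            break
        yield box_type, size
        offset += size

# For the stream of Figure 2 one would expect:
# ftyp, moov, then five moof/mdat pairs, all immediately adjacent.
```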
- the HLS format can also be used instead of fMP4.
- The HLS format consists of 2 parts: several segments in the TS format (ISO MPEG transport stream), each comprising at least one GOP length and playable independently of each other, and the index data (playlist) in the m3u8 format, which point to the segments.
- 3 segments per index are used, which shift in time during the transmission.
- A minimum latency of 3 × 10 = 30 seconds results.
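A hedged sketch of such a sliding three-segment index (segment names and the 10-second duration are illustrative); the playlist always references the three most recent TS segments, which is why the 30-second minimum latency arises:

```python
def build_live_playlist(first_seq: int, segment_duration_s: int = 10) -> str:
    """Return a minimal m3u8 live playlist referencing 3 consecutive TS segments."""
    lines = [
        "#EXTM3U",
        "#EXT-X-VERSION:3",
        f"#EXT-X-TARGETDURATION:{segment_duration_s}",
        f"#EXT-X-MEDIA-SEQUENCE:{first_seq}",
    ]
    for seq in range(first_seq, first_seq + 3):
        lines.append(f"#EXTINF:{segment_duration_s}.0,")
        lines.append(f"segment{seq}.ts")   # illustrative segment URI
    return "\n".join(lines) + "\n"

# Minimum latency with this configuration: 3 segments x 10 s = 30 s.
```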
- the playback device in the prior art includes a dedicated buffer which generates additional latency.
- the buffer is set automatically in the playback device.
- the automatic setting usually takes place on the basis of the set playing time of the data stream, which corresponds at least to the segment length.
- a camera has a frame rate of 25 frames per second.
- An image corresponds to a duration of 40 ms.
- the images generated by the signal source can in practice in different
- The invention is based on the object of providing a method for transmitting real-time-based digital video signals in networks which can also find application where a quick response on the part of the receiver matters, for example in videoconferencing, auctions or interactive involvement of the audience.
- the invention proposes a method with the features mentioned in claim 1. Further developments of the invention are the subject of dependent claims.
- the signal output by the video signal source in a stream is thus fragmented into packets, with a packet fragment corresponding to at least one video picture with associated audio information.
- Using just one video frame allows playback with the least possible delay between video capture and playback.
- The delay is still significantly less than in the prior art as long as the number of frames contained in a packet fragment remains below the number of frames per GOP (group of pictures) known from the prior art.
- the temporal size of a fragment corresponds to the length of one or more video images that is smaller than a GOP.
- The data size corresponds to one or more video images and possibly the temporally corresponding audio data, plus multiplexing overhead.
- The packetizing unit keeps its buffer as small as possible, since filling a buffer is usually associated with latencies, which the invention seeks to minimize.
- The packaging into fragments takes place in the area of the video source.
- the packet fragments are present in the fragmented MP4 format (fMP4).
- fMP4: fragmented MP4 format
- an initialization segment is provided, followed by a repeating group of a fragment header (moof) and a fragment data segment (mdat).
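The per-frame packaging can be sketched as follows (the moov/moof payloads are placeholders and the brand string is illustrative; a real implementation must fill in the complete ISO-BMFF metadata): an initialization segment is written once, then every encoded video frame is wrapped into its own moof/mdat pair.

```python
import struct

def box(box_type: bytes, payload: bytes) -> bytes:
    """Wrap a payload in an ISO-BMFF box: 32-bit size + 4-character type + payload."""
    return struct.pack(">I", 8 + len(payload)) + box_type + payload

def init_segment(moov_payload: bytes) -> bytes:
    """Initialization segment: ftyp followed by moov (moov payload supplied by the caller)."""
    ftyp_payload = b"isom" + b"\x00\x00\x02\x00" + b"isomiso6mp41"  # illustrative brands
    return box(b"ftyp", ftyp_payload) + box(b"moov", moov_payload)

def frame_fragment(moof_payload: bytes, encoded_frame: bytes) -> bytes:
    """One fragment per video frame: fragment header (moof) + fragment data (mdat)."""
    return box(b"moof", moof_payload) + box(b"mdat", encoded_frame)
```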
- Figure 1 is a schematic overview of the various stages of the invention;
- Figure 2 is a schematic representation of a data stream consisting of 5 fragments;
- Figure 3 is a flowchart showing a processing unit for processing the incoming stream;
- Figure 4 shows a flowchart of the processing in the packaging device mentioned in Figure 3;
- Figure 5 is a higher-level overview diagram.
- Figure 1 shows, in a highly simplified schematic form, the structure of an arrangement for carrying out the method.
- the video signal is generated by a video signal source 1, for example a video camera.
- The video signal source 1 is connected via a transmission path 2 to a packaging device 3.
- This packaging device 3 may be, for example, a server.
- The signal of the video source 1 is transmitted to the packaging device.
- In the packaging device 3, the video signal is fragmented into packets, as will be explained in more detail in the following.
- the packetizer 3 is connected via a further transmission path or channel 4 to a display device 5 on which a user can see what the source is transmitting.
- Channel 4 may be a continuous channel with a forward and a return channel.
- Two operations are carried out on the data stream, namely on the one hand a packaging and segmentation of the incoming data stream and on the other hand an adaptation of the data stream to a format suitable for the playback device 5.
- FIG. 2 shows, by way of example, the data stream on the basis of the fMP4 format (state of the art).
- the stream is in a standard-compliant form.
- The stream starts with an ftyp box followed by a moov box. This is followed by a continuous sequence of fragments.
- Each fragment consists of a moof and a mdat box.
- The moof box contains the metadata of the fragment.
- the mdat box contains the actual video and audio data.
- FIG. 3 shows in simplified form the procedure or the sequence of the method within a processing unit in which the input stream is processed.
- The process begins in block 11, where the stream arrives.
- Block 11 is followed by processing block 12, which may also be referred to as a demultiplex block.
- In the demultiplex block, the incoming stream is split into video, audio and metadata packets.
- These media data include the type of packet, namely video, audio or metadata, and time information.
- The packets are then passed to the packaging device according to the invention, which is explained in more detail in Figure 4.
- a query is made as to whether it is an audio packet. If so, the audio packet is stored in block 30.
- If not, a further query checks whether it is a video packet. If so, the next block 33 queries whether it is the start of a video frame. If this is not the case, the video packet is stored in block 34. In query block 35, which follows a positive answer, it is checked whether the required number of video frames for the fragment has been buffered. If not, the video packet is stored in block 36.
- Block 40b stores the current video packet.
- If it is the first fragment to be sent, as determined in query block 41, block 42 outputs the initialization header, the fragment header, and the fragment data.
- If it is not the first fragment to be sent, that is, query block 41 provides a negative response, the fragment header and the fragment data are output in block 43. With the output in block 44, the processing of this fragment is complete.
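Read together, blocks 11 to 44 amount to the following loop; this is a hedged sketch whose names and packet interface are assumptions, with the block numbers mirrored in the comments. Packets are buffered until the configured number of video frames per fragment (here a single frame) is complete, then the fragment is emitted, preceded once by the initialization header.

```python
def packetize(demuxed_packets, frames_per_fragment=1):
    """Yield byte blobs to send: init header once, then fragment header + data each time.

    `demuxed_packets` is an iterable of dicts with keys 'kind' ('audio' or 'video'),
    'is_frame_start' (bool, video only) and 'data' (bytes) -- an assumed interface;
    metadata packets are omitted for brevity.
    """
    audio_buf, video_buf = [], []              # blocks 30 / 34 / 36: store packets
    frames_in_fragment = 0
    first_fragment = True

    for pkt in demuxed_packets:                # blocks 11/12: incoming, demultiplexed stream
        if pkt["kind"] == "audio":             # query: audio packet?
            audio_buf.append(pkt["data"])
            continue
        if pkt["is_frame_start"]:              # block 33: start of a video frame?
            frames_in_fragment += 1
        if frames_in_fragment <= frames_per_fragment:   # block 35: fragment not yet full
            video_buf.append(pkt["data"])
            continue
        header, payload = build_fragment(video_buf, audio_buf)   # hypothetical helper
        if first_fragment:                     # blocks 41/42: first fragment gets init header
            yield build_init_header() + header + payload
            first_fragment = False
        else:                                  # block 43
            yield header + payload
        audio_buf, video_buf = [], [pkt["data"]]   # block 40b: keep the current video packet
        frames_in_fragment = 1

def build_init_header() -> bytes:              # placeholder for ftyp + moov
    return b""

def build_fragment(video_buf, audio_buf):      # placeholder for moof + mdat construction
    return b"", b"".join(video_buf) + b"".join(audio_buf)
```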
- FIG. 5 shows again in a superordinate representation the structure of the method as proposed by the invention.
- The control of the stream is arranged in such a way that at the beginning there is the "source", which supplies the data stream.
- the video, audio and metadata contained in the stream are unpacked and forwarded via connection 52 to the packetizer or multiplexer component 53.
- The multiplexer component 53 does what was explained in detail above. In this multiplexer component 53, a file stream is generated in an HTML5-capable format. For example, it can be fMP4 for Chrome, Firefox or IE 11; for Safari on OS X and iOS, the m3u8/ts format (HLS) is preferred. Subsequently, the forwarding takes place via connection 54 to the output group 55, and from there to the outputs, which are not shown in further detail.
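A hedged sketch of this browser-dependent format selection (the user-agent matching is simplified and the return values are illustrative):

```python
def choose_stream_format(user_agent: str) -> str:
    """Pick an HTML5-capable stream format for the requesting browser."""
    ua = user_agent.lower()
    # Chrome also reports "Safari" in its user agent, hence the extra check.
    if "iphone" in ua or "ipad" in ua or ("safari" in ua and "chrome" not in ua):
        return "hls"    # m3u8/ts preferred for Safari on OS X and iOS
    return "fmp4"       # Chrome, Firefox, IE 11 and others

# choose_stream_format("Mozilla/5.0 ... Version/14.0 Safari/605.1.15")  -> "hls"
```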
- The total end-to-end latency is the sum of the network transport latency, the format-related latencies, and the buffer latency in the playback device.
- The network transport latencies are made up of the transmission from the encoder to the server, the transfer from the server to the packaging unit (player/transmux server, 53 in Figure 5) and the transfer from there to the playback device.
- Grouping-related timing dependencies on the delivery of a stream lead to additional latency.
- the beginning of a segment or fragment can not be delivered until all contained samples have been received.
- the additional latency for fMP4 formatting is the length of a fMP4 fragment.
- a fragment contains one or more complete GOPs (group of pictures).
- the minimum latency in the known methods thus corresponds to the GOP length.
- The method of fragmentation per frame proposed by the invention shortens the format-related latency to the frame length.
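To make the latency saving concrete (the figures are illustrative and fall within the ranges mentioned above): the format-related latency equals the duration of one fragment, so per-frame fragmentation at 25 frames per second replaces a multi-second GOP-length delay with a 40 ms delay.

```python
def format_latency_s(frames_per_fragment: int, frame_rate_hz: float) -> float:
    """Format-related latency: the duration of one fragment."""
    return frames_per_fragment / frame_rate_hz

gop_based = format_latency_s(50, 25.0)   # prior art: fragment = one 2-second GOP -> 2.0 s
per_frame = format_latency_s(1, 25.0)    # invention: fragment = one frame -> 0.04 s (40 ms)

# Total end-to-end latency = network transport + format-related latency + playback buffer.
```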
- the HLS format can also be used instead of fMP4.
- the buffer in the display device corresponds to at least one segment length, which can also lead to a latency.
- It may be provided to set the nominal playing time of the segments to low values. This is controlled by the device through adaptation of the fMP4 header and/or the HLS playlist.
- the device also monitors and controls the buffer of the playback unit.
- The GOP boundary is eliminated; many small frames are transmitted and received as a data stream.
- Time information: time stamp, duration.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE102016116555.7A DE102016116555A1 (de) | 2016-09-05 | 2016-09-05 | Verfahren zur Übertragung von echtzeitbasierten digitalen Videosignalen in Netzwerken |
PCT/EP2017/072115 WO2018042036A1 (de) | 2016-09-05 | 2017-09-04 | Verfahren zur übertragung von echtzeitbasierten digitalen videosignalen in netzwerken |
Publications (1)
Publication Number | Publication Date |
---|---|
EP3507987A1 (de) | 2019-07-10 |
Family
ID=59791066
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP17761866.7A Pending EP3507987A1 (de) | 2016-09-05 | 2017-09-04 | Verfahren zur übertragung von echtzeitbasierten digitalen videosignalen in netzwerken |
Country Status (4)
Country | Link |
---|---|
US (1) | US20190191195A1 (de) |
EP (1) | EP3507987A1 (de) |
DE (1) | DE102016116555A1 (de) |
WO (1) | WO2018042036A1 (de) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN115119009B (zh) * | 2022-06-29 | 2023-09-01 | 北京奇艺世纪科技有限公司 | 视频对齐方法、视频编码方法、装置及存储介质 |
Family Cites Families (27)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6938268B1 (en) * | 1998-01-08 | 2005-08-30 | Winston W. Hodge | Video stream sharing |
US20030058707A1 (en) * | 2001-09-12 | 2003-03-27 | Dilger Bruce C. | System and process for implementing commercial breaks in programming |
US8893207B2 (en) * | 2002-12-10 | 2014-11-18 | Ol2, Inc. | System and method for compressing streaming interactive video |
US8832772B2 (en) * | 2002-12-10 | 2014-09-09 | Ol2, Inc. | System for combining recorded application state with application streaming interactive video output |
US9003461B2 (en) * | 2002-12-10 | 2015-04-07 | Ol2, Inc. | Streaming interactive video integrated with recorded video segments |
WO2007130695A2 (en) * | 2006-05-05 | 2007-11-15 | Globstream, Inc. | Method and apparatus for streaming media to a plurality of adaptive client devices |
US9380096B2 (en) * | 2006-06-09 | 2016-06-28 | Qualcomm Incorporated | Enhanced block-request streaming system for handling low-latency streaming |
JP5542913B2 (ja) * | 2009-04-09 | 2014-07-09 | テレフオンアクチーボラゲット エル エム エリクソン(パブル) | メディアファイルを生成し処理するための方法および構成 |
US9237387B2 (en) * | 2009-10-06 | 2016-01-12 | Microsoft Technology Licensing, Llc | Low latency cacheable media streaming |
US9032466B2 (en) * | 2010-01-13 | 2015-05-12 | Qualcomm Incorporated | Optimized delivery of interactivity event assets in a mobile broadcast communication system |
US20110276662A1 (en) | 2010-05-07 | 2011-11-10 | Samsung Electronics Co., Ltd. | Method of constructing multimedia streaming file format, and method and apparatus for servicing multimedia streaming using the multimedia streaming file format |
US20110282965A1 (en) * | 2010-05-17 | 2011-11-17 | Ifan Media Corporation | Systems and methods for providing interactivity between a host and a user |
EP2656618A4 (de) * | 2010-12-22 | 2015-08-26 | Ando Media Llc | Echtzeitmedienstream-einfügungsverfahren und -vorrichtung |
US8464304B2 (en) * | 2011-01-25 | 2013-06-11 | Youtoo Technologies, LLC | Content creation and distribution system |
US11025962B2 (en) * | 2011-02-28 | 2021-06-01 | Adobe Inc. | System and method for low-latency content streaming |
US8510555B2 (en) * | 2011-04-27 | 2013-08-13 | Morega Systems Inc | Streaming video server with virtual file system and methods for use therewith |
US20130080579A1 (en) * | 2011-09-26 | 2013-03-28 | Unicorn Media, Inc. | Dynamically-executed syndication services |
US8893167B2 (en) * | 2012-02-07 | 2014-11-18 | Turner Broadcasting System, Inc. | Method and system for automatic content recognition based on customized user preferences |
US20140140417A1 (en) | 2012-11-16 | 2014-05-22 | Gary K. Shaffer | System and method for providing alignment of multiple transcoders for adaptive bitrate streaming in a network environment |
US20140198839A1 (en) * | 2013-01-17 | 2014-07-17 | Nvidia Corporation | Low latency sub-frame level video decoding |
US9491521B2 (en) * | 2013-06-21 | 2016-11-08 | Arris Enterprises, Inc. | Trick play seek operation for HLS converted from DTCP |
JP2015023575A (ja) * | 2013-07-19 | 2015-02-02 | パナソニック インテレクチュアル プロパティ コーポレーション オブアメリカPanasonic Intellectual Property Corporation of America | 送信方法、受信方法、送信装置及び受信装置 |
US8955027B1 (en) * | 2013-11-21 | 2015-02-10 | Google Inc. | Transcoding media streams using subchunking |
US9635077B2 (en) * | 2014-03-14 | 2017-04-25 | Adobe Systems Incorporated | Low latency live video streaming |
CN105100954B (zh) * | 2014-05-07 | 2018-05-29 | 朱达欣 | 一种基于互联网通信及流媒体直播的交互应答系统及方法 |
WO2016109770A1 (en) * | 2014-12-31 | 2016-07-07 | Imagine Communications Corp. | Fragmented video transcoding systems and methods |
US10264044B2 (en) * | 2016-08-29 | 2019-04-16 | Comcast Cable Communications, Llc | Apparatus and method for sending content as chunks of data to a user device via a network |
2016
- 2016-09-05 DE DE102016116555.7A patent/DE102016116555A1/de active Pending
2017
- 2017-09-04 WO PCT/EP2017/072115 patent/WO2018042036A1/de active Application Filing
- 2017-09-04 US US16/330,156 patent/US20190191195A1/en active Pending
- 2017-09-04 EP EP17761866.7A patent/EP3507987A1/de active Pending
Also Published As
Publication number | Publication date |
---|---|
US20190191195A1 (en) | 2019-06-20 |
DE102016116555A1 (de) | 2018-03-08 |
WO2018042036A1 (de) | 2018-03-08 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
 | STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: UNKNOWN |
 | STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE |
 | PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase | Free format text: ORIGINAL CODE: 0009012 |
 | STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE |
 | 17P | Request for examination filed | Effective date: 20190401 |
 | AK | Designated contracting states | Kind code of ref document: A1; Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
 | AX | Request for extension of the european patent | Extension state: BA ME |
 | DAV | Request for validation of the european patent (deleted) | |
 | DAX | Request for extension of the european patent (deleted) | |
 | STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: EXAMINATION IS IN PROGRESS |
 | 17Q | First examination report despatched | Effective date: 20200806 |
 | STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: EXAMINATION IS IN PROGRESS |