WO2006012911A1 - Dynamic optimization of wireless real-time video data flow - Google Patents

Dynamic optimization of wireless real-time video data flow

Info

Publication number
WO2006012911A1
Authority
WO
WIPO (PCT)
Prior art keywords
interval
time
frame
data flow
frames
Prior art date
Application number
PCT/EP2004/008688
Other languages
French (fr)
Inventor
Robert Mirbaha Vahid
Thomas Kelz
Original Assignee
Fg Microtec Gmbh
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Fg Microtec Gmbh filed Critical Fg Microtec Gmbh
Priority to PCT/EP2004/008688 priority Critical patent/WO2006012911A1/en
Publication of WO2006012911A1 publication Critical patent/WO2006012911A1/en

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41 Structure of client; Structure of client peripherals
    • H04N21/414 Specialised client platforms, e.g. receiver in car or embedded in a mobile appliance
    • H04N21/41407 Specialised client platforms, e.g. receiver in car or embedded in a mobile appliance embedded in a portable device, e.g. video client on a mobile phone, PDA, laptop
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/438 Interfacing the downstream path of the transmission network originating from a server, e.g. retrieving encoded video stream packets from an IP network
    • H04N21/4382 Demodulation or channel decoding, e.g. QPSK demodulation
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/44 Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
    • H04N21/4402 Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs involving reformatting operations of video signals for household redistribution, storage or real-time display
    • H04N21/440281 Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs involving reformatting operations of video signals for household redistribution, storage or real-time display by altering the temporal resolution, e.g. by frame skipping
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/442 Monitoring of processes or resources, e.g. detecting the failure of a recording device, monitoring the downstream bandwidth, the number of times a movie has been viewed, the storage space available from the internal hard disk
    • H04N21/44209 Monitoring of downstream path of the transmission network originating from a server, e.g. bandwidth variations of a wireless network
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/45 Management operations performed by the client for facilitating the reception of or the interaction with the content or administrating data related to the end-user or to the client device itself, e.g. learning user preferences for recommending movies, resolving scheduling conflicts
    • H04N21/462 Content or additional data management, e.g. creating a master electronic program guide from data received from the Internet and a Head-end, controlling the complexity of a video stream by scaling the resolution or bit-rate based on the client capabilities
    • H04N21/4621 Controlling the complexity of the content stream or additional data, e.g. lowering the resolution or bit-rate of the video stream for a mobile client with a small screen
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/60 Network structure or processes for video distribution between server and client or between remote clients; Control signalling between clients, server and network components; Transmission of management data between server and client, e.g. sending from server to client commands for recording incoming content stream; Communication details between server and client
    • H04N21/63 Control signaling related to video distribution between client, server and network components; Network processes for video distribution between server and clients or between remote clients, e.g. transmitting basic layer and enhancement layers over different transmission paths, setting up a peer-to-peer communication via Internet between remote STB's; Communication protocols; Addressing
    • H04N21/632 Control signaling related to video distribution between client, server and network components; Network processes for video distribution between server and clients or between remote clients, e.g. transmitting basic layer and enhancement layers over different transmission paths, setting up a peer-to-peer communication via Internet between remote STB's; Communication protocols; Addressing using a connection between clients on a wide area network, e.g. setting up a peer-to-peer communication via Internet for retrieving video segments from the hard-disk of other client devices
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/60 Network structure or processes for video distribution between server and client or between remote clients; Control signalling between clients, server and network components; Transmission of management data between server and client, e.g. sending from server to client commands for recording incoming content stream; Communication details between server and client
    • H04N21/63 Control signaling related to video distribution between client, server and network components; Network processes for video distribution between server and clients or between remote clients, e.g. transmitting basic layer and enhancement layers over different transmission paths, setting up a peer-to-peer communication via Internet between remote STB's; Communication protocols; Addressing
    • H04N21/643 Communication protocols
    • H04N21/64322 IP

Definitions

  • the frames 224 may be modified (symbolically depicted by the frames 226 in Fig. 2), especially equipped with additional information (e.g., additional headers), according to the respective protocols used for the type of information to be transmitted.
  • in the Transport Layer 212, the Real-Time Transport Protocol (RTP) 228, the User Datagram Protocol (UDP) 230, and the Internet Protocol (IP) 232 are employed.
  • in the Data Link Layer 214, the Logical Link Control Layer (LLC) 248, the Radio Link Control Layer (RLC) 250, and the Medium Access Control Layer (MAC) 251 are employed in this embodiment.
  • Each layer is equipped with one or more control modules 234 - 246.
  • These control modules control the functionality of the layers in various ways, depending on the layer itself.
  • the Media Control Module 234 controls the settings of the audio and/or video codec(s) (e.g., the quantization, see above) of the application layer.
  • the RTP control module 236 controls the FEC (forward error correction) packet generation, the I-frame buffering and retransmission, as well as the reading out of RTCP (RTP Control Protocol or Real Time Control Protocol) quality feedback information.
  • the Logical Link Control Layer (LLC) 248 and the Radio Link Control Layer (RLC) 250 are controlled by a common control module 242 (RRC, Radio Resource Control).
  • most or all of the layers have one or more buffers 252, 254 (symbolically depicted by the hatched boxes in Fig. 2) at their disposal. These buffers may be used for different purposes, such as for storing I-frames for delayed transmission when the prediction of the future state of the data flow indicates a high risk of loss (see above).
  • the control modules 234 - 246 allow for easy access to actual parameters of the data flow. Information on the quality of the transmission (e.g., the Bit Error Rate, BER), as well as information on the available resources in each layer (e.g., the fill levels of the various buffers 252, 254), can be obtained.
  • Acquiring these Actual Parameters 256 is the first step 110 of the method depicted in Fig. 1.
  • the State Predictor Module 258, based on the information 256, estimates the development of one or more relevant variables indicating the state of the data transfer for the near future (step 112 in Fig. 1). In the preferred embodiment, the algorithm disclosed in DE 102 47 581 is used for this purpose. Thus, the State Predictor Module 258 may predict that the Bit Error Rate (BER) will be below a level of 10⁻⁹ for the upcoming 10 TTIs (transmission time intervals). These predictions 260 are passed on to a Decider Module 262.
  • the Decider Module 262 compares the predicted parameters 260 with a set of parameters stored in a Lookup-Table 264.
  • in this Lookup-Table, which preferably consists of a multidimensional matrix, the possible states of the future flow predicted by the State Predictor 258 are divided into a number of "cases", i.e. into a number of intervals for each relevant predicted parameter. Thus, for each case, a set of control parameters is referenced in this Lookup-Table.
  • the "decision" on flow optimization that the Decider Module 262 takes in step 114 basically has the form of a certain set of control parameters 266, which are picked from the Lookup-Table 264 according to the Predictions 260 of the State Predictor 258. These Control Parameters 266 are passed on to the respective control modules 234 - 246, in order to adjust the data flow (a sketch of this decision step is given at the end of this section).
  • depending on this decision, one or more of the steps 116 - 124 in Fig. 1 are possible.
  • the steps disclosed in this invention all refer to optimization of video streaming of encoded video data, but means for optimizing the data flow of encoded audio data may be taken in parallel.
  • the decider may decide that, according to the predictions of the future data flow, the method of dynamic forward error correction (FEC) 116 may be employed. As explained above, this method will preferably be chosen when high BERs are predicted for the near future.
  • control parameters for several control modules may be generated.
  • control parameters controlling the Transport Layer 212 or the Data Link Layer 214 to store the most recently transmitted I-frame in one of the buffers 252, 254 may be generated and passed on to one of the control modules 236 - 244.
  • the I-frame is buffered in the buffer 252 of the RTP 228.
  • the I-frame can be retransmitted in case a loss during transmission occurs.
  • an RTCP (RTP Control Protocol or Real Time Control Protocol) module and an FEC for RTP module (both not shown) are involved: the RTCP module will get the quality feedback packet from the receiving side. The decider 262 uses these pieces of information to decide that the I-frame should be retransmitted with additional FEC, and signals to the RTP and FEC for RTP modules to retransmit the packet and to create an FEC packet.
  • when using the term "I-frames", it is obvious that not only the actual I-frames (224 in Fig. 2) are meant, but that the term also may include "modified" I-frames 226, i.e. I-frames after having passed layers below the Application Layer 210. These frames, as described above, also include additional information, such as additional headers.
  • control parameters controlling the Physical Layer 216 or the Data Link Layer 214 to apply a certain schedule of error correction, especially forward error correction (FEC), may be passed on to the control modules 236 - 246.
  • the Decider Module 262 may control the Data Link Layer Control Module RRC 242 to increase a redundancy factor (i.e. the factor controlling the error correction information) from 1.15 to 1.30 or change the coding scheme.
  • control parameters may first control the layers to buffer an I-frame and then, in case a loss of this I-frame is detected, to re-transmit it with an increased redundancy factor.
  • the Decider Module 262 may decide that the method of dynamic I-frame generation (118 in Fig. 1) is to be applied. As explained above, this method is especially useful in case the Actual Parameters 256 indicate that the loss of an I-frame could not be prevented and there was no possibility to re-transmit a copy of the lost I-frame, or if since the last I-frame generation more than a pre-defined time-interval has passed.
  • control parameters 266 for the Media Control Module 234 are generated, controlling the codec 220 to create an I-frame at the nearest possible point in time.
  • the Decider Module 262 may decide that the method of dynamic I-frame generation delaying 120 is to be employed, which is especially useful if the Actual Parameters 256 indicate that the bandwidth available for data transmission is below a given level. This information can be gained from a readout of the parameters of the Control Modules 242 - 246, by determining the used coding scheme in layer 1, the physical layer, the allocated timeslots in the data link layer, and the allocated transmission blocks in these timeslots, which can be collected from the RLC 250.
  • the decider may provide control parameters 266 to the Media Control Module 234 preventing the codec 220 from creating an I-frame within a pre-defined time-interval.
  • the decider may provide control parameters 266 to the control modules 234 - 246 of one of the layers, preferably of the Transport Layer 212 or of the Data Link Layer 214, to store the I-frames to be sent within this temporal interval in one of their buffers 252, 254 rather than to transmit them.
  • the fill-level of the buffers may be checked as part of the Actual Parameters 256, in order for the Decider 262 to make a new decision about transmission of the buffered data.
  • the Decider Module 262 may decide that the method of dynamic P-frame rate adjustment 122 is to be applied. As mentioned above, this might be the case especially when the Actual Parameters 256, in particular control parameters of the Data Link Layer 214 or the Physical Layer 216, indicate that the bandwidth available for data transfer is too low for a transfer of all P-frames.
  • the Decider Module 262 may generate two types of control parameters 266: First, control parameters controlling the Application Layer 210, the Transport Layer 212, the Data Link Layer 214, the Physical Layer 216, or another layer not depicted in Fig. 2, to erase a certain number of P-frames upcoming for transmission from one or more of the buffers, e.g., from the buffers 252, 254. As indicated above, these P-frames to be erased are preferably selected stochastically rather than in sequence.
  • second, the Decider Module 262 may generate control parameters for the Media Control Module 234 of the Application Layer 210, controlling the codec 220 not to create a P-frame within a given time-interval in the future (e.g., the upcoming 10 TTIs) or to reduce the number of P-frames to be generated within this time-interval below a certain number or rate.
  • the Decider Module 262 may decide that the method of dynamic quantization adaptation 124 is to be applied. As mentioned above, this method may be chosen in cases where the State Predictor 258 indicates that the state of the data flow for a given time-interval in the future, e.g., the upcoming 10 TTIs, will be such that the amount of data that can be transmitted will be above a given level. In this case, the Decider Module 262 may generate one or more control parameters 266 controlling the Media Control Module 234 to operate the codec 220 in a way that the quantization of the data for this time-interval is reduced, thus increasing the quality of the video data.
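To make the decision step of the Decider Module 262 and the Lookup-Table 264 concrete, the following Python sketch models the table as a list of cases, each defined by one interval per predicted parameter and an associated set of control parameters. All names, intervals and parameter values (including the redundancy factors) are illustrative assumptions, not values taken from the patent.

```python
from dataclasses import dataclass

@dataclass
class PredictedState:
    """Predictions 260 from the State Predictor 258 (illustrative fields)."""
    bit_error_rate: float   # predicted BER for the upcoming time-interval
    bandwidth_kbps: float   # predicted available bandwidth for video

# Lookup-Table 264: each case maps intervals of the predicted parameters to a
# set of control parameters 266. Intervals and values are invented examples.
LOOKUP_TABLE = [
    # ((BER low, BER high), (bw low, bw high), control parameters)
    ((1e-3, 1.0), (0.0, float("inf")),   {"method": "dynamic_iframe_fec", "redundancy": 1.30}),
    ((0.0, 1e-3), (0.0, 64.0),           {"method": "dynamic_iframe_generation_delaying"}),
    ((0.0, 1e-3), (64.0, 256.0),         {"method": "dynamic_pframe_rate_adjustment"}),
    ((0.0, 1e-3), (256.0, float("inf")), {"method": "dynamic_quantization_adaptation",
                                          "quantization": "decrease"}),
]

def decide(state: PredictedState) -> dict:
    """Decider Module 262: return the control parameters 266 of the first
    case whose intervals contain the predicted state (step 114)."""
    for (ber_lo, ber_hi), (bw_lo, bw_hi), control in LOOKUP_TABLE:
        if ber_lo <= state.bit_error_rate < ber_hi and bw_lo <= state.bandwidth_kbps < bw_hi:
            return control
    return {"method": "no_action"}

# A high predicted BER selects dynamic I-frame FEC with a raised redundancy factor.
print(decide(PredictedState(bit_error_rate=5e-3, bandwidth_kbps=128.0)))
```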

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Databases & Information Systems (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Engineering & Computer Science (AREA)
  • Compression Or Coding Systems Of Tv Signals (AREA)

Abstract

Quality of service management for video streaming applications is an important issue for wireless mobile devices, such as for upload or download of video data between personal digital assistants (PDAs) or 3G cellular phones, and a wireless communication network. The invention discloses a method for dynamically adapting a video stream during the transmission. The adaptation is mainly based on a prediction of the future state of the data flow. According to the predictions, a decider module decides on a method (116, 118, 120, 122, 124) for either adapting the encoding of the video data or for adapting the transmission of these data, depending on the kind of frame to be transmitted. By using this quality of service management, the mobile system can react to changes in the quality of the wireless link at an early stage and, thus, greatly improve the quality of video data transmission.

Description

Dynamic optimization of wireless real-time video data flow
Field of the invention
The present invention relates to the quality of service (QoS) management for mobile video applications. More specifically, the invention relates to dynamic optimization of wireless real-time video data flow, based on general predictions of mobile link characteristics.
Background of the invention
By now, the standardization of 3rd-generation wireless systems is almost completed in most major economic regions of the world. These systems are known under names such as IMT-2000 (ITU), UMTS (ETSI 3GPP), EDGE and ANSI 3GPP2. A large number of suitable 3G wireless devices, such as cell phones or personal digital assistants (PDAs), are commercially available. The transfer of data using the above-mentioned systems is predominantly packet-switched, mostly using the well-known internet protocol (IP). The transport of IP packets over the air interface not only extends the reach of the internet, it also opens the opportunity to migrate all of the communication to a packet-switched environment.
Using the IP protocol for mobile radio networks has its challenges, which are mainly due to frequent changes in the quality of the connection between the mobile user and the corresponding base stations. These changes are a result of a number of complex factors, such as geographical factors, meteorological factors or the movement of the mobile user, resulting in frequent network cell changes.
The impact of these frequent changes in the quality of connection, resulting, e.g., in a frequent change of the bit error rate (BER) or the frame loss rate (FLR), on typical real-time applications (mostly including voice and/or video packet transfer) strongly depends on the application itself: In e-mail data transfer, e.g., the reliability of the packet transfer is an essential factor, whereas, e.g., the speed of the transfer is of minor importance. For real-time audio and video applications, on the other hand, the delay of the data packages has to be minimized, since the mobile user regards delayed sequences as highly disturbing, whereas missing packages (resulting, e.g., in "crackling" voice transfer) are less noticeable.
In PCT/EP 02/03018 and DE 102 47 581, quality of service state predictors are described, which use a method for predicting link quality parameters in 2.5G and 3G mobile access networks. The predicted link quality parameters are used for controlling lower layer corrective mechanisms, such as transmission power control, to aid QoS systems and applications in their quality management process. E.g., the codec mode and the bit rate can be adapted according to the current and predicted link quality. Nevertheless, since video encoding is largely different from the encoding schemes of voice packages, the method for adapting the data transfer described in PCT/EP 02/03018 or DE 102 47 581 is rather limited with regard to video streaming.
Modern video encoders, called codecs, such as the H.264 video codec, include features for adapting a video stream for a wide range of different links. Note that the terms "video" as well as "video streaming" are used here in a technical meaning: they do not include audio and audio transmission, but only the (moving) pictures.
The features for adapting a video stream also include features for adapting to links with low bandwidth and/or high bit error rate. In prior art solutions, this adaptation is static during the video session, i.e., once the parameters are fixed for a transmission session, they are not changed again until the transmission is over. This means that the video stream will be adapted to the underlying link only at the beginning of the video transmission.
Summary of the invention
It is therefore an object of the present invention to provide a method and a system for dynamic optimization of wireless real-time video data flow.
The invention is based on the finding that the quality of the video transmission can be improved dramatically if the adaptation is done dynamically during the transmission, knowing the actual status of the underlying link (e.g., bandwidth, bandwidth variations, delay, jitter, bit error rate) or even predicting the (near) future development of these parameters. Other objects and advantages of the present invention may be ascertained from a reading of the specification and appended claims in conjunction with the drawings.
Part of the invention is a method for dynamically adapting a video stream during the transmission. The preferred embodiments of the invention are set forth in the dependent claims.
A method for dynamic optimization of real-time video data flow between a mobile device, such as a personal digital assistant (PDA) or a 3G cellular phone, and a wireless communication network is disclosed. The mobile device is assumed to comprise at least one application generating and encoding video data using at least one codec. Further, a hardware system is disclosed, enabling the realization of the method in one of the listed variations.
As it is a standard for video data encoding, the encoded video data is assumed to comprise P-frames and I-frames. I-frames include the full (compressed) video picture. For a device receiving and decoding video data, in order to decode an I-frame, the decoder does not need information on previously sent frames. The picture used for encoding this I-frame is the starting point for P-frame generation. P-frames include information on how parts of previous I-frames are changing (e.g., shifts, size variations, etc.). This information is mainly vector based. Without the previously generated I-frame it is not possible to decode a video picture only by using a P-frame.
Generally, the amount of data of a P-frame is considerably smaller than the amount of data of an I-frame. Consequently, the transmission of P-frames requires a lower bandwidth than the transmission of I-frames. In more or less regular temporal intervals, I-frames are sent, whereas in between, only P-frames are transmitted, in order to reduce the overall amount of transmitted data. Thus, e.g., every 200 P-frames, one I-frame is generated and transmitted.
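As a small worked illustration of this I/P pattern (the group size of 200 comes from the example above; the function name is ours):

```python
def frame_type(index: int, gop_size: int = 200) -> str:
    """One I-frame every gop_size frames; all frames in between are P-frames
    that only encode changes relative to the preceding I-frame."""
    return "I" if index % gop_size == 0 else "P"

print("".join(frame_type(i) for i in range(5)))  # IPPPP
print(frame_type(200))                           # I -> next synchronization point
```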
The loss of a P-frame results in the loss of one decoded video picture, thus reducing the decoded video frame rate and deteriorating the quality of the video. The loss of an I-frame results in the loss of a whole sequence of video pictures.
The method described in the following comprises several steps. These steps do not necessarily have to be taken in the given order. One or more steps can be performed in parallel. Additional steps not listed can be performed.
First, a set of actual parameters indicating the current state of the data flow is acquired (a sketch of such a parameter record is given after the list). These parameters preferably comprise one or more of the following parameters:
- parameters of the air link, e.g. the coding scheme, and/or
- parameters of a transmission protocol stack, and/or
- available bandwidth, and/or
- maximum buffer sizes, and/or
- buffer fill levels, and/or
- information about PDP (Packet Data Protocol) contexts, e.g. quality of service (QoS) settings for the PDP context, and/or
- radio resource management information, and/or
- received signal code power (RSCP), and/or
- signal to interference ratio (SIR), and/or
- received signal strength indicator (RSSI), and/or
- signal strength of the wireless connection, and/or
- traffic volume measurement, and/or
- position of the mobile device, and/or
- altitude of the mobile device, and/or
- direction of the mobile device, and/or
- velocity of the mobile device, and/or
- block size (i.e. the size of the Data Link Layer transmission blocks; IP packets are cut into blocks and the blocks are transferred to the receiving side), and/or
- block error rate (which is similar to the bit error rate but seen for the whole Data Link Layer transmission block), and/or
- the codec employed, and/or
- the compression of header data, and/or
- bit error rate, and/or
- frame loss rate, and/or
- transmission delay.
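A minimal sketch of such a record for this first step, assuming a small subset of the parameters above; field names and units are our choice:

```python
from dataclasses import dataclass, field

@dataclass
class ActualParameters:
    """Actual parameters describing the current state of the data flow."""
    coding_scheme: int                     # e.g. 1..4 in GPRS, 1..9 in EDGE
    available_bandwidth_kbps: float
    buffer_fill_levels: dict = field(default_factory=dict)  # layer -> fill ratio
    rscp_dbm: float = 0.0                  # received signal code power
    sir_db: float = 0.0                    # signal to interference ratio
    rssi_dbm: float = 0.0                  # received signal strength indicator
    block_error_rate: float = 0.0
    bit_error_rate: float = 0.0
    frame_loss_rate: float = 0.0
    transmission_delay_ms: float = 0.0

snapshot = ActualParameters(coding_scheme=2, available_bandwidth_kbps=144.0,
                            buffer_fill_levels={"rtp": 0.35, "rlc": 0.60},
                            bit_error_rate=1e-4)
```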
In a next step, on the basis of these actual parameters and other necessary data, a prediction of a future state of the data flow is made for a given time-interval. As an example, this prediction may refer to one or more of the following types of information on the data flow:
- predictions related to cell reselections, and/or
- predictions related to throughput, and/or
- predictions related to signal to interference ratio (SIR), and/or
- predictions related to bit error rate, and/or
- predictions related to the suitable coding scheme (in GPRS and EDGE (at least), a certain number of bits, i.e. a transmission block, can be transferred in one time-slot. Some of the bits can be used for the data transfer, others are used to secure these data bits. There exist different coding schemes (4 in GPRS, 9 in EDGE), which stand for different ratios of data bits to protection bits. This means: using coding scheme 1, you have the highest protection for the data bits but the lowest number of usable bits for data transfer; coding scheme 4 provides less protection but more bits for data transfer, resulting in a higher bandwidth), and/or
- predictions related to transmission delay, and/or
- predictions related to block error rate (an error in a transmission block (if the above-mentioned protection failed) leads to an error in the IP packet, which may lead to the loss of an IP packet; predicting this rate means predicting packet losses or damaged packets), and/or
- predictions related to round trip time, and/or
- predictions related to the increased and decreased bandwidth available for transmission of the video data.
Algorithms for calculating predictions of this type are state of the art and are disclosed, e.g., in PCT/EP 02/03018 or DE 102 47 581. The meaning of the expression "time-interval" is not necessarily restricted to an actual time; it can, e.g., equally well designate an internal clock of a computer. Other time scales, not necessarily having a continuous and steady succession in time, but indicating, e.g., the progress of a transmission, might be used. A widely used time scale is the TTI time scale (transmission time interval, i.e. 10 or 20 ms per interval).
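The actual prediction algorithm is the one disclosed in PCT/EP 02/03018 and DE 102 47 581 and is not reproduced here. Purely to illustrate the interface of this step, the sketch below extrapolates a bit error rate trend over the next 10 TTIs using an exponentially weighted moving average; it is a stand-in, not the patented predictor.

```python
def predict_ber(history: list[float], horizon_ttis: int = 10, alpha: float = 0.3) -> float:
    """Stand-in predictor: smooth the BER history with an exponentially
    weighted moving average and extrapolate the latest trend linearly over
    the prediction horizon, measured in TTIs (10 or 20 ms each)."""
    ewma = prev = history[0]
    for sample in history[1:]:
        prev, ewma = ewma, alpha * sample + (1 - alpha) * ewma
    trend_per_tti = ewma - prev
    return max(ewma + horizon_ttis * trend_per_tti, 0.0)

# A rising BER history predicts a degraded link for the upcoming time-interval.
print(predict_ber([1e-5, 2e-5, 5e-5, 1e-4]))
```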
In a next step, one or more measures are taken in order to dynamically adapt the video data flow to the predicted state of the data flow during the given time-interval in the near future, especially in order to reduce the risk of loss of important video data. The goal of this adaptation is to provide the best video quality on the receiving side for each situation, i.e. for each possible state of the link or data flow quality. The measures may comprise one or more of the following steps:
- dynamic I-frame forward error correction (FEC)
- dynamic I-frame generation
- dynamic I-frame generation delaying
- dynamic P-frame rate adjusting
- dynamic quantization adaptation
These steps are described in detail in the following. Some of the steps may be performed in one or more of the described variations.
Dynamic I-frame forward error correction (FEC)
In a first preferred embodiment, the method of dynamic I-frame forward error correction (FEC) is employed. Dynamic I-frame FEC is typically used in cases of high bit error rates, with the goal to prevent the loss of an I-frame, which would result in the loss of a whole sequence of frames in a row on the decoded side. Note that P-frames based on a lost I-frame cannot be decoded.
Two possible cases may be considered for this method:
1. The loss of an I-frame during transmission is detected.
2. It can be predicted (or estimated) that the probability to lose a certain I-frame during transmission is high.
Especially (but not solely) in the first case, a copy of each recently sent I-frame is buffered, and if a loss of this I-frame during transmission is detected, the I-frame is retransmitted. There is no direct acknowledgment for receiving an I-frame, but the RTCP packets (RTP Control Protocol or Real Time Control Protocol), used for quality feedback information for the RTP streams, can be used to give feedback from the receiver to the sender. In a preferred embodiment, the I-frame is re-transmitted with additional forward error correction (FEC) information. This step may be combined with the additional condition that the time that has passed between the generation of the respective I-frame and the detection of the loss of this I-frame is not too high, i.e. remains below a given threshold.
Several forward error correction means are known to the person skilled in the art and may be used. These forward error correction algorithms include adding additional data allowing for the detection of transmission failures and the reconstruction of certain data packages by the receiving device, even if part of the transmitted data are lost during transmission. Besides forward error correction, other means of error correction usually employed in wireless data transfer can be employed accordingly. These are, e.g., convolutional coding or bit coding according to the used coding scheme (see above).
Especially (but not solely) in the second case listed above, the following method is proposed: If the prediction of the state of the data flow for a given time-interval in the future indicates a risk of loss of the frames to be transmitted within this time-interval being above a pre-defined risk level, the I-frames to be transmitted within this time-interval are sent with additional FEC information. This additional FEC information increases the chance that (even if part of the transmitted data are lost) the I-frame may be reconstructed by the addressee.
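Both cases can be combined into one control routine. The following Python sketch is illustrative only: the helper names (send, xor_parity), the toy XOR-parity FEC, and all threshold values are assumptions, not part of the patent.

```python
import time

def send(seq: int, payload: bytes, fec: bytes | None = None) -> None:
    """Hypothetical transmission hook standing in for the RTP/FEC modules."""
    print(f"frame {seq}: {len(payload)} bytes, FEC={'yes' if fec else 'no'}")

def xor_parity(payload: bytes, chunk: int = 4) -> bytes:
    """Toy FEC: XOR parity over fixed-size chunks of the payload."""
    parity = bytearray(chunk)
    for i, byte in enumerate(payload):
        parity[i % chunk] ^= byte
    return bytes(parity)

iframe_buffer: dict[int, tuple[float, bytes]] = {}

def send_iframe(seq: int, payload: bytes, predicted_loss_risk: float,
                risk_threshold: float = 0.05) -> None:
    """Case 2 (predictive): buffer a copy of each sent I-frame and add FEC
    information when the predicted loss risk exceeds the pre-defined level."""
    iframe_buffer[seq] = (time.monotonic(), payload)
    fec = xor_parity(payload) if predicted_loss_risk > risk_threshold else None
    send(seq, payload, fec)

def on_rtcp_loss_report(seq: int, max_age_s: float = 0.5) -> None:
    """Case 1 (reactive): RTCP feedback reports a lost I-frame; retransmit the
    buffered copy with FEC, provided the frame is not older than the threshold."""
    entry = iframe_buffer.get(seq)
    if entry and time.monotonic() - entry[0] <= max_age_s:
        send(seq, entry[1], xor_parity(entry[1]))

send_iframe(1, bytes(range(16)) * 10, predicted_loss_risk=0.2)
on_rtcp_loss_report(1)
```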
In a preferred embodiment, when using the method of dynamic I-frame forward error correction, the generation of P-frames as well as the transmission of these P-frames remains independent of the predicted future state of the data flow.
Dynamic I-frame generation
In a second preferred embodiment, the method of dynamic I-frame generation is used, with the goal to send a new I-frame, i.e. a new synchronization point, to the decoder of the frames in the receiving device.
Dynamic I-frame generation might especially be useful or even necessary in one of the following cases:
1. The loss of an I-frame could not be prevented and there was no possibility to re-transmit a copy of the lost I-frame.
2. The last I-frame was generated more than a given time-interval ago. In this case the video pictures are decoded on the basis of the P-frames combined with a rather "old" I-frame. This will reduce the accuracy of the P-frames, so that the difference between encoded and decoded picture increases. One reason for this situation may be a delayed I-frame generation as a consequence of a reduced available bandwidth (dynamic I-frame generation delaying, see below). If more bandwidth becomes available, the generation of an I-frame may be enforced.
Thus, the following step is proposed: If since the last successful transmission of an I-frame a time-period longer than a pre-defined time-period has passed, one or more control signals are generated. These control signals trigger a codec to create an I-frame at the nearest possible point in time.
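Reduced to code, this control-signal generation is little more than a staleness check; the threshold of 2 seconds and the function names are assumptions:

```python
import time

IFRAME_MAX_AGE_S = 2.0          # pre-defined time-period (illustrative value)
last_iframe_ts = time.monotonic()

def on_iframe_transmitted() -> None:
    """Record the time of the last successful I-frame transmission."""
    global last_iframe_ts
    last_iframe_ts = time.monotonic()

def iframe_needed(iframe_lost_without_copy: bool) -> bool:
    """Return True if a control signal should trigger the codec to create an
    I-frame at the nearest possible point in time (either case above)."""
    too_old = time.monotonic() - last_iframe_ts > IFRAME_MAX_AGE_S
    return iframe_lost_without_copy or too_old
```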
In a preferred embodiment, when using the method of dynamic I-frame generation, the generation of P-frames as well as the transmission of P-frames remains independent of the predicted future state of the data flow.
Dynamic I-frame generation delaying
In a third preferred embodiment, dynamic I-frame generation delaying is used. This method is especially useful in cases where the bandwidth available for data transmission is reduced.
Thus, if the prediction of the state of the data flow for a given time-interval in the future indicates a risk of loss of the frames to be transmitted within this time-interval being above a pre-defined risk level, the following means may be taken: First, one or more control signals may be generated controlling the at least one codec not to create an I-frame within this time-interval. Alternatively or additionally, I-frames to be sent within this time-interval are buffered, and the transmission of these I-frames is delayed until the time-interval has passed.
During this period of delay, only P-frames are transmitted. As described above, the transmission of P-frames typically requires a lower bandwidth. Nevertheless, the vector-based information encoded in the P-frames becomes more inaccurate with the rising "age" of the corresponding I-frame, resulting in an inaccuracy of the decoded video picture.
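A sketch of the buffering variant, assuming a helper that is called once per outgoing frame and a flag derived from the prediction; both names are ours:

```python
from collections import deque

delayed_iframes: deque[bytes] = deque()

def schedule_frame(frame: bytes, is_iframe: bool, high_risk_interval: bool) -> list[bytes]:
    """Return the frames to transmit now. During a predicted high-risk
    time-interval, I-frames are buffered instead of transmitted; once the
    interval has passed, the delayed I-frames are flushed first. P-frames
    keep flowing unchanged in either case."""
    if high_risk_interval:
        if is_iframe:
            delayed_iframes.append(frame)
            return []
        return [frame]
    out = list(delayed_iframes) + [frame]
    delayed_iframes.clear()
    return out
```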
In a preferred embodiment, when using the method of dynamic I-frame generation delaying, the generation of P-frames as well as the transmission of P-frames remains independent of the predicted future state of the data flow.
Dynamic P-frame rate adjustment
In a fourth preferred embodiment, dynamic P-frame rate adjustment is used, preferably in those cases where the bandwidth available for data transfer is too low for a transfer of all P-frames.
Thus, if the prediction of the state of the data flow for a given time-interval in the future indicates a limitation of the amount of data that can be transmitted at a given quality below a given level, the transmission of one or more P-frames to be sent within this time-interval may be suppressed. In this case single P-frames are dropped (i.e. erased from a transmission buffer) rather than transmitted, in order to reduce the number of P-frames. Preferably, P-frames are not dropped "in sequence" but out of sequence, stochastically, in order to prevent "jumps" in the decoded video stream.
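A sketch of the stochastic dropping, with the keep probability derived from the ratio of predicted to required bandwidth (this derivation is our assumption):

```python
import random

def thin_pframes(pframes: list[bytes], predicted_kbps: float,
                 required_kbps: float) -> list[bytes]:
    """Suppress P-frames when the predicted bandwidth cannot carry all of
    them. Frames are dropped stochastically (out of sequence) rather than as
    a contiguous run, to avoid visible 'jumps' in the decoded video."""
    if predicted_kbps >= required_kbps:
        return pframes
    keep_probability = predicted_kbps / required_kbps
    return [frame for frame in pframes if random.random() < keep_probability]
```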
Alternatively, instead of simply suppressing the transmission of these P-frames, one or more control signals may be generated controlling the at least one codec not to create a P-frame within this time-interval, or to reduce the number of P-frames to be generated within this time-interval below a certain number or rate.
Dynamic quantization adaptation
In a fifth preferred embodiment, dynamic quantization adaptation is used to adapt the amount of transferred data according to the available bandwidth. Note that there is a negative relationship between the quantization and the quality of a transmitted video picture received and decoded by an addressee: Increasing the quantization results in a reduced quality of the decoded video picture and vice versa. Quantization is data reduction. Quantization typically is the "lossy" part of the video picture compression. It can be compared with the JPEG compression of a (still) picture, as known to the person skilled in the art. With higher quantization the quality of the decoded picture is lower, but the bandwidth needed to transfer this video picture is also reduced. The video codec H.264, e.g., contains 52 quantization levels.
Thus, if the prediction for the state of the data flow for a given time-interval in the future indicates that the amount of data that can be transmitted will be above a given level (i.e. in the case of high available bandwidth), one or more control signals may be generated controlling the at least one codec to reduce the quantization of the data, thus increasing the quality of the video data. On the other hand, if the predictions indicate a low available bandwidth, the quantization of the data may be increased.
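A sketch of such a control rule, using H.264's quantization parameter range of 0-51 (52 levels, as stated above); the bandwidth thresholds and step size are assumptions:

```python
def adapt_quantization(current_qp: int, predicted_kbps: float,
                       high_kbps: float = 256.0, low_kbps: float = 64.0,
                       step: int = 4) -> int:
    """Lower the quantization parameter (finer quantization, better quality)
    when high bandwidth is predicted; raise it (coarser quantization, less
    data) when low bandwidth is predicted. H.264 offers QP 0..51."""
    if predicted_kbps > high_kbps:
        current_qp -= step
    elif predicted_kbps < low_kbps:
        current_qp += step
    return max(0, min(51, current_qp))

print(adapt_quantization(30, predicted_kbps=384.0))  # -> 26 (quality increased)
print(adapt_quantization(30, predicted_kbps=32.0))   # -> 34 (data reduced)
```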
Note that whereas the first four methods and embodiments work on a (very granular) frame level, quantization adaptation will have an influence on a whole group of frames, according to long-term but bigger bandwidth changes (on a temporal timescale of approx. 500 ms). The other four methods typically might be of special use in cases of very fast but rather small short-time changes of the state of the data flow.
Also note that, if besides the video stream an audio stream is to be transmitted, the audio stream typically will have the higher priority. As described above, the mobile device user typically is willing to accept some lost video pictures rather than lost voice fragments. Thus, in this document, the term "available bandwidth" for video streaming is defined as:
available bandwidth = provided bandwidth - audio bandwidth

Furthermore, the present invention includes:
- a computer-loadable data structure that is adapted to perform the method according to one of the embodiments described in this description while the data structure is being executed on a computer,
- a computer program, wherein the computer program is adapted to perform the method according to one of the embodiments described in this description while the program is being executed on a computer,
- a computer program comprising program means for performing the method according to one of the embodiments described in this description while the computer program is being executed on a computer or on a computer network,
- a computer program comprising such program means, wherein the program means are stored on a storage medium readable by a computer,
- a storage medium, wherein a data structure is stored on the storage medium and wherein the data structure is adapted to perform the method according to one of the embodiments described in this description after having been loaded into a main and/or working storage of a computer or of a computer network, and
- a computer program product having program code means, wherein the program code means can be stored or are stored on a storage medium, for performing the method according to one of the embodiments described in this description, if the program code means are executed on a computer or on a computer network.
Brief description of the drawings
For a more complete understanding of the present invention, reference is made to the following description made in connection with accompanying drawings in which:
Fig. 1 shows a schematic overview of the method for dynamic optimization of real-time video data flow between a mobile device and a wireless communication network; and
Fig. 2 shows a schematic diagram of a system for performing the method depicted in Fig. 1 in one of its embodiments.
Detailed description of preferred embodiments
In Fig. 1, a schematic overview of the method for dynamic optimization of real-time video data flow between mobile devices and a mobile network is depicted. In Fig. 2, a physical and/or embedded system is depicted, adapted for realizing the method of Fig. 1 in one or more of its variations. The arrows in Fig. 2 indicate the direction of data flow. In the following, Fig. 2 will be described in conjunction with the respective steps depicted in Fig. 1.
On the left hand side, Fig. 2 shows the typical stack of functional layers of mobile device real-time video streaming, as known to the person skilled in the art from the OSI reference model. Out of the seven OSI layers, in Fig. 2 only the Application Layer 210, the Transport Layer 212, the Data Link Layer 214, and the Physical Layer 216 are depicted. The Network Layer, the Session Layer and the Presentation Layer are omitted for the sake of simplicity, but may be controlled in a similar way.
On the level of the Application Layer 210, several applications 218 comprising applications generating video data may be run, one example being video data acquisition using a cell phone equipped with a video camera, including the respective application software. Also included in the Application Layer are one or more codec modules 220, 222, including codecs for video encoding 220 and for voice encoding 222. As an example, the implementation of the video codec H.264 and the audio codec AMR (adaptive multi-rate) is assumed in the following. The codecs 220, 222 transform the data streams generated by the various applications 218 into encoded data frames 224, which are passed down from the Application Layer 210 via the various other layers to the Physical Layer 216 to be transmitted via the wireless network.
On their way down to the Physical Layer 216, in each of the various OSI layers, the frames 224 may be modified (symbolically depicted by the frames 226 in Fig. 2), especially equipped with additional information (e.g., additional headers), according to the respective protocols used for the type of information to be transmitted. Thus, in the example depicted in Fig. 2, in the Transport Layer 212, the Real-Time Transport Protocol (RTP) 228, the User Datagram Protocol (UDP) 230, and the Internet Protocol (IP) 232 are employed. In the Data Link Layer 214, the Logical Link Control Layer (LLC) 248, the Radio Link Control Layer (RLC) 250, and the Medium Access Control Layer (MAC) 251 are employed in this embodiment.
Each layer is equipped with one or more control modules 234 - 246. These control modules control the functionality of the layers in various ways, depending on the layer itself. Thus, e.g., the Media Control Module 234 controls the settings of the audio and/or video codec(s) (e.g., the quantization, see above) of the Application Layer. The RTP Control Module 236 controls the FEC (forward error correction) packet generation, the I-frame buffering and retransmission as well as the reading out of RTCP (RTP Control Protocol or Real-Time Control Protocol) quality feedback information.
In the example depicted in Fig. 2, in the Data Link Layer 214, the Logical Link Control Layer (LLC) 248 and the Radio Link Control Layer (RLC) 250 are controlled by a common control module 242 (RRC, Radio Resource Control).
Further, most or all of the layers have one or more buffers 252, 254 (symbolically depicted by the hatched boxes in Fig. 2) at their disposal. These buffers may be used for different purposes, such as for storing I-frames for delayed transmission when the prediction of the future state of the data flow indicates a high risk of loss (see above).
Besides controlling the settings and parameters of the single OSI layers, the control modules 234 - 246 allow for easy access to actual parameters of the data flow. Thus, e.g., by accessing the control modules 242 and 244 of the Data Link Layer 214, information on the quality of the transmission (e.g., the Bit Error Rate, BER) can be gained. Further, information on the available resources in each layer, e.g., the fill levels of the various buffers 252, 254, can be obtained. Acquiring these Actual Parameters 256 is the first step 110 of the method depicted in Fig. 1.
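The acquisition step 110 might be sketched in Python as follows; the container layout and the read_parameters() interface of the control modules are assumptions made for the example:

from dataclasses import dataclass, field

@dataclass
class ActualParameters:
    # Snapshot of the current state of the data flow (reference 256).
    bit_error_rate: float                                    # e.g. from the Data Link Layer
    buffer_fill_levels: dict = field(default_factory=dict)   # layer name -> fill level
    coding_scheme: str = ""                                  # from the Physical Layer

def acquire_actual_parameters(control_modules):
    # Poll each control module for the parameters of its layer.
    params = {name: m.read_parameters() for name, m in control_modules.items()}
    return ActualParameters(
        bit_error_rate=params["data_link"]["ber"],
        buffer_fill_levels={n: p.get("buffer_fill", 0.0) for n, p in params.items()},
        coding_scheme=params["physical"]["coding_scheme"],
    )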
These Actual Parameters are passed on to a State Predictor Module 258. The State Predictor Module, based on the information 256, estimates the development of one or more relevant variables indicating the state of the data transfer for the near future (step 112 in Fig. 1). In the preferred embodiment, for this purpose, the algorithm disclosed in DE 102 47 581 is used. Thus, the State Predictor Module 258 may predict that the Bit Error Rate (BER) will be below a level of 10^-9 for the upcoming 10 TTIs (transmission time intervals). These predictions 260 are passed on to a Decider Module 262.
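Since the actual prediction algorithm is the one disclosed in DE 102 47 581, the following stand-in only illustrates the interface of the State Predictor Module; the sliding-window averaging is a deliberately naive placeholder, not the patented algorithm:

from collections import deque

class StatePredictorModule:
    def __init__(self, window=10):
        # Keep a short history of observed BER values (window size assumed).
        self.ber_history = deque(maxlen=window)

    def observe(self, actual_parameters):
        self.ber_history.append(actual_parameters.bit_error_rate)

    def predict_ber(self, horizon_ttis=10):
        # Predict the BER for the upcoming horizon_ttis TTIs as the mean
        # of the recent history (placeholder for the real predictor).
        if not self.ber_history:
            return 0.0
        return sum(self.ber_history) / len(self.ber_history)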
The Decider Module 262 compares the predicted parameters 260 with a set of parameters stored in a Lookup-Table 264. In this Lookup-Table, which preferably consists of a multi-dimensional matrix, the possible states of the future flow control predicted by the State Predictor 258 are divided into a number of "cases", i.e. into a number of intervals for each relevant predicted parameter. Thus, for each case, a set of control parameters is referenced in this Lookup-Table.
Thus, in this embodiment, the "decision" on flow optimization that the Decider Module 262 takes in step 114 basically has the form of a certain set of control parameters 266, which are picked from the Lookup-Table 264 according to the Predictions 260 of the State Predictor 258. These Control Parameters 266 are passed on to the respective control modules 234 - 246, in order to adjust the data flow.
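A flattened sketch of such a table-driven decision is shown below; the patent describes a multi-dimensional matrix of intervals, which is simplified here to an ordered list of (case test, control parameters) pairs, and all thresholds are assumptions:

def decide(predictions, lookup_table):
    # Return the control parameter set of the first matching case.
    for case_matches, control_parameters in lookup_table:
        if case_matches(predictions):
            return control_parameters
    return {}  # no matching case: leave the data flow unchanged

# Illustrative cases only; the values are not taken from the patent.
LOOKUP_TABLE = [
    (lambda p: p["ber"] > 1e-3, {"method": "dynamic_fec", "redundancy": 1.30}),
    (lambda p: p["bandwidth_kbps"] < 64, {"method": "p_frame_rate_adjustment"}),
    (lambda p: True, {"method": "none"}),
]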
As disclosed above, several ways of controlling the data flow are possible (steps 116 - 124 in Fig. 1). The steps disclosed in this invention all refer to optimization of video streaming of encoded video data, but means for optimizing the data flow of encoded audio data may be taken in parallel.
First, the Decider may decide that, according to the predictions of the future data flow, the method of dynamic forward error correction (FEC) 116 is to be employed. As explained above, this method will preferably be chosen when high BERs are predicted for the near future.
When dynamic I-frame forward error correction (FEC) 116 is chosen, control parameters for several control modules may be generated. First, control parameters controlling the Transport Layer 212 or the Data Link Layer 214 to store the most recently transmitted I-frame in one of the buffers 252, 254 may be generated and passed on to one of the control modules 236 - 244. In the preferred embodiment, the I-frame is buffered in the buffer 252 of the RTP 228. Thus, the I-frame can be retransmitted in case a loss during transmission occurs. Besides the RTP 228, there are an RTCP (RTP Control Protocol or Real-Time Control Protocol) module and an FEC for RTP module (both not shown). The RTCP module will get the quality feedback packet from the receiving side. The Decider 262 uses these pieces of information to decide that the I-frame should be retransmitted with additional FEC and signals to the RTP and FEC for RTP modules to retransmit the packet and to create an FEC packet.
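The retransmission decision might look as follows in Python; the object interfaces (feedback.lost_i_frame, rtp_buffer.last_i_frame, fec_module.create_fec_packet) are assumptions standing in for the RTCP, RTP and FEC for RTP modules:

def handle_rtcp_feedback(feedback, rtp_buffer, fec_module):
    # If the receiver reports the loss of the buffered I-frame,
    # retransmit it together with a freshly generated FEC packet.
    if feedback.lost_i_frame and rtp_buffer.last_i_frame is not None:
        i_frame = rtp_buffer.last_i_frame
        fec_packet = fec_module.create_fec_packet(i_frame)
        return [i_frame, fec_packet]  # packets queued for retransmission
    return []  # nothing lost: no retransmission needed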
In this context, when using the term "I-frames", it is obvious that not only the actual I-frames (224 in Fig. 2) are meant, but that the term also may include "modified" I-frames 226, i.e. after having passed layers below the Application Layer 210. These frames, as described above, also include additional information, such as additional headers.
Further, control parameters controlling the Physical Layer 216 or the Data Link Layer 214 to apply a certain schedule of error correction, especially forward error correction (FEC), may be passed on to the control modules 236 - 246. Thus, as an example, in case a BER above 10^-3 is predicted for the upcoming 10 TTIs, the Decider Module 262 may control the Data Link Layer Control Module RRC 242 to increase a redundancy factor (i.e. the factor controlling the error correction information) from 1.15 to 1.30 or change the coding scheme.
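This rule can be expressed compactly; the numbers below are the example values from the preceding paragraph, everything else is an illustrative assumption:

def select_redundancy_factor(predicted_ber, current_factor=1.15):
    # A predicted BER above 1e-3 for the upcoming TTIs raises the
    # redundancy factor from 1.15 to 1.30, as in the example above.
    return 1.30 if predicted_ber > 1e-3 else current_factor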
As explained above, the methods may be combined. E.g., the control parameters may first control the layers to buffer an I-frame and then, in case a loss of this I-frame is detected, to re-transmit it with an increased redundancy factor.
Secondly, the Decider Module 262 may decide that the method of dynamic I-frame generation (118 in Fig. 1) is to be applied. As explained above, this method is especially useful in case the Actual Parameters 256 indicate that the loss of an I-frame could not be prevented and there was no possibility to retransmit a copy of the lost I-frame, or if more than a pre-defined time-interval has passed since the last I-frame generation. In case of dynamic I-frame generation 118, control parameters 266 for the Media Control Module 234 are generated controlling the codec 220 to create an I-frame at the nearest possible point in time.
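The trigger condition for dynamic I-frame generation might be sketched as follows; the parameter names are illustrative:

def should_force_i_frame(i_frame_lost_unrecoverably,
                         time_since_last_i_frame_ms,
                         max_i_frame_interval_ms):
    # Force an I-frame if the last one was lost and could not be
    # retransmitted, or if the pre-defined interval has elapsed.
    return (i_frame_lost_unrecoverably
            or time_since_last_i_frame_ms > max_i_frame_interval_ms)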
Thirdly, the Decider Module 262 may decide that the method of dynamic I-frame generation delaying 120 is to be employed. This method is especially useful if the Actual Parameters 256 indicate that the bandwidth available for data transmission is below a given level. This information can be gained from a readout of the parameters of the Control Modules 242 - 246 by determining the coding scheme used in layer 1, the Physical Layer, the allocated timeslots in the Data Link Layer and the allocated transmission blocks in these timeslots, which can be collected from the RLC 250.
In this case, i.e. if the prediction indicates a high risk of loss of the frames to be transmitted in the near future, the Decider may provide control parameters 266 to the Media Control Module 234 preventing the codec 220 from creating an I-frame within a pre-defined time-interval. Alternatively or additionally, the Decider may provide control parameters 266 to the control modules 234 - 246 of one of the layers, preferably of the Transport Layer 212 or of the Data Link Layer 214, to store the I-frames to be sent within this temporal interval in one of their buffers 252, 254 rather than to transmit them.
As soon as the temporal interval has passed, the fill-level of the buffers may be checked as part of the Actual Parameters 256, in order for the Decider 262 to make a new decision about transmission of the buffered data.
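A minimal sketch of this buffer-and-delay behaviour is given below, assuming plain Python lists for the frame queue and the layer buffer and a loss-risk value normalized to [0, 1]; all of these are assumptions made for the example:

def delay_i_frames(outgoing_frames, layer_buffer, predicted_loss_risk,
                   risk_threshold):
    # While a high risk of loss is predicted, I-frames are stored in a
    # layer buffer instead of being sent; other frames pass through.
    to_send = []
    for frame in outgoing_frames:
        if frame.frame_type == "I" and predicted_loss_risk > risk_threshold:
            layer_buffer.append(frame)  # store rather than transmit
        else:
            to_send.append(frame)
    return to_send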
Fourthly, the Decider Module 262 may decide that the method of dynamic P-frame rate adjustment 122 is to be applied. As mentioned above, this might be the case especially when the Actual Parameters 256, in particular control parameters of the Data Link Layer 214 or the Physical Layer 216, indicate that the bandwidth available for data transfer is too low for a transfer of all P-frames.
In this case, the Decider Module 262 may generate two types of control parameters 266: first, control parameters controlling the Application Layer 210, the Transport Layer 212, the Data Link Layer 214, the Physical Layer 216, or another layer not depicted in Fig. 2, to erase a certain number of P-frames upcoming for transmission from one or more of the buffers, e.g., from the buffers 252, 254. As indicated above, preferably, these P-frames to be erased are selected stochastically rather than in sequence.
Further, the Decider Module 262 may generate control parameters for the Media Control Module 234 of the Application Layer 210 controlling the codec 220 not to create a P-frame within a given time-interval in the future (e.g., the upcoming 10 TTIs) or to reduce the number of P-frames to be generated within this time-interval below a certain number or rate.
Fifthly, the Decider Module 262 may decide that the method of dynamic quantization adaptation 124 is to be applied. As mentioned above, this method may be chosen in cases where the State Predictor 258 indicates that the state of the data flow for a given time-interval in the future, e.g., the upcoming 10 TTIs, will be such that the amount of data that can be transmitted will be above a given level. In this case, the Decider Module 262 may generate one or more control parameters 266 controlling the Media Control Module 234 to operate the codec 220 in a way that the quantization of the data for this time-interval is reduced, thus increasing the quality of the video data.
While the present inventions have been described and illustrated in conjunction with a number of specific embodiments, those skilled in the art will appreciate that variations and modifications may be made without departing from the principles of the inventions as herein illustrated, as described and claimed. The present inventions may be embodied in other specific forms without departing from their spirit or essential characteristics. The described embodiments are considered in all respects to be illustrative and not restrictive. The scope of the inventions is, therefore, indicated by the appended claims, rather than by the foregoing description. All changes which come within the meaning and range of equivalence of the claims are to be embraced within their scope.

Reference List
110 Acquisition of Actual Parameters of the data flow
112 Prediction of the future state of the data flow
114 Decision on flow optimization
116 Dynamic I-frame forward error correction
118 Dynamic I-frame generation
120 Dynamic I-frame generation delaying
122 Dynamic P-frame rate adjustment
124 Dynamic quantization adaptation
210 Application Layer
212 Transport Layer
214 Data Link Layer
216 Physical Layer
218 Application
220 Video codec
222 Voice codec
224 Data frames
226 Modified data frames
228 Real-Time Transport Protocol, RTP
230 User Datagram Protocol, UDP
232 Internet Protocol, IP
234 Media Control Module
236 RTP Control Module
238 UDP Control Module
240 IP Control Module
242 Common Control Module RRC (Radio Resource Control) for LLC (Logical Link Control Layer) and RLC (Radio Link Control Layer)
244 Control Module for MAC (Medium Access Control Layer)
246 Control Module for Physical Layer
248 Logical Link Control Layer (LLC)
250 Radio Link Control Layer (RLC)
251 Medium Access Control Layer (MAC)
252 Buffer
254 Buffer
256 Actual Parameters of the data flow
258 State Predictor Module
260 Predictions on the future state of the data flow
262 Decider Module
264 Lookup-Table
266 Set of Control Parameters

Claims

1. A method for dynamic optimization of real-time video data flow between a mobile device, such as a personal digital assistant (PDA) or a 3G cellular phone, and a wireless communication network;
- wherein the mobile device comprises at least one application generating and encoding video data using at least one codec;
- wherein the encoded video data comprise at least one full (compressed) video picture (I-frame) and at least one frame containing only the changes of a video picture since the last I-frame (P-frame);
- the method comprising the following steps:
a) actual parameters indicating the current state of the data flow are acquired;
b) on the basis of the actual parameters, parameters indicating a future state of the data flow are predicted for a given time-interval;
c) according to the predicted future state of the data flow, one or more of the following steps of dynamic flow optimization are taken:
c1) a copy of each recently sent I-frame is buffered, and if a loss of a recently sent I-frame during transmission is detected, the buffered I-frame is re-transmitted; and/or
c2) if the prediction of the state of the data flow for a given time-interval in the future indicates a risk of loss of the frames to be transmitted within this time-interval being above a pre-defined risk level, the I-frames to be transmitted within this time-interval are sent with additional FEC information; and/or
c3) if since the last successful transmission of an I-frame a time-period longer than a pre-defined time-period has passed, one or more control signals are generated controlling the codec to create an I-frame at the nearest possible point in time; and/or
c4) if the prediction of the state of the data flow for a given time-interval in the future indicates a risk of loss of the frames to be transmitted within this time-interval being above a pre-defined risk level, one or more control signals are generated controlling the codec not to create an I-frame within this time-interval; and/or
c5) if the prediction of the state of the data flow for a given time-interval in the future indicates a risk of loss of the frames to be transmitted within this time-interval being above a pre-defined risk level, I-frames to be sent within this time-interval are buffered, and the transmission is delayed until the time-interval has passed; and/or
c6) if the prediction of the state of the data flow for a given time-interval in the future indicates a limitation of the amount of data that can be transmitted below a given level, the transmission of one or more P-frames to be sent within this time-interval is suppressed; and/or
c7) if the prediction of the state of the data flow for a given time-interval in the future indicates a limitation of the amount of data that can be transmitted below a given level, one or more control signals are generated controlling the codec not to create a P-frame within this time-interval or to reduce the number of P-frames to be generated within this time-interval; and/or
c8) if the prediction of the state of the data flow for a given time-interval in the future indicates that the amount of data that can be transmitted will be at a given level, at least one control signal is generated controlling the codec to adapt the quantization of the video data to be transmitted.
2. A method according to the previous claim, characterized in that
- one or more of the steps c1), c2), c3), c4) or c5) are performed; and
- the encoding and transmission of P-frames remains independent of the predicted future state of the data flow.
3. A method according to one of the previous claims, characterized in that
- step c6) is performed in a way that P-frames, which are selected to be suppressed during transmission, are chosen stochastically out of the video data stream.
4. A method according to one of the previous claims, characterized in that
the decision, which of the steps c1) - c7) is to be taken and/or the set of parameters to be employed for dynamic flow optimization, is based on a set of pre-defined decisions recorded in a lookup-table.
5. A method according to one of the previous claims, characterized in that
audio data are transmitted in parallel to the video data.
6. A method according to one of the previous claims, characterized in that
in step c1) the buffered I-frame is re-transmitted with additional forward error correction (FEC) information.
7. A system for dynamic optimization of real-time video data flow between a mobile device, such as a personal digital assistant (PDA) or a 3G cellular phone, and a wireless communication network;
- wherein the mobile device comprises at least one application generating and encoding video data using at least one codec;
- wherein the encoded video data comprise at least one full (compressed) video picture (I-frame) and at least one frame containing only the changes of a video picture since the last I-frame (P-frame);
the system comprising:
a) means for acquisition of actual parameters indicating the current state of the data flow;
b) means for predicting, based on the actual parameters, parameters indicating a future state of the data flow for a given time-interval;
c) means for choosing one or more of the following steps of dynamic flow optimization according to the predicted future state of the data flow:
c1) buffering a copy of each recently sent I-frame, and if a loss of a recently sent I-frame during transmission is detected, re-transmitting the buffered I-frame; and/or
c2) if the prediction of the state of the data flow for a given time-interval in the future indicates a risk of loss of the frames to be transmitted within this time-interval being above a pre-defined risk level, sending the I-frames to be transmitted within this time-interval with additional FEC information; and/or
c3) if since the last successful transmission of an I-frame a time-period longer than a pre-defined time-period has passed, generating one or more control signals controlling the codec to create an I-frame at the nearest possible point in time; and/or
c4) if the prediction of the state of the data flow for a given time-interval in the future indicates a risk of loss of the frames to be transmitted within this time-interval being above a pre-defined risk level, generating one or more control signals controlling the codec not to create an I-frame within this time-interval; and/or
c5) if the prediction of the state of the data flow for a given time-interval in the future indicates a risk of loss of the frames to be transmitted within this time-interval being above a pre-defined risk level, buffering I-frames to be sent within this time-interval, and delaying the transmission until the time-interval has passed; and/or
c6) if the prediction of the state of the data flow for a given time-interval in the future indicates a limitation of the amount of data that can be transmitted below a given level, suppressing the transmission of one or more P-frames to be sent within this time-interval; and/or
c7) if the prediction of the state of the data flow for a given time-interval in the future indicates a limitation of the amount of data that can be transmitted below a given level, generating one or more control signals controlling the codec not to create a P-frame within this time-interval or to reduce the number of P-frames to be generated within this time-interval; and/or
c8) if the prediction of the state of the data flow for a given time-interval in the future indicates that the amount of data that can be transmitted will be at a given level, generating at least one control signal controlling the codec to adapt the quantization of the video data to be transmitted.
8. At least one of an operating system, a computer readable medium having stored thereon a plurality of computer-executable instructions, a co-processing device, a computing device and a modulated data signal carrying computer executable instructions for performing the method of one of the previous claims referring to a method.
9. At least one computer readable medium comprising computer executable modules including computer executable instructions for dynamic optimization of real-time video data flow between a mobile device, such as a personal digital assistant (PDA) or a 3G cellular phone, and a wireless communication network, wherein the mobile device comprises at least one application generating and encoding video data using a codec, wherein the encoded video data comprise at least one full (compressed) video picture (I-frame) and at least one frame containing only the changes of video pictures (P-frame), the computer executable modules comprising means for performing one or more of the steps c1) - c8) in claim 1.
10. At least one of an operating system, a co-processing device, a computing device and a modulated data signal carrying the computer executable instructions of the computer executable modules of the at least one computer readable medium of the previous claim.
PCT/EP2004/008688 2004-08-03 2004-08-03 Dynamic optimization of wireless real-time video data flow WO2006012911A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/EP2004/008688 WO2006012911A1 (en) 2004-08-03 2004-08-03 Dynamic optimization of wireless real-time video data flow

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/EP2004/008688 WO2006012911A1 (en) 2004-08-03 2004-08-03 Dynamic optimization of wireless real-time video data flow

Publications (1)

Publication Number Publication Date
WO2006012911A1 (en)

Family

ID=34958370

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2004/008688 WO2006012911A1 (en) 2004-08-03 2004-08-03 Dynamic optimization of wireless real-time video data flow

Country Status (1)

Country Link
WO (1) WO2006012911A1 (en)

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2002089516A1 (en) * 2001-04-25 2002-11-07 Fg Microtec Gmbh Quality of service state predictor for advanced mobile devices

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
KHANSARI M ET AL: "A low-complexity error-resilient H.263 coder", SIGNAL PROCESSING. IMAGE COMMUNICATION, ELSEVIER SCIENCE PUBLISHERS, AMSTERDAM, NL, vol. 14, no. 6-8, May 1999 (1999-05-01), pages 493 - 504, XP004165390, ISSN: 0923-5965 *
LEI Z ET AL: "A rate adaptation transcoding scheme for real-time video transmission over wireless channels", SIGNAL PROCESSING. IMAGE COMMUNICATION, ELSEVIER SCIENCE PUBLISHERS, AMSTERDAM, NL, vol. 18, no. 8, September 2003 (2003-09-01), pages 641 - 658, XP004452903, ISSN: 0923-5965 *
TOSUN A S ET AL: "On improving quality of video for H.263 over wireless CDMA networks", WIRELESS COMMUNICATIONS AND NETWORKING CONFERNCE, 2000. WCNC. 2000 IEEE 23-28 SEPTEMBER 2000, PISCATAWAY, NJ, USA,IEEE, vol. 3, 23 September 2000 (2000-09-23), pages 1421 - 1426, XP010532755, ISBN: 0-7803-6596-8 *
YIAO LIU ET AL: "A novel adaptive error control scheme for real time wireless video streaming", INFORMATION TECHNOLOGY: RESEARCH AND EDUCATION, 2003. PROCEEDINGS. ITRE2003. INTERNATIONAL CONFERENCE ON AUG. 11-13, 2003, PISCATAWAY, NJ, USA,IEEE, 11 August 2003 (2003-08-11), pages 460 - 463, XP010685492, ISBN: 0-7803-7724-9 *

Cited By (26)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7969979B2 (en) 2003-09-29 2011-06-28 Runcom Technologies Ltd. Distribution of multicast data to users
US7519038B2 (en) 2006-03-06 2009-04-14 Hitachi, Ltd. Adaptive EDCA algorithm using traffic prediction in wireless local area networks
KR100982880B1 (en) * 2007-08-31 2010-09-16 (주)아이뮤직소프트 Contents upload system and method
US9878241B2 (en) * 2007-12-05 2018-01-30 Sony Interactive Entertainment America Llc System and method for utilizing forward error correction with video compression
US20160007045A1 (en) * 2007-12-05 2016-01-07 Sony Computer Entertainment America Llc System and Method for Utilizig Forward Error Correction With Video Compression
WO2010051703A1 (en) * 2008-11-10 2010-05-14 华为技术有限公司 A method, an apparatus and a video transmission system for repairing the video data stream
EP2412160A4 (en) * 2009-03-23 2017-08-02 Sony Computer Entertainment America LLC System and method for utilizing forward error correction with video compression
KR101732200B1 (en) 2009-03-23 2017-05-02 소니 인터랙티브 엔터테인먼트 아메리카 엘엘씨 System and method for utilizing forward error correction with video compression
WO2011012009A1 (en) * 2009-07-29 2011-02-03 中兴通讯股份有限公司 Method, device and system for error control in wireless video communication system
US8601153B2 (en) 2009-10-16 2013-12-03 Qualcomm Incorporated System and method for optimizing media playback quality for a wireless handheld computing device
US9124642B2 (en) 2009-10-16 2015-09-01 Qualcomm Incorporated Adaptively streaming multimedia
US8723913B2 (en) 2010-10-07 2014-05-13 T-Mobile Usa, Inc. Rate adaptation for video calling
US9131103B2 (en) 2010-10-07 2015-09-08 T-Mobile Usa, Inc. Video presence sharing
US9706047B2 (en) 2010-10-07 2017-07-11 T-Mobile Usa, Inc. Video presence sharing
CN101964904A (en) * 2010-10-11 2011-02-02 北京中科大洋科技发展股份有限公司 Method for realizing live broadcast of television news by using wireless communication network
CN101964904B (en) * 2010-10-11 2012-08-29 北京中科大洋科技发展股份有限公司 Method for realizing live broadcast of television news by using wireless communication network
CN102572362A (en) * 2010-12-15 2012-07-11 盛乐信息技术(上海)有限公司 Video signal transmission method
CN102572362B (en) * 2010-12-15 2016-04-06 盛乐信息技术(上海)有限公司 Video-signal transmission method
CN102143348A (en) * 2011-01-18 2011-08-03 中国联合网络通信集团有限公司 Video uploading system and video uploading method for mobile terminal
US8498401B2 (en) 2011-07-21 2013-07-30 T-Mobile Usa, Inc. Mobile-to-mobile call determination
US9118801B2 (en) 2011-10-24 2015-08-25 T-Mobile Usa, Inc. Optimizing video-call quality of service
CN102378012A (en) * 2011-11-26 2012-03-14 南京邮电大学 Data hiding-based H.264 video transmission error code recovery method
WO2020020705A1 (en) 2018-07-27 2020-01-30 Q.ant GmbH Laser light source and laser projector having same
CN113242438A (en) * 2021-04-12 2021-08-10 郑州阿帕斯数云信息科技有限公司 Video data transmission method and device
CN113923513A (en) * 2021-09-08 2022-01-11 浙江大华技术股份有限公司 Video processing method and device
CN113923513B (en) * 2021-09-08 2024-05-28 浙江大华技术股份有限公司 Video processing method and device

Similar Documents

Publication Publication Date Title
WO2006012911A1 (en) Dynamic optimization of wireless real-time video data flow
CN108476091B (en) Method, system and user equipment for determining transmission conditions of real-time media stream of wireless communication network
Wu et al. Improving multipath video transmission with raptor codes in heterogeneous wireless networks
KR100907194B1 (en) Radio base station apparatus and scheduling method
CN101491138B (en) Compressed delay packet transmission scheduling
US8165083B2 (en) Communication system, base station, and mobile station
US8406199B2 (en) Data flow amount control device and data flow amount control method
US9451248B2 (en) Data processing device and data processing method
JP5721699B2 (en) Wireless communication apparatus and wireless communication method
JP4659838B2 (en) Device for predictively coding a sequence of frames
KR20020080496A (en) Channel quality measurement in data transmission using hybrid arq
WO2014046610A1 (en) A circuit arrangement and method of determining a priority of packet scheduling
CN102160340B (en) Transmission rate control device and transmission rate control method
CN110326357A (en) Data processing method and equipment
CN102611521B (en) Adjusting method for channel quality indicator (CQI)
WO2011013768A1 (en) Wireless terminal and transmission speed prediction method
US10700818B2 (en) System and method for improving efficiency of wirelessly transmitting video packets
KR101533240B1 (en) Rate matching device for controlling rate matching in mobile communication system and method thereof
Rapaport et al. Adaptive HARQ and scheduling for video over LTE
CN111262659B (en) Semi-TCP grouping batch acknowledgement reply method based on fountain codes
EP2843956A1 (en) Method and device for encoding a video
Zhang et al. A novel retransmission scheme for video services in hybrid wireline/wireless networks
CN110839164A (en) Video transmission method and device
Luo et al. Position assisted coordinate HARQ in LTE systems for high speed railway
JP2003018207A (en) Method for improving outgoing traffic channel transmission delay time in radio communication system and base station transmission device

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LU MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

DPEN Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed from 20040101)
NENP Non-entry into the national phase

Ref country code: DE

121 Ep: the epo has been informed by wipo that ep was designated in this application
122 Ep: pct application non-entry in european phase