EP2417766A1 - Method and apparatus for asynchronous video transmission over a communication network - Google Patents

Method and apparatus for asynchronous video transmission over a communication network

Info

Publication number
EP2417766A1
Authority
EP
European Patent Office
Prior art keywords
frame
video
video frame
encoded
request
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP10727998A
Other languages
German (de)
French (fr)
Inventor
Mark Edwards
Dean S. Dyson
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Motorola Solutions Inc
Original Assignee
Motorola Solutions Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Motorola Solutions Inc
Publication of EP2417766A1
Legal status: Withdrawn

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/16Analogue secrecy systems; Analogue subscription systems
    • H04N7/173Analogue secrecy systems; Analogue subscription systems with two-way working, e.g. subscriber sending a programme selection signal
    • H04N7/17309Transmission or handling of upstream communications
    • H04N7/17318Direct or substantially direct transmission and handling of requests
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L65/00Network arrangements, protocols or services for supporting real-time applications in data packet communication
    • H04L65/60Network streaming of media packets
    • H04L65/61Network streaming of media packets for supporting one-way streaming services, e.g. Internet radio
    • H04L65/613Network streaming of media packets for supporting one-way streaming services, e.g. Internet radio for the control of the source by the destination
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/10Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
    • H04N19/102Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the element, parameter or selection affected or controlled by the adaptive coding
    • H04N19/132Sampling, masking or truncation of coding units, e.g. adaptive resampling, frame skipping, frame interpolation or high-frequency transform coefficient masking
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/10Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
    • H04N19/134Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the element, parameter or criterion affecting or controlling the adaptive coding
    • H04N19/162User input
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/10Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
    • H04N19/134Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the element, parameter or criterion affecting or controlling the adaptive coding
    • H04N19/164Feedback from the receiver or from the transmission channel
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/10Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
    • H04N19/169Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding
    • H04N19/17Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding the unit being an image region, e.g. an object
    • H04N19/172Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding the unit being an image region, e.g. an object the region being a picture, frame or field
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/50Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding
    • H04N19/503Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding involving temporal prediction
    • H04N19/51Motion estimation or motion compensation
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/21Server components or server architectures
    • H04N21/218Source of audio or video content, e.g. local disk arrays
    • H04N21/2187Live feed
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/442Monitoring of processes or resources, e.g. detecting the failure of a recording device, monitoring the downstream bandwidth, the number of times a movie has been viewed, the storage space available from the internal hard disk
    • H04N21/44209Monitoring of downstream path of the transmission network originating from a server, e.g. bandwidth variations of a wireless network
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/60Network structure or processes for video distribution between server and client or between remote clients; Control signalling between clients, server and network components; Transmission of management data between server and client, e.g. sending from server to client commands for recording incoming content stream; Communication details between server and client 
    • H04N21/63Control signaling related to video distribution between client, server and network components; Network processes for video distribution between server and clients or between remote clients, e.g. transmitting basic layer and enhancement layers over different transmission paths, setting up a peer-to-peer communication via Internet between remote STB's; Communication protocols; Addressing
    • H04N21/637Control signals issued by the client directed to the server or network components
    • H04N21/6375Control signals issued by the client directed to the server or network components for requesting retransmission, e.g. of data packets lost or corrupted during transmission from server
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/60Network structure or processes for video distribution between server and client or between remote clients; Control signalling between clients, server and network components; Transmission of management data between server and client, e.g. sending from server to client commands for recording incoming content stream; Communication details between server and client 
    • H04N21/63Control signaling related to video distribution between client, server and network components; Network processes for video distribution between server and clients or between remote clients, e.g. transmitting basic layer and enhancement layers over different transmission paths, setting up a peer-to-peer communication via Internet between remote STB's; Communication protocols; Addressing
    • H04N21/637Control signals issued by the client directed to the server or network components
    • H04N21/6377Control signals issued by the client directed to the server or network components directed to server
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/60Network structure or processes for video distribution between server and client or between remote clients; Control signalling between clients, server and network components; Transmission of management data between server and client, e.g. sending from server to client commands for recording incoming content stream; Communication details between server and client 
    • H04N21/63Control signaling related to video distribution between client, server and network components; Network processes for video distribution between server and clients or between remote clients, e.g. transmitting basic layer and enhancement layers over different transmission paths, setting up a peer-to-peer communication via Internet between remote STB's; Communication protocols; Addressing
    • H04N21/637Control signals issued by the client directed to the server or network components
    • H04N21/6377Control signals issued by the client directed to the server or network components directed to server
    • H04N21/6379Control signals issued by the client directed to the server or network components directed to server directed to encoder, e.g. for requesting a lower encoding rate
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/60Network structure or processes for video distribution between server and client or between remote clients; Control signalling between clients, server and network components; Transmission of management data between server and client, e.g. sending from server to client commands for recording incoming content stream; Communication details between server and client 
    • H04N21/65Transmission of management data between client and server
    • H04N21/658Transmission by the client directed to the server
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/60Network structure or processes for video distribution between server and client or between remote clients; Control signalling between clients, server and network components; Transmission of management data between server and client, e.g. sending from server to client commands for recording incoming content stream; Communication details between server and client 
    • H04N21/65Transmission of management data between client and server
    • H04N21/658Transmission by the client directed to the server
    • H04N21/6587Control parameters, e.g. trick play commands, viewpoint selection
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/18Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N7/183Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a single remote source

Definitions

  • the technical field relates generally to communication systems, and in particular, it relates to a method and apparatus for asynchronously transmitting video over wired or wireless networks.
  • video, in addition to or in place of audio, is transmitted over wired and wireless networks.
  • Various protocols and network requirements and usages impose limitations, however, on the transmission of video.
  • meeting public safety requirements for live video transmission in a wireless network is difficult today.
  • Content passes through a number of nodes, routers, and/or computers on its way to the end user or client.
  • bandwidth is the data transmission rate capacity or maximum amount of information, in bits/second, that can be transmitted along a channel
  • end user clients buffer streamed video content such that a certain amount of the video data is downloaded before it begins to play so that the end client is working through previously received video data while more video data is being downloaded.
  • Video is more susceptible than audio to such interruptions.
  • a video encoder for example one included in a video codec that can both encode and decode a digital data stream or signal
  • available data bandwidth in a wireless network can vary significantly during a video session due to client mobility (e.g., due to signal strength variations, cell handovers, sharing with other clients, etc.).
  • For a mobile video client in a wireless network, the network frequently is not able to provide enough bandwidth to transport the video encoder's total bit stream for at least a portion of the video session due to, for example, traffic congestion. This can lead to pauses and breaks in the video transmission and reception and a steady increase in the end-to-end delay, making it difficult to watch the video with any degree of continuity. Thus, a client's operational requirements for video will not be reliably met, even in a broadband wireless network.
  • bandwidth variations mean that video servers have to employ "workarounds", such as manually reconfiguring video encoder bit rates during the video session in an attempt to match prevailing wireless conditions or configuring the encoder to work continuously at a minimum acceptable bandwidth and hope that the chosen bandwidth is available through the entire video session.
  • an end-to-end delay builds up, such as at nodes and routers, in the network infrastructure, and the video sequence is interrupted as bandwidth drops below the minimum level needed.
  • FIG. 1 is a block diagram illustrating a video streaming system in accordance with some embodiments.
  • FIG. 2 is a high level flow diagram of a method for asynchronous live video transmission in accordance with some embodiments.
  • FIG. 3 is a detailed flow diagram of a method for asynchronous live video transmission in accordance with some embodiments.
  • Skilled artisans will appreciate that elements in the figures are illustrated for simplicity and clarity and have not necessarily been drawn to scale. For example, the dimensions of some of the elements in the figures may be exaggerated relative to other elements to help improve understanding of various elements. In addition, the description and drawings do not necessarily require the order illustrated. It will be further appreciated that certain actions and/or steps may be described or depicted in a particular order of occurrence while those skilled in the art will understand that such specificity with respect to sequence is not actually required.
  • a video server implements a method for asynchronous video transmission over one or more communication networks.
  • the method includes: receiving, from a video client, a frame request for a single video frame, wherein the frame request includes a first set of encoding parameters determined by the video client; capturing a video frame that has not been previously sent to the video client, wherein the video frame comprises a portion of live video captured by a video camera; encoding the video frame using the first set of encoding parameters to produce one encoded video frame; and transmitting the one encoded video frame such that only the one encoded video frame is in transit over the communication network in response to the frame request.
  • the video server uses a motion compensation algorithm to encode the video frames as predicted (P) frames and intra (I) frames, wherein I-frame periodicity is determined by the video client.
  • the video client can request retransmission of lost P frames and I frames from the video server, wherein the video server is able to adapt P frame encoding if necessary depending on information from the video client regarding which frame has been lost.
  • the video server can generate P frames asynchronously, and the video client accordingly decodes P frames asynchronously.
  • the video server receives a frame request (or a retransmission request) having a second set of encoding parameters that is different from the first set. The video server then: updates its codec encoding parameters using the second set of encoding parameters; retrieves and encodes a next video frame (or a previously sent video frame needing retransmission) using the second encoding parameters; and transmits (retransmits) the encoded video frame in response to the frame request (or retransmission request).
  • the second set of encoding parameters are determined by the video client in response to detecting an increased delay in receiving an encoded video frame or in response to a failure to receive the encoded video frame altogether, wherein the video client adjusts the encoding parameters for the video server to decrease the size of the encoded video frame so that an acceptable frame rate is achieved at the video client given the current network bandwidth conditions and/or video client connectivity to the network.
  • the video client changes the encoding parameters to balance a tradeoff between quality of displayed video versus frame rate at the video client.
  • the video client changes the encoding parameters in response to improved network bandwidth and/or video client network connectivity.
  • Such a dynamic methodology enables the video client to minimize breaks and delays in the video stream being displayed on the client side when the network is experiencing bandwidth limitations or the video client is experiencing network connectivity limitations.
  • the video client can adjust the encoding parameters to enable higher resolution and/or frame rate when network bandwidth and/or connectivity improves.
  • FIG. 1 there is shown a block diagram illustrating an asynchronous live video transmission system 100 (i.e., a video transmission system that asynchronously transmits live video frames) in accordance with principles of the present disclosure.
  • asynchronous i.e., a video transmission system that asynchronously transmits live video frames
  • the term "asynchronous”, “asynchronously”, or other variations of the word is defined as of or pertaining to operations without the use of fixed time intervals, or may also be defined as having a operation start only after a previous operation has been completed.
  • live video is defined as an electronic representation of live scenes that is captured by a video camera in real time as the scenes are occurring, wherein the live video comprises a sequence of video frames (also referred to herein as live video frames), wherein a video frame represents the smallest media unit of live video that may be processed and transmitted by a video server over a communication network.
  • Video transmission system 100 uses a client-pull technology, wherein a video server transmits encoded video frames only in response to a frame request (or retransmission request) from a video client.
  • System 100 comprises a video server 104 and a video client 132 communicatively coupled together over a communication network 130.
  • Communication network 130 may represent one or more wired or wireless communication networks, or combinations thereof, and includes one or more infrastructure devices used to facilitate communications (including live video communications) over the network 130.
  • An infrastructure device is a device that is part of a fixed network infrastructure and can receive information (either control or media, e.g., voice (audio), video, etc.) in a signal and transmit information in signals to one or more end user devices via a communication link.
  • Examples of infrastructure devices include, but are not limited to, equipment commonly referred to as controllers, base stations, base transceiver stations, access points, routers, and the like.
  • System 100 further comprises a video camera 102 operatively coupled to the video server 104, wherein the video camera 102 provides live video that is processed by the video server and transmitted as a sequence of encoded video frames over communication network 130 to the video client 132.
  • system 100 includes a video display 134 operatively coupled to the video client 132 to display video frames that are received and decoded by the video client.
  • Video server 104 comprises an analog video input 106 that receives live video from the video camera 102 in an analog signal format such as: an RGB (red, green, blue) component analog signal carried as three separate signals, wherein the video input 106 includes three BNC or RCA electrical jacks coupled to the video camera 102 via one or more suitable cables having corresponding plugs or connectors at both ends of the cable; or a composite analog signal such as NTSC, PAL, or SECAM, wherein the video input 106 includes an RCA jack coupled to the video camera 102 via a suitable cable having corresponding plugs or connectors at both ends of the cable.
  • analog video input 106 comprises known elements (e.g., processing and transceiver (i.e., transmitter and receiver) elements, wherein some of the functionality of the processing and transceiver may be performed in processing device(s) 110) that enable short range wireless transmission techniques such as Bluetooth technology.
  • Video server 104 further comprises: one or more processing devices 110; an input/output (I/O) interface 108; and one or more memory units 120.
  • I/O interface 108 may comprise a serial port interface (e.g., RS-232-C, EIA-232-D, or EIA-232-E) or known elements (e.g., processing and transceiver elements, wherein some of the functionality of the processing and transceiver may be performed in processing device(s) 110) that enable short range wireless transmission techniques such as Bluetooth technology.
  • I/O interface 108 comprises known elements including processing, modulating, and transceiver elements that are operable in accordance with any one or more standard or proprietary wireless interfaces, wherein some of the functionality of the processing, modulating, and transceiver elements may be performed in processing device(s) 110.
  • Processing device(s) 110 comprise logic including, but not limited to, a frame capturer 112, a codec 114, and a frame request processor 116.
  • the frame capturer 112 "captures" or pulls analog video frames from the video camera 102 via the analog video input 106 and converts the analog frames into a raw digital format, also known as a raw image file or a digital negative.
  • a raw image file is defined as minimally processed data from the image sensor of a video camera that is not usable as an image having a viewable format but has all of the information needed to create an image that has a viewable format. If the analog frames are provided by the video camera 102 at a faster rate than is requested by the video client 132, the video server 104 is configured to "drop" some of the analog frames or raw frames. CODEC 114 encodes the raw image file using programmed encoding parameters to generate an encoded video frame having a viewable format that may or may not use motion compensation (i.e., the use of I, P, and/or B frames) such as, for instance, JPEG, MPEG, H.264, etc.
  • the encoding algorithm used generally compresses the amount of data that needs to be transmitted relative to the data included in the corresponding raw video frame.
  • the frame request processor 116 receives and processes frame requests and retransmission requests received at the I/O interface 108 from the video client 132.
  • Memory unit(s) 120 include suitable data storage capability to implement: a transmit buffer 122 that stores one or more encoded video frames that have not yet been transmitted to the video client; and a frame storage 124 that stores one or more previously transmitted frames, such as a last transmitted (Tx) frame 126 that may be used in response to a retransmission request and/or motion compensation processing.
  • the video server 104 stores a most recently transmitted video frame in frame storage 124 to produce a recorded video frame, and uses the recorded video frame as a reference frame to derive a next frame for encoding and transmitting as a predicted frame.
  • the transmit buffer is implemented as a rolling buffer that stores a plurality of sequential encoded video frames.
  • each encoded video frame is transmitted such that only one encoded video frame is transmitted in response to any given frame (or retransmission) request processed by the frame request processor 116 and such that only one encoded video frame at a time is in transit over the communication network 130 in response to a given frame (or retransmission) request.
  • Video client 132 comprises an input/output (I/O) interface 136 suitable for receiving the encoded video frames over the communication network 130 depending on the type of network being used.
  • I/O interface 136 may comprise a serial port interface (e.g., RS-232-C, EIA-232-D, or EIA-232-E) or known elements (e.g., processing and transceiver elements, wherein some of the functionality of the processing and transceiver may be performed in processing device(s) 140) that enable short range wireless transmission techniques such as Bluetooth technology.
  • I/O interface 136 comprises known elements including processing, modulating, and transceiver elements that are operable in accordance with any one or more standard or proprietary wireless interfaces (as corresponds to those implemented in the video server), wherein some of the functionality of the processing, modulating and transceiver elements may be performed in processing device(s) 140.
  • Video client 132 further comprises: one or more memory storage units 150; one or more processing devices 140; and a display driver 138 that includes the necessary processing and hardware for presenting the decoded video frames to the video display 134 so that the live video can be seen by a viewer of the video display 134.
  • Memory unit(s) 150 include at least a receive buffer 152 and frame storage 154 and may also include further storage capability for storing application software used to program processing device(s) 140, thereby enabling the video client functionality in accordance with the teachings herein.
  • Receive buffer 152 stores a single encoded video frame that is currently being received via interface 136 (in response to a frame or retransmission request) and that will next be processed by the video client.
  • Frame storage 154 stores a last encoded frame 156 that was received and processed by the video client. The last received (Rx) video frame 156 is stored, for example, for use in motion compensation processing.
  • Processing device(s) 140 comprise at least a frame request processor 142 and a CODEC 144.
  • Frame request processor 142 generates frame requests and if needed retransmission requests, each for a single encoded video frame from the video server 104. In one embodiment, the frame request processor 142 generates a next frame request only after receiving an encoded video frame into receive buffer 152. In another embodiment, the frame request processor 142 generates a next frame request only after receiving an encoded video frame into receive buffer 152 and after the CODEC 144 properly decodes the frame that is stored in the receive buffer 152.
  • the frame request processor 142 can change encoding parameters such as: resolution (which means a number of pixels included in a video frame); quantization (which means a number of bits used to encode each pixel of the video frame); whether a next frame to be encoded is an intra (I) frame or a predicted (P) frame; an intra frame period or I frame periodicity (which means a ratio of P frames generated for every I frame generated); etc.
  • an I frame is an encoded video frame generated using a motion compensation algorithm or process, which contains all necessary rendering information within itself for the video client to decode and display the video frame.
  • a P frame is an encoded video frame generated using a motion compensation algorithm or process, which contains only information on the differences from a previous frame.
  • the frame request processor 142 can monitor its current network connectivity and current network conditions such as available bandwidth in the network. If the network connectivity falls below a threshold or if the level of congestion in the network 130 rises above a threshold, then the frame request processor 142 can adjust the encoding parameters to decrease the size of the encoded video frame, for example, by decreasing the resolution and/or the quantization of the encoded video frame, in order to maintain a certain frame rate at the video client.
  • the frame request processor 142 can adjust the encoding parameters to increase the size of the encoded video frame, for example, by increasing the resolution and/or the quantization of the encoded video frame to maintain the frame rate at the video client.
  • the frame request processor 142 includes the newly determined encoding parameters in a frame request or retransmission request to the video server.
  • the frame request processor 142 is connected to the I/O interface 136 wherein it can measure one or more parameters that indicate signal connectivity such as received signal strength, signal-to-noise ratio, signal-to-interference ratio, and the like.
  • the frame request processor 142 also contains logic used to determine the existence and perhaps the level of network congestion or network loading within communication network 130.
  • each encoded video frame may include a time stamp provided by the video server that indicates the time that the video frame was encoded. The frame request processor 142 can then compare the time stamp to the time at which the encoded video frame was received to determine a time delay of receipt of the encoded video frame.
  • the frame request processor can determine that there is a decrease in available bandwidth within network 130 and determine new encoding parameters to decrease the size of the encoded video frame. Also, some communication protocols enable the video client to receive information regarding available network bandwidth directly from infrastructure devices within the network, and the video client can, thereby, adjust the encoding parameters responsive to receiving such data from the network.
  • In a worst case scenario, the video client 132 never receives a transmitted encoded video frame or receives it "late." In a simplest implementation, the frame request processor 142 sets a timer upon sending a frame request and then determines that a frame is lost if an encoded video frame is not received in response to the frame request before expiration of the timer.
  • the frame request processor 142 thereupon requests a new I frame from the video server 104.
  • Video sequence numbers (also referred to herein as frame numbers) can be used to facilitate this implementation so that the video client can request (within the frame request) that the next I frame have a certain sequence (frame) number and so that both the video server and video client are aware of a next encoded video frame in the sequence. Moreover, if the "lost" frame eventually arrives at the video client (as indicated by the sequence number), the video client simply discards the late frame.
  • the frame request processor 142 may send a retransmission request to the video server to retransmit the frame and may further include new encoding parameters that change (in this case decrease) the size of the next one or more encoded video frames.
  • the frame request processor 142 could issue the retransmission request without changing the encoding parameters.
  • the frame request processor 142 sends a frame request for encoded video frame (N+1) to the server with an indication that encoded video frame N did not arrive. Note that the default in the frame request is to indicate to the server that the last frame did arrive and was successfully decoded.
  • If the video server 104 is configured to send P frames, and a P frame is now due, the video server encodes the next P frame based on the last frame that it knows the client successfully received; this requires frame storage 124 to store at least the last transmitted frame and the frame before that, which was confirmed as received by the video client 132. Where the video server is next due to send an I frame, then an I frame is sent.
  • the encoded video frame might be late at the video client because network bandwidth has suddenly reduced (due to, for example, wireless fading, handover to a busy cell, handover from a broadband network to a narrowband network, etc.).
  • the video client 132, upon determining that the cause of the late or lost frame is bandwidth limitations, requests that encoded frame N be retransmitted with a smaller size, using a change in encoding parameters.
  • When the video server 104 learns that the last transmitted encoded video frame has not arrived (for instance upon receiving a second request from the video client for frame N), it clears out its own transmit buffer in case the congestion is on its own wireless uplink and sends a Null frame to the video client 132 to indicate that the missing frame was actually attempted and to indicate its size.
  • the Null frame serves the purpose of indicating to the video client that it was the encoded video frame that was lost, and not the frame request, and further indicates the size of the encoded video frame to assist the video client in determining how to respond.
  • the video client 132 can respond to this information by: sending a retransmission request without changing the encoding parameters, so the video server re-sends the encoded video frame at the same size; sending a retransmission request with adjusted encoding parameters that specifies, for instance, a lower resolution or quantization, which results in the retransmitted frame having a smaller size, thereby, making it less likely to be held up in the network; requesting a next new encoded video frame using the current encoding parameters; or requesting a next new encoded video frame using new encoding parameters.
  • the video client's decision of how to proceed is based on its own estimate of network latency, which it can derive by examining the delay between sending its last frame request and receiving the video server's Null frame. If the delay is larger than a configured threshold, the video client assumes congestion and requests a smaller frame size (lower resolution and/or quantization). If the delay is smaller than the threshold, the video client assumes that the problem is just a temporary loss of coverage and requests the same size frame with unchanged resolution and quantization.
  • the frame request processor 142 may decide to change the encoding parameters at the video server for other reasons. For example, it may be important for a user viewing video on video display 134 to see high resolution video (perhaps to see some greater detail in the video); so the frame request processor would change the encoding parameters to increase the resolution at the expense of having a decreased frame rate (i.e., rate at which successive encoded video frames are received into the receive buffer). Alternatively, the user may determine that a high frame rate is more desirable at the expense of resolution, and the frame request processor 142 accordingly adjusts the encoding parameters. The user (via a graphical user interface) may adjust the encoding parameters in response to user preference, or the frame request processor 142 may automatically adjust the encoding parameters in response to network connectivity and/or available network bandwidth.
  • CODEC 144 decodes the encoded video frame sitting in the receive buffer 152 using programmed decoding parameters that correspond to the encoding parameters programmed into CODEC 114 to generate a video frame having a format that is viewable on video display 134.
  • CODEC 144 may use codec standards such as JPEG, MPEG, H.264, etc., that may or may not use a motion compensation algorithm. Where motion compensation is used, the CODEC 144 retrieves the last Rx frame 156 from frame storage 154 to properly decode the current frame where the current frame is a P frame.
  • a flow diagram is shown illustrating a high level method 200 for asynchronous live video transmission in accordance with an embodiment of the teachings herein.
  • the steps of method 200 do not necessarily have to be performed in the order indicated, and more or fewer of the steps may be performed at any given point in time in order to implement the teachings herein.
  • the frame request processor 116 in the video server 104 receives, from the video client 132 via the I/O interface 108, a frame request for a single video frame, wherein the frame request includes a first set of encoding parameters determined by the video client.
  • the frame capturer 112 captures a video frame that has not been previously sent to the video client 132, wherein the video frame comprises a portion of live video captured by the video camera 102 and received into analog video input 106.
  • CODEC 114 encodes the video frame using the first set of encoding parameters to produce one encoded video frame that is provided to the transmit buffer 122 and transmitted via I/O interface 108 to the video client 132 such that only the one encoded video frame is in transit over the communication network in response to the frame request.
  • the video server need not transmit encoded frames at a known synchronous frame rate.
  • the rate at which frames are transmitted is asynchronous and is determined by the video client since each encoded video frame transmitted by the video server is transmitted only in response to a frame (or retransmission) request and only one encoded video frame is in transit over the communication network 130 in response to a given frame (or retransmission) request.
  • Such asynchronous video transmission enables a control loop mechanism that allows the video client to adjust the rate of video frame transmission in response to changing network conditions and network connectivity and user preference and to match the video server CODEC configurations to available network bandwidth. This enables a customer's maximum latency requirements to be met over changing network conditions.
  • FIG. 3 a flow diagram is shown illustrating a detailed method 300 for asynchronous live video transmission in accordance with an embodiment of the teachings herein.
  • the steps of method 300 do not necessarily have to be performed in the order indicated, and more or fewer of the steps may be performed at any given point in time in order to implement the teachings herein.
  • the frame request processor 116 in the video server 104 receives (302) a frame request.
  • If the frame request processor 116 determines (304) that there are new encoding parameters (i.e., a second set of encoding parameters that are different from a first (current) set of encoding parameters) included in the frame request to change the size of a next encoded video frame (by, for instance, changing one or more of the frame rate at the encoder, the resolution of the encoded video frames, the quantization of the encoded video frames, etc.), the frame request processor 116 updates (306) the encoding parameters for the CODEC 114 with the new (second set of) encoding parameters so that the next one or more video frames are encoded using the new (second set of) encoding parameters. If not, the next one or more video frames are encoded using the current (first) encoding parameters.
  • the frame request may further include a frame number for a next encoded frame.
  • If the frame request processor 116 determines that the frame request is a retransmission request, the last transmitted frame 126 is retrieved (312) from frame storage 124. If the frame request contains a particular frame number, the encoded video frame having that frame number is retrieved from frame storage 124.
  • If the retransmission request contained updated encoding parameters (for instance to change the size of the encoded video frame): the last transmitted frame undergoes transcoding (320) in the codec 114 using the updated encoding parameters in the retransmission request; as appropriate, a frame number and/or time stamp to indicate when the video frame was encoded may be inserted with the encoded video frame during the codec processing 320; the encoded video frame having a different frame size is provided to the transmit buffer 122 for temporary storage before retransmission (322) via the I/O interface 108 to the video client 132 in response to the retransmission request; and frame storage 124 is updated (324) with the last transmitted encoded video frame.
  • If the retransmission request does not contain updated encoding parameters, the last transmitted frame can simply be pulled from frame storage 124 into the transmit buffer 122 for retransmission to the video client.
  • If the frame request processor 116 determines that the frame request is not a request for retransmission but is a request for a new encoded video frame, the subsequent video server processing depends on whether (310) the video server implements the transmit buffer 122 as a rolling buffer (this branching is consolidated in the code sketch following this list).
  • If the video server implements a rolling buffer, and it is determined (310) that the next encoded frame is in the rolling buffer, and the frame request contained (314) updated encoding parameters (for instance to change the size of the encoded video frame): the next encoded frame is retrieved (312) from the rolling buffer to undergo transcoding (320) in the codec 114 using the updated encoding parameters in the frame request; as appropriate, a frame number and/or time stamp to indicate when the video frame was encoded may be inserted with the encoded video frame; the encoded video frame having a different frame size from the previous encoded video frame is provided to the transmit buffer 122 for temporary storage before transmission (322) via the I/O interface 108 to the video client 132 in response to the frame request; and frame storage 124 is updated (324) with the last transmitted encoded video frame.
  • Otherwise, the frame capturer 112 captures (316) the next analog frame via the analog video input 106, said capturing including converting (318) the analog frame to a raw digital format.
  • the raw digital frame undergoes codec processing (320) using the encoding parameters (i.e., first encoding parameters) used to encode the last video frame or using updated (i.e., second) encoding parameters if updated encoding parameters were included in the frame request, and as appropriate, a frame number and/or time stamp to indicate when the video frame was encoded may be inserted with the encoded video frame.
  • the encoded video frame is provided to the transmit buffer 122 for temporary storage before transmission (322) via the I/O interface 108 to the video client 132 in response to the frame request; and frame storage 124 is updated (324) with the last transmitted encoded video frame.
  • Coupled as used herein is defined as connected, although not necessarily directly and not necessarily mechanically.
  • a device or structure that is “configured” in a certain way is configured in at least that way, but may also be configured in ways that are not listed.
  • The sequence of steps in a flow diagram, or of elements in the claims, even when preceded by a letter, does not imply or require that sequence.
  • some embodiments may be comprised of one or more generic or specialized processors such as microprocessors, digital signal processors, customized processors, and field programmable gate arrays (FPGAs), and unique stored program instructions (including both software and firmware) that control the one or more processors to implement, in conjunction with certain non-processor circuits, some, most, or all of the functions of the method and apparatus for asynchronous video transmission described herein.
  • the non-processor circuits may include, but are not limited to, a radio receiver, a radio transmitter, signal drivers, clock circuits, power source circuits, and client input devices.
  • these functions may be interpreted as steps of a method to perform the asynchronous video transmission described herein.
  • some or all functions could be implemented by a state machine that has no stored program instructions, or in one or more application specific integrated circuits (ASICs), in which each function or some combinations of certain of the functions are implemented as custom logic.
  • Both the state machine and ASIC are considered herein as a "processing device" for purposes of the foregoing discussion and claim language.
  • an embodiment can be implemented as a computer-readable storage element or medium having computer readable code stored thereon for programming a computer (e.g., comprising a processing device) to perform a method as described and claimed herein.
  • Examples of such computer-readable storage elements include, but are not limited to, a hard disk, a CD-ROM (Compact Disc Read-Only Memory), an optical storage device, a magnetic storage device, a ROM (Read Only Memory), a PROM (Programmable Read Only Memory), an EPROM (Erasable Programmable Read Only Memory), an EEPROM (Electrically Erasable Programmable Read Only Memory) and a Flash memory.
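
As a consolidated illustration of the video server branching described in the bullets above (method 300, FIG. 3), the following Python sketch shows one pass of request handling: update the codec when the request carries a second set of encoding parameters, then either retransmit a stored frame (transcoding it if the parameters changed) or capture and encode a new frame. The object model (req, codec, frame_storage, transmit_buffer) is an illustrative assumption for this sketch, not an implementation prescribed by the patent.

    import time

    def process_request(req, codec, frame_storage, transmit_buffer, capture, send):
        """Handle one frame (or retransmission) request, per method 300."""
        if req.new_params is not None:
            codec.set_params(req.new_params)          # step 306: update encoding parameters
        if req.is_retransmission:
            frame = frame_storage.last_tx_frame()     # step 312: retrieve last Tx frame 126
            if req.new_params is not None:
                frame = codec.transcode(frame)        # step 320: re-encode at the new size
        else:
            raw = capture()                           # steps 316/318: capture and digitize
            frame = codec.encode(raw)                 # step 320: encode as an I or P frame
            frame.timestamp = time.time()             # optional time stamp of encoding time
        transmit_buffer.put(frame)                    # temporary storage before sending
        send(frame)                                   # step 322: exactly one frame in transit
        frame_storage.update_last_tx(frame)           # step 324: update frame storage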

Abstract

A method (200) for asynchronous live video transmission over a communication network includes: receiving (202), from a video client, a frame request for a single video frame, wherein the frame request includes a first set of encoding parameters determined by the video client; capturing (204) a video frame that has not been previously sent to the video client, wherein the video frame comprises a portion of live video captured by a video camera; encoding (206) the video frame using the first set of encoding parameters to produce one encoded video frame; and transmitting (208) the one encoded video frame such that only the one encoded video frame is in transit over the communication network in response to the frame request.

Description

METHOD AND APPARATUS FOR ASYNCHRONOUS VIDEO TRANSMISSION OVER A COMMUNICATION NETWORK
TECHNICAL FIELD
[0001] The technical field relates generally to communication systems, and in particular, it relates to a method and apparatus for asynchronously transmitting video over wired or wireless networks.
BACKGROUND
[0002] More and more, video, in addition to or in place of audio, is transmitted over wired and wireless networks. Various protocols and network requirements and usages impose limitations, however, on the transmission of video. For example, meeting public safety requirements for live video transmission in a wireless network is difficult today. Content passes through a number of nodes, routers, and/or computers on its way to the end user or client. At any point along the way, too much traffic or a narrowing of the bandwidth (wherein bandwidth is the data transmission rate capacity or maximum amount of information, in bits/second, that can be transmitted along a channel) will cause the streamed video content to slow down or pause. Therefore, end user clients buffer streamed video content such that a certain amount of the video data is downloaded before it begins to play so that the end client is working through previously received video data while more video data is being downloaded.
[0003] Video is more susceptible than audio to such interruptions. On the one hand, a video encoder (for example one included in a video codec that can both encode and decode a digital data stream or signal) compresses and digitizes video to generate a bit stream at a data rate determined by the encoder's compression algorithm and encoding parameter configuration for required resolution, frame rate, and quality. On the other hand, available data bandwidth in a wireless network can vary significantly during a video session due to client mobility (e.g., due to signal strength variations, cell handovers, sharing with other clients, etc.). For a mobile video client in a wireless network, frequently the network is not able to provide enough bandwidth to transport the video encoder's total bit stream for at least a portion of the video session due to, for example, traffic congestion. This can lead to pauses and breaks in the video transmission and reception and a steady increase in the end-to-end delay making it difficult to watch the video with any degree of continuity. Thus, a client's operational requirements for video will not be reliably met, even in a broadband wireless network.
[0004] Currently, where video streaming mechanisms are used over a wireless network, bandwidth variations mean that video servers have to employ "workarounds", such as manually reconfiguring video encoder bit rates during the video session in an attempt to match prevailing wireless conditions or configuring the encoder to work continuously at a minimum acceptable bandwidth and hope that the chosen bandwidth is available through the entire video session. In products in which neither of the aforementioned options is followed, an end-to-end delay builds up, such as at nodes and routers, in the network infrastructure, and the video sequence is interrupted as bandwidth drops below the minimum level needed.
[0005] Current solutions to enhance video streaming in mobile wireless environments have drawbacks. The wireless data bandwidth available to a video codec can change rapidly and significantly over the duration of a video session. This can lead to a mismatch where the video codec generates data at a higher rate than can be transported over a wireless network, resulting in breaks and delays in the received video stream. It is difficult to build a control loop that can dynamically reconfigure the video codec's frame rate and resolution to match the bandwidth available. This is a problem for, for example, public safety customers who have strong requirements for latency to not exceed a maximum time frame, say, for instance, two seconds.
[0006] Accordingly, it is desirable to have a novel video transmission mechanism that minimizes latency when transmitting video frames over a communication network.
BRIEF DESCRIPTION OF THE FIGURES
[0007] The accompanying figures, where like reference numerals refer to identical or functionally similar elements throughout the separate views, together with the detailed description below, are incorporated in and form part of the specification and serve to further illustrate various embodiments of concepts that include the claimed invention, and to explain various principles and advantages of those embodiments.
[0008] FIG. 1 is a block diagram illustrating a video streaming system in accordance with some embodiments.
[0009] FIG. 2 is a high level flow diagram of a method for asynchronous live video transmission in accordance with some embodiments.
[0010] FIG. 3 is a detailed flow diagram of a method for asynchronous live video transmission in accordance with some embodiments.
[0011] Skilled artisans will appreciate that elements in the figures are illustrated for simplicity and clarity and have not necessarily been drawn to scale. For example, the dimensions of some of the elements in the figures may be exaggerated relative to other elements to help improve understanding of various elements. In addition, the description and drawings do not necessarily require the order illustrated. It will be further appreciated that certain actions and/or steps may be described or depicted in a particular order of occurrence while those skilled in the art will understand that such specificity with respect to sequence is not actually required.
[0012] Apparatus and method components have been represented where appropriate by conventional symbols in the drawings, showing only those specific details that are pertinent to understanding the various embodiments so as not to obscure the disclosure with details that will be readily apparent to those of ordinary skill in the art having the benefit of the description herein. Thus, it will be appreciated that for simplicity and clarity of illustration, common and well-understood elements that are useful or necessary in a commercially feasible embodiment may not be depicted in order to facilitate a less obstructed view of these various embodiments.
DETAILED DESCRIPTION
[0013] Generally speaking, pursuant to the various embodiments, a video server implements a method for asynchronous video transmission over one or more communication networks. The method includes: receiving, from a video client, a frame request for a single video frame, wherein the frame request includes a first set of encoding parameters determined by the video client; capturing a video frame that has not been previously sent to the video client, wherein the video frame comprises a portion of live video captured by a video camera; encoding the video frame using the first set of encoding parameters to produce one encoded video frame; and transmitting the one encoded video frame such that only the one encoded video frame is in transit over the communication network in response to the frame request.
[0014] In one embodiment, the video server uses a motion compensation algorithm to encode the video frames as predicted (P) frames and intra (I) frames, wherein I-frame periodicity is determined by the video client. In addition, the video client can request retransmission of lost P frames and I frames from the video server, wherein the video server is able to adapt P frame encoding if necessary depending on information from the video client regarding which frame has been lost. Thus, the video server can generate P frames asynchronously, and the video client accordingly decodes P frames asynchronously.
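To make the exchange in paragraphs [0013]-[0014] concrete, the following Python sketch shows a video server serving exactly one encoded frame per frame request. It is a minimal sketch: the EncodingParams and FrameRequest types and the capture/encode/send callables are hypothetical names introduced here for illustration, not structures defined by the patent.

    from dataclasses import dataclass

    @dataclass
    class EncodingParams:
        width: int          # horizontal resolution in pixels
        height: int         # vertical resolution in pixels
        quantization: int   # bits used to encode each pixel
        frame_type: str     # "I" (intra) or "P" (predicted)

    @dataclass
    class FrameRequest:
        params: EncodingParams      # first set of encoding parameters, chosen by the client
        frame_number: int           # sequence number of the frame being requested
        last_frame_ok: bool = True  # default: last frame arrived and was decoded

    def handle_frame_request(request, capture_raw_frame, encode, send):
        """Serve exactly one encoded video frame in response to one frame request."""
        raw = capture_raw_frame()              # a live frame not previously sent
        encoded = encode(raw, request.params)  # encode with the client-supplied parameters
        send(encoded)                          # only this one frame is in transit

Because the server transmits only in response to a request, the effective frame rate is set entirely by the client, which is what makes the transmission asynchronous.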
[0015] In yet another embodiment, the video server receives a frame request (or a retransmission request) having a second set of encoding parameters that is different from the first set of encoding parameters. The video server: updates its codec encoding parameters using the second set of encoding parameters; retrieves and encodes a next video frame (or a previously sent video frame needing retransmission) using the second set of encoding parameters; and transmits (retransmits) the encoded video frame in response to the frame request (or retransmission request).
[0016] In an embodiment, the second set of encoding parameters is determined by the video client in response to detecting an increased delay in receiving an encoded video frame or in response to a failure to receive the encoded video frame altogether, wherein the video client adjusts the encoding parameters for the video server to decrease the size of the encoded video frame so that an acceptable frame rate is achieved at the video client given the current network bandwidth conditions and/or video client connectivity to the network. In another embodiment, the video client changes the encoding parameters to balance a tradeoff between the quality of displayed video and the frame rate at the video client. In yet another embodiment, the video client changes the encoding parameters in response to improved network bandwidth and/or video client network connectivity.
[0017] Such a dynamic methodology enables the video client to minimize breaks and delays in the video stream being displayed on the client side when the network is experiencing bandwidth limitations or the video client is experiencing network connectivity limitations. By the same token, the video client can adjust the encoding parameters to enable a higher resolution and/or a higher frame rate when network bandwidth and/or connectivity improves. Those skilled in the art will realize that the above recognized advantages and other advantages described herein are merely illustrative and are not meant to be a complete rendering of all of the advantages of the various embodiments.
[0018] Referring now to the figures, and in particular FIG. 1, there is shown a block diagram illustrating an asynchronous live video transmission system 100 (i.e., a video transmission system that asynchronously transmits live video frames) in accordance with principles of the present disclosure. As used herein, the term "asynchronous", "asynchronously", or other variations of the word is defined as of or pertaining to operations without the use of fixed time intervals, or may also be defined as having an operation start only after a previous operation has been completed. The term live video is defined as an electronic representation of live scenes that is captured by a video camera in real time as the scenes are occurring, wherein the live video comprises a sequence of video frames (also referred to herein as live video frames), wherein a video frame represents the smallest media unit of live video that may be processed and transmitted by a video server over a communication network.
[0019] Video transmission system 100 uses a client-pull technology, wherein a video server transmits encoded video frames only in response to a frame request (or retransmission request) from a video client. More particularly, a novel aspect of the teachings herein is that upon receipt of a single frame request, the video server sends only one encoded video frame to the video client such that only the one encoded video frame is in transit over the communication network in response to the single frame request. In addition, the video client, upon detecting and analyzing network conditions, can adjust the encoding parameters in response to the network conditions and include the changed encoding parameters within a frame request or retransmission request so that the video server can encode a next video frame to be transmitted with the new encoding parameters.
[0020] Turning again to the details of FIG. 1, system 100 comprises a video server 104 and a video client 132 communicatively coupled together over a communication network 130. Communication network 130 may represent one or more wired or wireless communication networks, or combinations thereof, and includes one or more infrastructure devices used to facilitate communications (including live video communications) over the network 130. An infrastructure device is a device that is part of a fixed network infrastructure and can receive information (either control or media, e.g., voice (audio), video, etc.) in a signal and transmit information in signals to one or more end user devices via a communication link. Examples of infrastructure devices include, but are not limited to, equipment commonly referred to as controllers, base stations, base transceiver stations, access points, routers, and the like.
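Seen from the client side, the pull discipline of paragraph [0019] amounts to a simple request/receive loop. The sketch below is an editorial illustration under assumed names (pull_video and the client attributes are not from the disclosure):

    # Sketch of the client-pull loop: one request out, one encoded frame back.
    def pull_video(client) -> None:
        params = client.initial_params()                  # first set of encoding parameters
        while client.displaying:
            client.transport.send_frame_request(params)   # request a single frame
            encoded = client.transport.receive_frame()    # the one frame in transit
            client.display.show(client.codec.decode(encoded))
            params = client.maybe_adjust_params(params)   # react to network conditions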
[0021] System 100 further comprises a video camera 102 operatively coupled to the video server 104, wherein the video camera 102 provides live video that is processed by the video server and transmitted as a sequence of encoded video frames over communication network 130 to the video client 132. Finally, system 100 includes a video display 134 operatively coupled to the video client 132 to display video frames that are received and decoded by the video client.
[0022] Video server 104 comprises an analog video input 106 that receives live video from the video camera 102 in an analog signal format such as: an RGB (red, green, blue) component analog signal carried as three separate signals, wherein the video input 106 includes three BNC or RCA electrical jacks coupled to the video camera 102 via one or more suitable cables having corresponding plugs or connectors at both ends of the cable; or a composite analog signal such as NTSC, PAL, or SECAM, wherein the video input 106 includes an RCA jack coupled to the video camera 102 via a suitable cable having corresponding plugs or connectors at both ends of the cable. In another embodiment, analog video input 106 comprises known elements (e.g., processing and transceiver (i.e., transmitter and receiver) elements, wherein some of the functionality of the processing and transceiver may be performed in processing device(s) 110) that enable short range wireless transmission techniques such as Bluetooth technology.
[0023] Video server 104 further comprises: one or more processing devices 110 programmed with logic to convert the analog video signal into digital video frames, to encode the digital video frames to produce encoded video frames, and to perform additional functionality as needed or as described herein; one or more memory units 120 that temporarily store video frames (usually encoded video frames) prior to and after transmission and may further store application software used to program processing device(s) 110, thereby enabling video server functionality in accordance with the teachings herein; and an input/output (I/O) interface 108 suitable for transmitting the encoded video frames over the communication network 130 depending on the type of network being used.
[0024] For example, where network 130 comprises a wired network, I/O interface 108 may comprise a serial port interface (e.g., RS-232-C, EIA-232-D, or EIA-232-E) or known elements (e.g., processing and transceiver elements, wherein some of the functionality of the processing and transceiver may be performed in processing device(s) 110) that enable short range wireless transmission techniques such as Bluetooth technology. Where network 130 comprises a wireless network, I/O interface 108 comprises known elements including processing, modulating, and transceiver elements that are operable in accordance with any one or more standard or proprietary wireless interfaces, wherein some of the functionality of the processing, modulating, and transceiver elements may be performed in processing device(s) 110.
[0025] Processing device(s) 110 comprise logic including, but not limited to, a frame capturer 112, a codec 114, and a frame request processor 116. The frame capturer 112 "captures" or pulls analog video frames from the video camera 102 via the analog video input 106 and converts the analog frames into a raw digital format, also known as a raw image file or a digital negative. A raw image file is defined as minimally processed data from the image sensor of a video camera that is not usable as an image having a viewable format but has all of the information needed to create an image that has a viewable format. If the analog frames are provided by the video camera 102 at a faster rate than is requested by the video client 132, the video server 104 is configured to "drop" some of the analog frames or raw frames. CODEC 114 encodes the raw image file using programmed encoding parameters to generate an encoded video frame having a viewable format that may or may not use motion compensation (i.e., the use of I, P, and/or B frames) such as, for instance, JPEG, MPEG, H.264, etc. The encoding algorithm used generally compresses the amount of data that needs to be transmitted relative to the data included in the corresponding raw video frame. The frame request processor 116 receives and processes frame requests and retransmission requests received at the I/O interface 108 from the video client 132.
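The frame-dropping behavior described in paragraph [0025] is typically realized by draining the capture source and keeping only the newest frame. A sketch under assumed camera method names:

    # Sketch: frames arriving faster than they are requested are dropped,
    # keeping only the most recent one. The camera methods are assumptions.
    def capture_latest(camera):
        latest = None
        while camera.frame_ready():     # drain everything queued since the last request
            latest = camera.read_raw()  # earlier frames are implicitly dropped
        return latest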
[0026] Memory unit(s) 120 include suitable data storage capability to implement: a transmit buffer 122 that stores one or more encoded video frames that have not yet been transmitted to the video client; and a frame storage 124 that stores one or more previously transmitted frames, such as a last transmitted (Tx) frame 126 that may be used in response to a retransmission request and/or in motion compensation processing. When motion compensation processing is used, the video server 104 stores a most recently transmitted video frame in frame storage 124 to produce a recorded video frame, and uses the recorded video frame as a reference frame to derive a next frame for encoding and transmitting as a predicted frame. In one embodiment, the transmit buffer is implemented as a rolling buffer that stores a plurality of sequential encoded video frames. This helps to eliminate latency at the video server 104 between receiving a frame request from the video client 132 and transmitting the encoded video frame in response to the request, since there is some latency involved with the encoding process. However, even where the CODEC 114 encodes a plurality of sequential video frames that are stored in the rolling buffer 122, each encoded video frame is transmitted such that only one encoded video frame is transmitted in response to any given frame (or retransmission) request processed by the frame request processor 116 and such that only one encoded video frame at a time is in transit over the communication network 130 in response to a given frame (or retransmission) request.
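A rolling buffer of the kind described in paragraph [0026] can be pictured as a bounded deque. The class below is an editorial sketch (the capacity and the names are assumptions), and it is reused in the dispatch sketch following the FIG. 3 discussion:

    # Sketch of a rolling transmit buffer; the capacity of 4 is an assumption.
    from collections import deque

    class RollingBuffer:
        def __init__(self, capacity: int = 4):
            self.frames = deque(maxlen=capacity)  # the oldest frames roll off

        def push(self, frame_number: int, encoded_frame: bytes) -> None:
            self.frames.append((frame_number, encoded_frame))

        def pop_next(self):
            # Return the oldest pre-encoded frame, or None if empty; encoding
            # ahead of requests hides codec latency from the response path.
            return self.frames.popleft() if self.frames else None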
[0027] Video client 132 comprises an input/output (I/O) interface 136 suitable for receiving the encoded video frames over the communication network 130 depending on the type of network being used. For example, where network 130 comprises a wired network, I/O interface 136 may comprise a serial port interface (e.g., RS-232-C, EIA-232-D, or EIA-232-E) or known elements (e.g., processing and transceiver elements, wherein some of the functionality of the processing and transceiver may be performed in processing device(s) 140) that enable short range wireless transmission techniques such as Bluetooth technology. Where network 130 comprises a wireless network, I/O interface 136 comprises known elements including processing, modulating, and transceiver elements that are operable in accordance with any one or more standard or proprietary wireless interfaces (as corresponds to those implemented in the video server), wherein some of the functionality of the processing, modulating, and transceiver elements may be performed in processing device(s) 140.
[0028] Video client 132 further comprises: one or more memory storage units 150 that include suitable data storage capability for storing one or more encoded video frames received via the I/O interface 136 from the video server 104; one or more processing devices 140 programmed with logic to compose frame requests and retransmission requests each for a single encoded video frame, to decode the encoded video frames (received in response to said frame requests and retransmission requests), and to perform additional functionality as needed or as described herein; and a display driver 138 that includes the necessary processing and hardware for presenting the decoded video frames to the video display 134 so that the live video can be seen by a viewer of the video display 134.
[0029] Memory unit(s) 150 include at least a receive buffer 152 and frame storage 154 and may also include further storage capability for storing application software used to program processing device(s) 140, thereby enabling the video client functionality in accordance with the teachings herein. Receive buffer 152 stores a single encoded video frame that is currently being received via interface 136 (in response to a frame or retransmission request) and that will next be processed by the video client. Frame storage 154 stores a last encoded frame 156 that was received and processed by the video client. The last received (Rx) video frame 156 is stored, for example, for use in motion compensation processing.
[0030] Processing device(s) 140 comprise at least a frame request processor 142 and a CODEC 144. Frame request processor 142 generates frame requests and, if needed, retransmission requests, each for a single encoded video frame from the video server 104. In one embodiment, the frame request processor 142 generates a next frame request only after receiving an encoded video frame into receive buffer 152. In another embodiment, the frame request processor 142 generates a next frame request only after receiving an encoded video frame into receive buffer 152 and after the CODEC 144 properly decodes the frame that is stored in the receive buffer 152.
[0031] In generating a frame request or a retransmission request, the frame request processor 142 can change encoding parameters such as: resolution (which means a number of pixels included in a video frame); quantization (which means a number of bits used to encode each pixel of the video frame); whether a next frame to be encoded is an intra (I) frame or a predicted (P) frame; an intra frame period or I frame periodicity (which means a ratio of P frames generated for every I frame generated), etc. As used herein, an I frame is an encoded video frame generated using a motion compensation algorithm or process, which contains all necessary rendering information within itself for the video client to decode and display the video frame. As used herein, a P frame is an encoded video frame generated using a motion compensation algorithm or process, which contains only information on the differences from a previous frame.
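The adjustable fields enumerated in paragraph [0031] suggest a small request message. The dataclass below is an editorial sketch; the field names and defaults are assumptions, not the disclosed wire format:

    # Hypothetical frame-request message carrying the [0031] parameters.
    from dataclasses import dataclass
    from typing import Optional, Tuple

    @dataclass
    class FrameRequest:
        resolution: Tuple[int, int]          # pixels: (width, height)
        quantization: int                    # bits used to encode each pixel
        force_i_frame: bool = False          # ask for the next frame as an I frame
        i_frame_period: int = 10             # P frames generated per I frame
        frame_number: Optional[int] = None   # requested sequence number, if used
        is_retransmission: bool = False      # retransmit a previously sent frame
        last_frame_received: bool = True     # default: last frame arrived and decoded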
[0032] To determine the encoding parameters for one or more frames, the frame request processor 142 can monitor its current network connectivity and current network conditions such as available bandwidth in the network. If the network connectivity falls below a threshold or if the level of congestion in the network 130 rises above a threshold, then the frame request processor 142 can adjust the encoding parameters to decrease the size of the encoded video frame, for example, by decreasing the resolution and/or the quantization of the encoded video frame, in order to maintain a certain frame rate at the video client. Similarly, if network connectivity improves or the level of congestion decreases, then the frame request processor 142 can adjust the encoding parameters to increase the size of the encoded video frame, for example, by increasing the resolution and/or the quantization of the encoded video frame to maintain the frame rate at the video client. The frame request processor 142 includes the newly determined encoding parameters in a frame request or retransmission request to the video server.
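That adaptation rule might be sketched as follows; the numeric thresholds and the halving/doubling policy are assumptions chosen for illustration, since the disclosure does not prescribe specific values:

    # Illustrative client-side adaptation of encoding parameters.
    # Thresholds (0.3, 0.7, 0.8, 0.2) and scaling factors are assumptions.
    def scale_resolution(resolution, factor):
        width, height = resolution
        return (max(1, int(width * factor)), max(1, int(height * factor)))

    def adapt_encoding(params: dict, connectivity: float, congestion: float) -> dict:
        adjusted = dict(params)
        if connectivity < 0.3 or congestion > 0.7:
            # Degrading conditions: shrink the frame to hold the frame rate.
            adjusted["resolution"] = scale_resolution(params["resolution"], 0.5)
            adjusted["quantization"] = max(1, params["quantization"] - 1)
        elif connectivity > 0.8 and congestion < 0.2:
            # Improving conditions: grow the frame for better quality.
            adjusted["resolution"] = scale_resolution(params["resolution"], 2.0)
            adjusted["quantization"] = params["quantization"] + 1
        return adjusted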
[0033] In one illustrative implementation, the frame request processor 142 is connected to the I/O interface 136, wherein it can measure one or more parameters that indicate signal connectivity such as received signal strength, signal-to-noise ratio, signal-to-interference ratio, and the like. The frame request processor 142 also contains logic used to determine the existence and perhaps the level of network congestion or network loading within communication network 130. For example, each encoded video frame may include a time stamp provided by the video server that indicates the time that the video frame was encoded. The frame request processor 142 can then compare the time stamp to the time at which the encoded video frame was received to determine a time delay of receipt of the encoded video frame. If each successive encoded video frame is received at a slower rate, the frame request processor can determine that there is a decrease in available bandwidth within network 130 and determine new encoding parameters to decrease the size of the encoded video frame. Also, some communication protocols enable the video client to receive information regarding available network bandwidth directly from infrastructure devices within the network, and the video client can thereby adjust the encoding parameters responsive to receiving such data from the network.
[0034] In a worst-case scenario, the video client 132 never receives a transmitted encoded video frame or receives it "late." In a simplest implementation, the frame request processor 142 sets a timer upon sending a frame request and then determines that a frame is lost if an encoded video frame is not received in response to the frame request before expiration of the timer. The frame request processor 142 thereupon requests a new I frame from the video server 104. Video sequence numbers (also referred to herein as frame numbers) can be used to facilitate this implementation so that the video client can request (within the frame request) that the next I frame have a certain sequence (frame) number and so that both the video server and video client are aware of a next encoded video frame in the sequence. Moreover, if the "lost" frame eventually arrives at the video client (as indicated by the sequence number), the video client simply discards the late frame.
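A minimal sketch of the timer-and-sequence-number mechanism of paragraph [0034] follows; the two-second timeout and the transport's poll method are assumptions. On a None return, the client would send a fresh request with force_i_frame set so that both ends agree on the next frame in the sequence.

    # Sketch of the simplest loss detection: a timer plus sequence numbers.
    import time

    def await_frame(transport, expected_number: int, timeout: float = 2.0):
        deadline = time.monotonic() + timeout
        while time.monotonic() < deadline:
            frame = transport.poll()              # hypothetical non-blocking receive
            if frame is None:
                continue
            if frame["frame_number"] == expected_number:
                return frame                      # the expected frame arrived in time
            # A stale (late) frame with an old sequence number: discard it.
        return None                               # timer expired: treat frame as lost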
[0035] Other more complicated mechanisms can be implemented by the frame request processor 142 to deal with delayed or lost video frames. In such a scenario, and depending on a cause of the lost video frame (such cause being determined by the frame request processor 142 and such cause being, for example, an increase in network congestion as indicated, for instance, by a decrease in available bandwidth), the frame request processor 142 may send a retransmission request to the video server to retransmit the frame and may further include new encoding parameters that change (in this case decrease) the size of the next one or more encoded video frames. However, where the cause of the lost video frame is determined to be a result of a temporary loss of connectivity, the frame request processor 142 could issue the retransmission request without changing the encoding parameters.
[0036] More particularly, when the timer expires and encoded video frame (N) has not arrived in response to the corresponding frame request and the video client has determined that a temporary loss in connectivity or coverage is the cause, the frame request processor 142 sends a frame request for encoded video frame (N+1) to the server with an indication that encoded video frame N did not arrive. Note that the default in the frame request is to indicate to the server that the last frame did arrive and was successfully decoded. If the video server 104 is configured to send P frames, and a P frame is now due, the video server encodes the next P frame based on the last frame that it knows the client successfully received, which requires that frame storage 124 store at least the last transmitted frame and the frame before that, which was confirmed as received by the video client 132. Where the video server is next due to send an I frame, then an I frame is sent.
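The server-side reference selection described in paragraph [0036] reduces to choosing which stored frame to predict from; a sketch with assumed attribute names:

    # Sketch of [0036]: predict from the last frame the client actually has.
    def choose_p_frame_reference(frame_storage, client_reported_loss: bool):
        if client_reported_loss:
            # The client missed frame N: use the earlier, confirmed frame.
            return frame_storage.last_confirmed_frame
        # Default case: the last transmitted frame arrived and was decoded.
        return frame_storage.last_transmitted_frame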
[0037] As mentioned above, the encoded video frame might be late at the video client because network bandwidth has suddenly reduced (due to, for example, wireless fading, handover to a busy cell, handover from a broadband network to a narrowband network, etc.). In that case, the video client 132, upon determining that the cause of the late or lost frame is due to bandwidth limitations, requests that encoded frame N be retransmitted with a smaller size, using a change in encoding parameters. For example, as soon as the video server 104 learns that the last transmitted encoded video frame has not arrived (for instance upon receiving a second request from the video client for frame N), it clears out its own transmit buffer in case the congestion is on its own wireless uplink and sends a Null frame to the video client 132 to indicate that the missing frame was actually attempted and to indicate its size. The Null frame serves the purpose of indicating to the video client that the encoded video frame was lost and not the frame request, and further indicates the size of the encoded video frame to assist the video client in determining how to respond. The video client 132 can respond to this information by: sending a retransmission request without changing the encoding parameters, so the video server re-sends the encoded video frame at the same size; sending a retransmission request with adjusted encoding parameters that specifies, for instance, a lower resolution or quantization, which results in the retransmitted frame having a smaller size, thereby making it less likely to be held up in the network; requesting a next new encoded video frame using the current encoding parameters; or requesting a next new encoded video frame using new encoding parameters.
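The server's side of that exchange can be sketched as follows; the Null-frame representation and the helper names are assumptions for illustration:

    # Sketch of the server's Null-frame behavior of paragraph [0037].
    def on_duplicate_request(server, lost_frame_number: int) -> None:
        # Clear queued frames in case the congestion is on our own uplink.
        server.transmit_buffer.clear()
        null_frame = {
            "type": "null",
            "frame_number": lost_frame_number,
            "lost_frame_size": server.frame_storage.size_of(lost_frame_number),
        }
        # Tells the client the frame (not the request) was lost, and how big it was.
        server.transport.send(null_frame)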
[0038] The video client's decision of how to proceed is based on its own estimate of network latency, which it can derive by examining the delay between sending its last frame request and receiving the video server's Null frame. If the delay is larger than a configured threshold, the video client assumes congestion and requests a smaller frame size (lower resolution and/or quantization). If the delay is smaller than the threshold, the video client assumes that the problem is just a temporary loss of coverage and requests the same size frame with unchanged resolution and quantization.
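In code form, the client's decision of paragraph [0038] might look like the following, reusing the adapt_encoding sketch shown earlier (the latency threshold is a configuration assumption):

    # Sketch of the congestion-vs-coverage decision driven by the Null frame.
    def choose_recovery(request_sent_at: float, null_frame_at: float,
                        latency_threshold: float, params: dict) -> dict:
        observed_latency = null_frame_at - request_sent_at
        if observed_latency > latency_threshold:
            # Assume congestion: request retransmission at a reduced frame size.
            return adapt_encoding(params, connectivity=1.0, congestion=1.0)
        # Assume a temporary loss of coverage: retransmit at the same size.
        return params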
[0039] The frame request processor 142 may decide to change the encoding parameters at the video server for other reasons. For example, it may be important for a user viewing video on video display 134 to see high resolution video (perhaps to see some greater detail in the video); so the frame request processor would change the encoding parameters to increase the resolution at the expense of having a decreased frame rate (i.e., rate at which successive encoded video frames are received into the receive buffer). Alternatively, the user may determine that a high frame rate is more desirable at the expense of resolution, and the frame request processor 142 accordingly adjusts the encoding parameters. The user (via a graphical user interface) may adjust the encoding parameters in response to user preference, or the frame request processor 142 may automatically adjust the encoding parameters in response to network connectivity and/or available network bandwidth.
[0040] Turning again to the description of the video client 132, CODEC 144 decodes the encoded video frame sitting in the receive buffer 152 using programmed decoding parameters that correspond to the encoding parameters programmed into CODEC 114 to generate a video frame having a format that is viewable on video display 134. CODEC 144 may use codec standards such as JPEG, MPEG, H.264, etc., that may or may not use a motion compensation algorithm. Where motion compensation is used, the CODEC 144 retrieves the last Rx frame 156 from frame storage 154 to properly decode the current frame where the current frame is a P frame.
[0041] Turning now to FIG. 2, a flow diagram is shown illustrating a high level method 200 for asynchronous live video transmission in accordance with an embodiment of the teachings herein. The steps of method 200 do not necessarily have to be performed in the order indicated, and more or fewer of the steps may be performed at any given point in time in order to implement the teachings herein. At 202, the frame request processor 116 in the video server 104 receives, from the video client 132 via the I/O interface 108, a frame request for a single video frame, wherein the frame request includes a first set of encoding parameters determined by the video client. At 204, the frame capturer 112 captures a video frame that has not been previously sent to the video client 132, wherein the video frame comprises a portion of live video captured by the video camera 102 and received into analog video input 106. At 206, CODEC 114 encodes the video frame using the first set of encoding parameters to produce one encoded video frame that is provided to the transmit buffer 122 and transmitted via I/O interface 108 to the video client 132 such that only the one encoded video frame is in transit over the communication network in response to the frame request.
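Client-side decoding against the stored reference, as described in paragraph [0040], can be sketched as below; the codec method names and frame fields are assumptions:

    # Sketch of [0040]: decode a P frame against the last received frame.
    def decode_frame(codec, frame_storage, encoded: dict):
        if encoded["type"] == "P":
            # A P frame carries only differences from the previous frame.
            image = codec.decode_p(encoded, reference=frame_storage.last_rx_frame)
        else:
            # An I frame is self-contained and needs no reference.
            image = codec.decode_i(encoded)
        frame_storage.last_rx_frame = image   # becomes the next reference
        return image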
[0042] In accordance with method 200, the video server need not transmit encoded frames at a known synchronous frame rate. The rate at which frames are transmitted is asynchronous and is determined by the video client, since each encoded video frame transmitted by the video server is transmitted only in response to a frame (or retransmission) request and only one encoded video frame is in transit over the communication network 130 in response to a given frame (or retransmission) request. Such asynchronous video transmission enables a control loop mechanism that allows the video client to adjust the rate of video frame transmission in response to changing network conditions, network connectivity, and user preference, and to match the video server CODEC configuration to the available network bandwidth. This enables a customer's maximum latency requirements to be met over changing network conditions.
[0043] Turning now to FIG. 3, a flow diagram is shown illustrating a detailed method 300 for asynchronous live video transmission in accordance with an embodiment of the teachings herein. The steps of method 300 do not necessarily have to be performed in the order indicated, and more or fewer of the steps may be performed at any given point in time in order to implement the teachings herein.
[0044] At 302, the frame request processor 116 in the video server 104 receives a frame request. If the frame request processor 116 determines (304) that there are new encoding parameters (i.e., a second set of encoding parameters that are different from a first (current) set of encoding parameters) included in the frame request to change the size of a next encoded video frame (by, for instance, changing one or more of the frame rate at the encoder, the resolution of the encoded video frames, the quantization of the encoded video frames, etc.), the frame request processor 116 updates (306) the encoding parameters for the CODEC 114 with the new (second set of) encoding parameters so that the next one or more video frames are encoded using the new (second set of) encoding parameters. If not, the next one or more video frames are encoded using the current (first set of) encoding parameters. The frame request may further include a frame number for a next encoded frame.
[0045] At 308, if the frame request processor 116 determines that the frame request is a retransmission request, the last transmitted frame 126 is retrieved (312) from frame storage 124. If the frame request contains a particular frame number, the encoded video frame having that frame number is retrieved from frame storage 124. If (314) the retransmission request contained updated encoding parameters (for instance to change the size of the encoded video frame): the last transmitted frame undergoes transcoding (320) in the codec 114 using the updated encoding parameters in the retransmission request; as appropriate, a frame number and/or a time stamp to indicate when the video frame was encoded may be inserted with the encoded video frame during the codec processing 320; the encoded video frame having a different frame size is provided to the transmit buffer 122 for temporary storage before retransmission (322) via the I/O interface 108 to the video client 132 in response to the retransmission request; and frame storage 124 is updated (324) with the last transmitted encoded video frame. If (314) the retransmission request does not contain updated encoding parameters, the last transmitted frame can simply be pulled from frame storage 124 into the transmit buffer 122 for retransmission to the video client.
[0046] If, at 308, the frame request processor 116 determines that the frame request is not a request for retransmission but is a request for a new encoded video frame, the subsequent video server processing depends on whether (310) the video server implements the transmit buffer 122 as a rolling buffer. If the video server implements a rolling buffer, and it is determined (310) that the next encoded frame is in the rolling buffer, and the frame request contained (314) updated encoding parameters (for instance to change the size of the encoded video frame): the next encoded frame is retrieved (312) from the rolling buffer to undergo transcoding (320) in the codec 114 using the updated encoding parameters in the frame request; as appropriate, a frame number and/or a time stamp to indicate when the video frame was encoded may be inserted with the encoded video frame; the encoded video frame having a different frame size from the previous encoded video frame is provided to the transmit buffer 122 for temporary storage before transmission (322) via the I/O interface 108 to the video client 132 in response to the frame request; and frame storage 124 is updated (324) with the last transmitted encoded video frame. If (314) the frame request does not contain updated encoding parameters, the next encoded video frame in the rolling buffer 122 is transmitted (322) to the video client, and frame storage 124 is updated (324) with the last transmitted encoded video frame.
[0047] If the video server does not implement a rolling buffer, the frame capturer 112 captures (316) the next analog frame via the analog video input 106, said capturing including converting (318) the analog frame to a raw digital format. The raw digital frame undergoes codec processing (320) using the encoding parameters (i.e., first encoding parameters) used to encode the last video frame, or using updated (i.e., second) encoding parameters if updated encoding parameters were included in the frame request, and as appropriate, a frame number and/or a time stamp to indicate when the video frame was encoded may be inserted with the encoded video frame. The encoded video frame is provided to the transmit buffer 122 for temporary storage before transmission (322) via the I/O interface 108 to the video client 132 in response to the frame request; and frame storage 124 is updated (324) with the last transmitted encoded video frame.
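Pulling the branches of method 300 together, the dispatch below is an editorial sketch; the step numbers in the comments refer to FIG. 3, and the helper and attribute names (including the RollingBuffer class sketched earlier) are assumptions:

    # Sketch of the FIG. 3 server-side dispatch; names are hypothetical.
    def serve_request(server, request: dict) -> None:
        if request.get("new_params"):                          # decision 304
            server.codec.update_params(request["new_params"])  # step 306
        if request.get("is_retransmission"):                   # decision 308
            frame = server.frame_storage.get(request.get("frame_number"))  # step 312
            if request.get("new_params"):
                frame = server.codec.transcode(frame)          # step 320
        elif server.rolling_buffer and server.rolling_buffer.frames:  # decision 310
            _, frame = server.rolling_buffer.pop_next()        # step 312
            if request.get("new_params"):
                frame = server.codec.transcode(frame)          # step 320
        else:
            raw = server.capture_and_digitize()                # steps 316-318
            frame = server.codec.encode(raw)                   # step 320
        server.transport.send(frame)                           # step 322
        server.frame_storage.store_last(frame)                 # step 324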
[0048] In the foregoing specification, specific embodiments have been described. However, one of ordinary skill in the art will appreciate that various modifications and changes can be made without departing from the scope of the invention as set forth in the claims below. Accordingly, the specification and figures are to be regarded in an illustrative rather than a restrictive sense, and all such modifications are intended to be included within the scope of the present teachings. The benefits, advantages, solutions to problems, and any element(s) that may cause any benefit, advantage, or solution to occur or become more pronounced are not to be construed as critical, required, or essential features or elements of any or all the claims. The invention is defined solely by the appended claims, including any amendments made during the pendency of this application and all equivalents of those claims as issued.
[0049] Moreover, in this document, relational terms such as first and second, top and bottom, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. The terms "comprises," "comprising," "has," "having," "includes," "including," "contains," "containing," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises, has, includes, or contains a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. An element preceded by "comprises ...a", "has ...a", "includes ...a", or "contains ...a" does not, without more constraints, preclude the existence of additional identical elements in the process, method, article, or apparatus that comprises, has, includes, or contains the element. The terms "a" and "an" are defined as one or more unless explicitly stated otherwise herein. The terms "substantially", "essentially", "approximately", "about", or any other version thereof are defined as being close to as understood by one of ordinary skill in the art, and in one non-limiting embodiment the term is defined to be within 10%, in another embodiment within 5%, in another embodiment within 1%, and in another embodiment within 0.5%. The term "coupled" as used herein is defined as connected, although not necessarily directly and not necessarily mechanically. A device or structure that is "configured" in a certain way is configured in at least that way, but may also be configured in ways that are not listed. Also, the sequence of steps in a flow diagram or of elements in the claims, even when preceded by a letter, does not imply or require that sequence.
[0050] It will be appreciated that some embodiments may be comprised of one or more generic or specialized processors (or "processing devices") such as microprocessors, digital signal processors, customized processors, and field programmable gate arrays (FPGAs), and unique stored program instructions (including both software and firmware) that control the one or more processors to implement, in conjunction with certain non-processor circuits, some, most, or all of the functions of the method and apparatus for asynchronous video transmission described herein. The non-processor circuits may include, but are not limited to, a radio receiver, a radio transmitter, signal drivers, clock circuits, power source circuits, and client input devices. As such, these functions may be interpreted as steps of a method to perform the asynchronous video transmission described herein. Alternatively, some or all functions could be implemented by a state machine that has no stored program instructions, or in one or more application specific integrated circuits (ASICs), in which each function or some combinations of certain of the functions are implemented as custom logic. Of course, a combination of the two approaches could be used. Both the state machine and ASIC are considered herein as a "processing device" for purposes of the foregoing discussion and claim language.
[0051] Moreover, an embodiment can be implemented as a computer-readable storage element or medium having computer readable code stored thereon for programming a computer (e.g., comprising a processing device) to perform a method as described and claimed herein. Examples of such computer-readable storage elements include, but are not limited to, a hard disk, a CD-ROM (Compact Disc Read-Only Memory), an optical storage device, a magnetic storage device, a ROM (Read Only Memory), a PROM (Programmable Read Only Memory), an EPROM (Erasable Programmable Read Only Memory), an EEPROM (Electrically Erasable Programmable Read Only Memory), and a Flash memory. Further, it is expected that one of ordinary skill, notwithstanding possibly significant effort and many design choices motivated by, for example, available time, current technology, and economic considerations, when guided by the concepts and principles disclosed herein, will be readily capable of generating such software instructions and programs and ICs with minimal experimentation.
[0052] The Abstract of the Disclosure is provided to allow the reader to quickly ascertain the nature of the technical disclosure. It is submitted with the understanding that it will not be used to interpret or limit the scope or meaning of the claims. In addition, in the foregoing Detailed Description, it can be seen that various features are grouped together in various embodiments for the purpose of streamlining the disclosure. This method of disclosure is not to be interpreted as reflecting an intention that the claimed embodiments require more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive subject matter lies in less than all features of a single disclosed embodiment. Thus the following claims are hereby incorporated into the Detailed Description, with each claim standing on its own as a separately claimed subject matter.

Claims

What is claimed is:
1. A method for asynchronous live video transmission over a communication network, the method comprising: at a video server: receiving, from a video client, a frame request for a single video frame, wherein the frame request includes a first set of encoding parameters determined by the video client; capturing a video frame that has not been previously sent to the video client, wherein the video frame comprises a portion of live video captured by a video camera; encoding the video frame using the first set of encoding parameters to produce one encoded video frame; and transmitting the one encoded video frame such that only the one encoded video frame is in transit over the communication network in response to the frame request.
2. The method of claim 1, wherein capturing a video frame comprises: capturing a video frame comprising an analog format; and converting the video frame into a raw digital format.
3. The method of claim 1, wherein the video frame is encoded using a motion compensation algorithm for encoding predicted (P) frames and intra (I) frames, the method further comprising: recording the encoded video frame to produce a recorded frame; and using the recorded frame as a reference frame to derive a next frame for encoding and transmitting as a P frame.
4. The method of claim 3, wherein I frame periodicity is determined by the video client.
5. The method of claim 1 further comprising: receiving, from the video client, a second frame request for a single video frame, wherein the second frame request includes a second set of encoding parameters for encoding a next video frame to produce a next encoded video frame having a different size than the previously transmitted encoded video frame.
6. The method of claim 5, wherein the second set of encoding parameters is based on a determination by the video client of available bandwidth in the communication network.
7. The method of claim 1 further comprising: capturing a next video frame; receiving, from the video client, a frame request for a next video frame; encoding the next video frame using the first set of encoding parameters to produce a next encoded video frame; and transmitting the next encoded video frame to the video client in response to the request for a next video frame.
8. The method of claim 1 further comprising: receiving a request for retransmission, wherein the request for retransmission includes a second set of encoding parameters to change the size of the encoded video frame; changing the first set of encoding parameters to the second set of encoding parameters; encoding the video frame using the second set of encoding parameters to produce the encoded video frame; and retransmitting the encoded video frame in response to the request for retransmission.
9. The method of claim 8, wherein the second set of encoding parameters acts on the video frame to at least one of: change a resolution of the video frame; or change a quantization of the video frame.
10. The method of claim 8, wherein the second set of encoding parameters is based on a determination by the video client of available bandwidth in the communication network.
11. The method of claim 1, wherein the encoded video frame is transmitted with a frame number.
12. The method of claim 11, wherein a frame number of a next encoded frame is included in the frame request.
13. The method of claim 1, wherein the encoded video frame is transmitted with a time stamp to indicate when the video frame was encoded.
14. The method of claim 1, wherein the video frame is stored, after encoding, in a rolling buffer.
15. A computer-readable storage element having computer readable code stored thereon for programming a computer to perform a method for asynchronous live video transmission over a communication network, the method comprising: receiving, from a video client, a frame request for a single video frame, wherein the frame request includes a first set of encoding parameters determined by the video client; capturing a video frame that has not been previously sent to the video client, wherein the video frame comprises a portion of live video captured by a video camera; encoding the video frame using the first set of encoding parameters to produce one encoded video frame; and transmitting the one encoded video frame such that only the one encoded video frame is in transit over the communication network in response to the frame request.
16. A video server for asynchronous live video transmission over a communication network, the video server comprising: a frame request processor that receives, from a video client, a frame request for a single video frame, wherein the frame request includes a first set of encoding parameters determined by the video client; a frame capturer that captures a video frame that has not been previously sent to a video client, wherein the video frame comprises a portion of live video captured by a video camera; an encoder that is programmed to encode the video frame using the first set of encoding parameters to produce one encoded video frame; and an input/output interface that transmits the one encoded video frame such that only the one encoded video frame is in transit over the communication network in response to the frame request.
17. The video server of claim 16 further comprising: a buffer that temporarily stores the one encoded video frame prior to it being transmitted.
18. The video server of claim 17, wherein the buffer comprises a rolling buffer that temporarily stores at least one other encoded video frame.
19. The video server of claim 16, wherein the encoder is programmed to perform a motion compensation algorithm for generating intra (I) frames and predicted (P) frames having an I frame period determined by the video client.