WO2012020686A1 - Information processing apparatus and method, and program
- Publication number: WO2012020686A1
- Application: PCT/JP2011/067802
- Authority: WIPO (PCT)
Classifications
- H04L7/0041: Arrangements for synchronising receiver with transmitter; correction of synchronization errors by delay of the data signal
- H04N21/44004: Processing of video elementary streams involving video buffer management, e.g. video decoder buffer or video display buffer
- H04J3/0632: Synchronisation of packets and cells, e.g. transmission of voice via a packet network, circuit emulation service [CES]
- H04L47/2416: Traffic control in data switching networks; real-time traffic
- H04L47/28: Flow control; congestion control in relation to timing considerations
- H04N19/00: Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
- H04N21/4302: Content synchronisation processes, e.g. decoder synchronisation
- H04N21/6375: Control signals issued by the client for requesting retransmission, e.g. of data packets lost or corrupted during transmission
- H04N21/6437: Real-time Transport Protocol [RTP]
- H04N21/647: Control signaling between network components and server or clients, e.g. controlling the quality of the video stream, monitoring of network load
Definitions
- the present invention relates to an information processing apparatus, method, and program, and more particularly, to an information processing apparatus, method, and program that can suppress deterioration in content quality.
- In a broadcasting station, there is a system that performs uncompressed synchronous transmission by connecting a camera and its control unit (a so-called CCU (Camera Control Unit)) with an HD-SDI (High Definition Serial Digital Interface) cable.
- There is also a system in which the HD-SDI cable is replaced with an Ethernet (registered trademark) cable, and transmission is performed while maintaining GENLOCK synchronization using IP packets over the Ethernet (registered trademark).
- With a coding method that encodes each picture in units of several lines (a line block), encoding can be started without waiting until all the data of the picture has been input. Therefore, even when the generated encoded data is transmitted over the network and decoded on the receiving side, decoding can be started before all the data of the picture is received. That is, if the network propagation delay is sufficiently small, real-time (immediate) moving image transmission with a delay equal to or shorter than the frame interval becomes possible.
- On the other hand, the receiving device needs to buffer (temporarily hold) the received data in order to cope with delays caused by encoding processing, data transmission processing, QoS control processing, and the like.
- Settings for the encoding processing and the QoS control processing are made according to the buffer time on the receiving side (so that the buffer does not overflow). That is, the image quality of the decoded image and the transmission quality depend on the buffer time.
- Conventionally, the reception buffer time for each transmission device was either set independently of the others or set in advance to a predetermined time (a common reception buffer time was used).
- However, the delay related to data transmission from each transmission device (the transmission delay) often differs from device to device because the data transmission paths and bandwidths differ from each other.
- If the reception buffer time is set without considering this difference in transmission delay (that is, set independently of each other or set in common), the received data may be held longer than the reception buffer time.
- In that case, the extra holding time is not used to improve the image quality or the transmission quality of the decoded image; that is, unnecessary waiting time may occur. In other words, there is a risk that the quality of the content is unnecessarily degraded by the data transmission.
- The present invention has been made in view of such a situation, and its purpose is to allow the time margin of each process to be used for improving image quality and transmission quality, thereby reducing wasted delay time.
- One aspect of the present invention is an information processing apparatus including: adjusting means for adjusting a reception buffer time, which is set for each data transmission and is a buffer time for synchronizing each piece of data in the receiving device, using a difference in transmission delay, which is a delay time generated on the transmission path of each data transmission; and setting means for setting parameters of processing related to each data transmission using the reception buffer time adjusted by the adjusting means.
- The adjusting means can obtain the maximum value of the transmission delays and set the reception buffer time by adding the difference between that maximum value and the transmission delay of each data transmission to a specified reception buffer time, which is a predetermined reception buffer time.
- The processing related to the data transmission can be QoS control processing of the data transmission, and the setting means can set, as parameters of the QoS control processing, a redundant coding block reception waiting time, which is the time from reception of the first packet of a redundant coding block to reception of its last packet, a retransmission packet waiting time, which is the time to wait for a retransmitted packet, and a network jitter corresponding buffer time for absorbing network jitter.
- The data can be encoded at the transmission source, the obtained encoded data transmitted, and the encoded data decoded at the transmission destination, and the setting means can set, as a parameter of the processing, a variable compression encoding delay request time required when the encoded data generated under rate control in the encoding is smoothed and transmitted.
- The information processing apparatus can further comprise accepting means for accepting an image quality request, which is a request related to the image quality of the data, and a transmission quality request, which is a request related to the transmission quality of the data transmission, and the adjusting means can adjust the reception buffer time based on the image quality request and the transmission quality request accepted by the accepting means.
- The adjusting means can set a temporary reception buffer time based on the image quality request and the transmission quality request accepted by the accepting means, and adjust the reception buffer time using the temporary reception buffer time.
- The apparatus may further comprise output means for displaying a GUI that assists input of the image quality request and the transmission quality request accepted by the accepting means.
- One aspect of the present invention is also an information processing method of an information processing apparatus, in which, in data transmission in which synchronized data is transmitted from a plurality of transmission devices to one reception device, the adjusting unit of the information processing apparatus adjusts a reception buffer time, which is set for each data transmission and is a buffer time for synchronizing each piece of data in the reception device, using a difference in transmission delay, which is a delay time generated on the transmission path of each data transmission, and the setting unit of the information processing apparatus sets parameters of processing related to each data transmission using the adjusted reception buffer time.
- One aspect of the present invention is also a program for causing a computer that controls data transmission to function as: adjusting means for adjusting, in data transmission in which synchronized data is transmitted from a plurality of transmission devices to one reception device, a reception buffer time, which is set for each data transmission and is a buffer time for synchronizing each piece of data in the reception device, using a difference in transmission delay, which is a delay time generated on the transmission path of each data transmission; and setting means for setting parameters of processing related to each data transmission using the reception buffer time adjusted by the adjusting means.
- In one aspect of the present invention, a reception buffer time, which is set for each data transmission and is a buffer time for synchronizing each piece of data in the reception device, is adjusted using a difference in transmission delay, which is a delay time generated on the transmission path of each data transmission, and parameters of processing related to each data transmission are set using the adjusted reception buffer time.
- According to the present invention, data can be transmitted. In particular, deterioration in content quality can be suppressed.
- FIG. 17 is a flowchart, continuing from FIG. 16, explaining an example of the flow of the reception buffer time determination processing. Further drawings include a flowchart explaining an example of the flow of the reception buffer dynamic change transmission processing and a block diagram showing a main configuration example of a personal computer to which the present invention is applied.
- FIG. 1 is a block diagram showing a main configuration example of a transmission system to which the present invention is applied.
- In the transmission system 100, image data is transmitted from a plurality of transmission apparatuses (transmission apparatuses 101-1 to 101-N, where N is an integer of 2 or more) to a reception apparatus 103 via a network 102, which is a general-purpose transmission path such as the Internet or a LAN.
- the network 102 includes not only cables and the like, but also devices such as routers and hubs.
- Image data (video data) input to each transmitting apparatus 101 is transmitted in real time (immediately) to the receiving apparatus 103 via the network 102 and output from the receiving apparatus 103. That is, video data input to each transmitting apparatus 101 at a predetermined (for example, normal playback) frame rate is encoded by the transmitting apparatus 101, transmitted to the receiving apparatus 103 as encoded data, decoded by the receiving apparatus 103, and then output from the receiving apparatus 103 at a predetermined (for example, normal playback) frame rate after a predetermined delay time.
- the receiving device 103 synthesizes and outputs (reproduces) the data from the transmitting devices 101 while synchronizing with each other. That is, data transmission from each transmission apparatus 101 is performed so that such synchronous reproduction does not break.
- a camera and its control unit are connected by an HD-SDI cable, and uncompressed synchronous transmission is performed between them.
- this HD-SDI cable has been replaced with an Ethernet (registered trademark) cable, and transmission is performed while performing GENLOCK synchronization using IP packets on the Ethernet (registered trademark).
- Patent Document 1 proposes a coding method in which several lines of each picture of a moving picture are set as one coding block (line block).
- the transmission apparatus 101 can start encoding without waiting until all the data in the picture is input. Further, the transmission apparatus 101 can sequentially transmit the encoded data obtained by the encoding to the reception apparatus 103 via the network 102.
- Also, the receiving apparatus 103 can start decoding before receiving all the data of the picture. If the transmission delay of the network 102 (the delay that occurs in the network 102 during data transmission from the transmitting apparatus 101 to the receiving apparatus 103) is sufficiently small, real-time moving image transmission with a delay equal to or less than the frame interval becomes possible, that is, data transmission in which data can be output from the receiving apparatus 103 at the frame rate at which it is input to the transmitting apparatus 101.
- Conventionally, for such synchronization processing, a reception buffer was provided in the reception unit of the reception device corresponding to each transmission device, and synchronization was realized by setting a different reception buffer time for each reception buffer.
- However, the parameters of the encoding processing and the QoS control processing were not changed according to the reception buffer time of each reception unit corresponding to the plurality of transmission apparatuses or according to the differences between those reception buffer times.
- Even though adjusting the parameters of the encoding processing and the QoS control processing would not increase the transmission delay, they were not adjusted, so delay time was wasted.
- In contrast, the transmission system 100 adjusts the parameters of the encoding processing, the QoS control processing, and the like in each data transmission according to the differences in transmission delay, so that image quality and transmission quality can be improved without increasing the delay time.
- the receiving apparatus 103 includes a transmission unit 111, a reference signal synchronization unit 112, reception units 113-1 to 113-N, an integrated reception buffer time adjustment unit 114, and a synthesis unit 115.
- The transmission unit 111 communicates with each transmission device 101, transmits information supplied from the reference signal synchronization unit 112 and the reception units 113-1 to 113-N, and supplies information supplied from each transmission device 101 to the reference signal synchronization unit 112 and the reception units 113-1 to 113-N.
- the reference signal synchronization unit 112 communicates with the transmission device 101 via the transmission unit 111, and synchronizes the reference signal with the transmission device 101.
- the reference signal is a signal for synchronizing processing between the transmission apparatus 101 and the reception apparatus 103.
- the synchronized reference signal is supplied to the receivers 113-1 to 113-N.
- For example, the receiving apparatus 103 (its reference signal synchronization unit 112) serves as a master and all the transmitting apparatuses 101 synchronize with the receiving apparatus 103; as a result, all the transmitting apparatuses 101 and the receiving apparatus 103 can be synchronized with each other.
- The reception units 113-1 to 113-N correspond to the transmission devices 101-1 to 101-N, respectively; each receives the RTP packets supplied from its corresponding transmission device 101, decodes them, and outputs video data.
- When it is not necessary to distinguish between the reception units 113-1 to 113-N, they are simply referred to as the reception unit 113. Reception units 113 are prepared in advance in a number equal to or greater than the number of transmission devices 101. In other words, the reception device 103 can receive packets transmitted from a number of transmission devices 101 equal to or less than the number of built-in reception units 113.
- the integrated reception buffer time adjustment unit 114 adjusts the reception buffer time of each reception unit 113.
- the synthesizer 115 synthesizes the video data output from each receiver 113, and outputs the synthesized video data from the video output terminal “video OUT” of the receiver 103.
- the video data output timings of the receiving units 113 are synchronized with each other.
- the synthesizing unit 115 appropriately synthesizes these video data based on, for example, a user instruction and outputs them.
- The transmission apparatus 101 includes an encoding unit 131, an FEC (Forward Error Correction) unit 132, an RTP (Real-time Transport Protocol) unit 133, a smoothing unit 134, a reference signal synchronization unit 135, a media synchronization unit 136, an RTCP (RTP Control Protocol) unit 137, an ARQ (Auto Repeat Request) unit 138, and a reception buffer time / processing parameter setting unit 139.
- Video data (moving image data) input from the video input IF “Video IN” via a video camera or the like is supplied to the encoding unit 131.
- the encoding unit 131 performs encoding processing of moving image data using a predetermined encoding method. This encoding method is arbitrary, but a lower delay is desirable. An example of the encoding method will be described later.
- the encoding unit 131 includes a rate control unit 141.
- the rate control unit 141 controls the bit rate of the encoded data generated by the encoding unit 131.
- the encoding unit 131 converts the generated encoded data into an RTP packet and supplies it to the FEC unit 132.
- the FEC unit 132 generates a redundant packet of the RTP packet supplied from the encoding unit 131.
- the RTP unit 133 converts the redundant packet of encoded data supplied from the FEC unit 132 into an RTP packet.
- the smoothing unit 134 temporarily holds the RTP packet supplied from the RTP unit 133, and smoothes the RTP packet to a predetermined data rate before transmission.
- the reference signal synchronization unit 135 communicates with the reference signal synchronization unit 112 of the receiving apparatus 103 that is the transmission destination via the network 102 to synchronize the reference signal clock.
- the reference signal synchronization unit 135 supplies the reference signal synchronized with the receiving apparatus 103 to the media synchronization unit 136.
- The media synchronization unit 136 obtains, from the time supplied from the reference signal synchronization unit 135, a time synchronized with the sampling time of the data input to Video IN, and supplies it to the encoding unit 131 (its internal RTP unit) and to the RTP unit 133. This time is added to each RTP packet as an RTP time stamp.
- the RTCP unit 137 exchanges RTCP messages with the receiving apparatus 103 as a transmission destination, and transmits and receives control messages (QoS control messages) for QoS (Quality of Service) control processing.
- the RTCP unit 137 supplies the acquired control message to the ARQ unit 138 and the reception buffer time / processing parameter setting unit 139.
- the ARQ unit 138 controls the smoothing unit 134 according to the retransmission request message supplied from the RTCP unit 137, and retransmits the requested RTP packet.
- the reception buffer time / processing parameter setting unit 139 sets parameters of the encoding unit 131 and the FEC unit 132 in accordance with various delay times and the like supplied from the RTCP unit 137.
- FIG. 3 is a block diagram illustrating a configuration example of the encoding unit 131 of the transmission apparatus 101.
- the encoding unit 131 performs hierarchical encoding in which image data is hierarchized in descending order of importance with respect to resolution, and encoding is performed for each hierarchy. For example, the encoding unit 131 generates hierarchized data that is hierarchized in descending order of the importance of spatial resolution. Further, for example, the encoding unit 131 generates hierarchical data that is hierarchized in order from data having a high importance regarding the resolution in the time direction.
- the encoding unit 131 generates hierarchized data that is hierarchized in order from the data with the highest importance regarding SNR (Signal to Noise Ratio).
- the encoding unit 131 encodes the hierarchical data generated in this way for each hierarchy.
- As hierarchical encoding, for example, there is the JPEG (Joint Photographic Experts Group) 2000 system, in which each picture of moving image data is wavelet transformed and entropy encoded.
- Although the hierarchical encoding method is arbitrary, a case will be described below in which the encoding unit 131 performs wavelet transform and entropy encoding on each picture of the moving image data.
- the encoding unit 131 includes a wavelet transform unit 161, a quantization unit 162, an entropy encoding unit 163, a rate control unit 141, and an RTP unit 164.
- the wavelet transform unit 161 performs wavelet transform on each picture of the moving image for each of a plurality of lines.
- Wavelet transform is a process of performing analysis filter processing for dividing input data into a low frequency component and a high frequency component in both the horizontal direction and the vertical direction of the screen.
- By the wavelet transform processing, the input data is divided into a component that is low frequency in both the horizontal and vertical directions (LL component), a component that is high frequency in the horizontal direction and low frequency in the vertical direction (HL component), a component that is low frequency in the horizontal direction and high frequency in the vertical direction (LH component), and a component that is high frequency in both the horizontal and vertical directions (HH component).
- the wavelet transform unit 161 recursively repeats such wavelet transform processing a predetermined number of times for the low-frequency component (LL component) in the horizontal and vertical directions obtained by the analysis filter processing. That is, by the wavelet transform process, each picture of the moving image data is divided into a plurality of hierarchized subbands (frequency components) (hierarchical data is generated).
- the entropy encoding unit 163 performs encoding for each subband.
- the image data of each picture of the moving image is input to the wavelet transform unit 161 line by line in the order from the top to the bottom of the image.
- the image data of each line is input by one sample (one column) in order from the left to the right of the image.
- For the image data input in this way, the wavelet transform unit 161 performs analysis filtering in the horizontal direction of the image (horizontal analysis filtering) each time the number of samples required for the analysis filtering has been obtained (as soon as it is obtained). For example, the wavelet transform unit 161 performs horizontal analysis filtering on the baseband image data 181 shown on the left of FIG. 4 every time M columns are input, and divides the data into a low-frequency component (L) and a high-frequency component (H).
- the horizontal analysis filter processing result 182 shown on the right side of FIG. 4 shows a low-frequency component (L) and a high-frequency component (H) in the horizontal direction for N lines divided by the wavelet transform unit 161.
- the wavelet transform unit 161 performs analysis filtering (vertical analysis filtering) in the vertical direction on each component of the horizontal analysis filter processing result 182.
- The wavelet transform unit 161 performs the vertical analysis filtering column by column on the coefficients of the number of lines in the vertical direction required for the vertical analysis filtering.
- As a result, as shown on the left of FIG. 5, the horizontal analysis filter processing result 182 is converted into wavelet transform coefficients of four components: a component that is low frequency in both the horizontal and vertical directions (LL component), a component that is high frequency in the horizontal direction and low frequency in the vertical direction (HL component), a component that is low frequency in the horizontal direction and high frequency in the vertical direction (LH component), and a component that is high frequency in both directions (HH component) (hierarchized data 183).
- Of the obtained analysis filtering results, the HL, LH, and HH components are output as they are until the coefficients of a predetermined hierarchy (division level) are obtained.
- The remaining LL component is subjected to analysis filtering again by the wavelet transform unit 161. That is, for example, the hierarchized data 183 shown on the left of FIG. 5 is converted into the hierarchized data 184 shown on the right of FIG. 5. In the hierarchized data 184, four components, LLLL, LLHL, LLLH, and LLHH, are generated from the LL component.
- FIG. 6 is a diagram illustrating an example of hierarchized data hierarchized up to division level 3 (three hierarchies).
- The hierarchized data 185, divided up to division level 3, includes the 3HL, 3LH, and 3HH components of division level 1 (hierarchy number 3), the 2HL, 2LH, and 2HH components of division level 2 (hierarchy number 2), and the 1LL, 1HL, 1LH, and 1HH components of division level 3 (hierarchy number 1).
- Each time the filtering process is repeated (each time the hierarchy is lowered by one), the number of lines generated is halved.
- the number of baseband lines necessary to generate one line of the coefficient of the final division level (hierarchy number 1) is determined by how many times the filtering process is repeated (the number of hierarchies at the final division level). Usually, the number of hierarchies is predetermined.
- the baseband image data (image data for a plurality of lines) necessary for generating one line of the final division level coefficient is collectively referred to as a line block (or precinct).
- the hatched portion is a coefficient constituting one line block.
- In the case of FIG. 6, the line block is composed of coefficients for one line of each component of hierarchy number 1, coefficients for two lines of each component of hierarchy number 2, and coefficients for four lines of each component of hierarchy number 3.
- The image data before analysis filtering corresponding to these coefficients, that is, image data for eight lines in this example, is also referred to as a line block (or precinct).
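As an illustration of the counting described above, a minimal sketch (our own naming, not from the patent) that computes how many baseband lines make up one line block and how many coefficient lines each hierarchy contributes for a given number of wavelet decomposition levels:

```python
def line_block_layout(num_levels):
    """For a wavelet decomposition with `num_levels` division levels, return
    the number of baseband lines per line block (precinct) and the number of
    coefficient lines contributed by each hierarchy.

    Hierarchy number 1 (the final, coarsest division level) contributes one
    line; each finer hierarchy contributes twice as many lines.
    """
    baseband_lines = 2 ** num_levels                    # e.g. 3 levels -> 8 lines
    lines_per_hierarchy = {h: 2 ** (h - 1)              # 1, 2, 4, ... lines
                           for h in range(1, num_levels + 1)}
    return baseband_lines, lines_per_hierarchy

# Division level 3, as in FIG. 6: one line block spans 8 baseband lines, with
# 1, 2 and 4 coefficient lines at hierarchy numbers 1, 2 and 3 respectively.
print(line_block_layout(3))  # (8, {1: 1, 2: 2, 3: 4})
```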
- The quantization unit 162 quantizes the coefficients of each component generated by the wavelet transform unit 161, for example by dividing each coefficient by a quantization step size, and generates quantization coefficients. At this time, the quantization unit 162 can set the quantization step size for each line block (precinct). Since a line block includes coefficients of all frequency components of a certain image region (10 frequency components from 1LL to 3HH in the case of FIG. 6), quantizing per line block makes use of the advantage of multi-resolution analysis, which is a feature of the wavelet transform. Further, since only as many quantization step sizes as there are line blocks need to be determined for the entire screen, the quantization load can be reduced.
- The energy of an image signal is generally concentrated in the low-frequency components, and deterioration of the low-frequency components is conspicuous to human vision, so it is effective to weight the quantization so that the quantization step sizes of the low-frequency subbands become smaller. By this weighting, a relatively large amount of information is assigned to the low-frequency components, and the overall subjective image quality is improved.
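A minimal sketch of per-line-block quantization with low-frequency weighting, under the assumption of a simple uniform quantizer; the subband names and any weighting values passed by a caller are illustrative, not taken from the patent:

```python
import numpy as np

def quantize_line_block(coeffs_by_subband, base_step, weights):
    """Quantize the wavelet coefficients of one line block (precinct).

    `coeffs_by_subband` maps a subband name (e.g. '1LL', '3HH') to its
    coefficient array; `weights` maps a subband to a factor (< 1 for the
    low-frequency subbands so that their step size becomes smaller, as
    described above). Quantization is division by the step size.
    """
    quantized, steps = {}, {}
    for subband, coeffs in coeffs_by_subband.items():
        step = base_step * weights.get(subband, 1.0)
        quantized[subband] = np.round(np.asarray(coeffs) / step).astype(np.int32)
        steps[subband] = step        # the decoder needs this step size
    return quantized, steps

def dequantize_line_block(quantized, steps):
    """Inverse quantization: multiply each coefficient by its step size."""
    return {sb: q * steps[sb] for sb, q in quantized.items()}
```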
- the entropy encoding unit 163 encodes the quantization coefficient generated by the quantization unit 162 as an information source, generates compressed encoded data, and supplies the encoded data to the rate control unit 141.
- As the information source coding, for example, Huffman coding used in the JPEG and MPEG (Moving Picture Experts Group) systems, or the higher-precision arithmetic coding used in the JPEG2000 system, can be used.
- the range of coefficients for which entropy encoding is performed is a very important factor directly related to the compression efficiency.
- In contrast, the wavelet transform unit 161 performs the wavelet transform in units of lines. Therefore, the entropy encoding unit 163 performs information source encoding independently for each frequency band (subband), and within each subband for every P lines.
- The minimum value of P is one line; if P is small, less reference information is required and the memory capacity can be reduced. Conversely, when P is large, the amount of information available increases, so the coding efficiency can be improved. However, if P exceeds the number of lines of the line block in each frequency band, data of the next line block becomes necessary, and it is then necessary to wait until the quantized coefficient data of that line block is generated by wavelet transform and quantization; this waiting time becomes a delay time.
- For low delay, therefore, P needs to be equal to or less than the number of lines in the line block.
- The rate control unit 141 performs final control to match the target bit rate or compression rate, and supplies the rate-controlled encoded data to the RTP unit 164. For example, the rate control unit 141 compares the bit rate (compression rate) of the encoded data output from the entropy encoding unit 163 with the target value, and transmits a control signal to the quantization unit 162 so as to decrease the quantization step size when the bit rate should be raised and to increase the quantization step size when the bit rate should be lowered.
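The feedback loop described above can be sketched as a simple rule: compare the measured bit rate with the target and move the quantization step size in the opposite direction. The gain and clamping values below are illustrative choices, not the patent's:

```python
def update_quantization_step(current_step, measured_bitrate, target_bitrate,
                             gain=0.1, min_step=1.0, max_step=1024.0):
    """One rate-control iteration: if the measured bit rate is above the
    target, enlarge the quantization step size (spend fewer bits); if it is
    below the target, shrink the step size (spend more bits)."""
    error = (measured_bitrate - target_bitrate) / target_bitrate
    new_step = current_step * (1.0 + gain * error)
    return min(max(new_step, min_step), max_step)

# Example: running 10% over the target nudges the step size up by about 1%.
print(update_quantization_step(64.0, 22_000_000, 20_000_000))  # 64.64
```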
- the RTP unit 164 converts the encoded data supplied from the rate control unit 141 into RTP packets and supplies the RTP packets to the FEC unit 132.
- FIG. 7 is a block diagram illustrating a main configuration example of the reception unit 113.
- The reception unit 113 includes a reception unit 201, a reception buffer 202, an RTP unit 203, an FEC unit 204, a decoding unit 205, an ARQ unit 206, an RTCP unit 207, a reception buffer time setting unit 208, and a media synchronization unit 210.
- the reception unit 201 receives the RTP packet transmitted from the transmission device 101 corresponding to itself via the network 102 and supplied via the transmission unit 111, and supplies it to the reception buffer 202.
- The reception buffer 202 temporarily holds the RTP packets supplied from the reception unit 201 in order to synchronize with the data transmissions of the other reception units 113, and then supplies them to the RTP unit 203 at a time determined based on information from the reception buffer time setting unit 208 and the media synchronization unit 210.
- The RTP unit 203 reconstructs the RTP packets, generates FEC redundant encoded data, that is, encoded data including the redundant packets, and supplies it to the FEC unit 204.
- the FEC unit 204 detects packet loss, and recovers lost packet data by redundant encoding / decoding processing as necessary.
- the FEC unit 204 supplies the encoded data after processing to the decoding unit 205.
- the decoding unit 205 decodes the encoded data by a decoding method corresponding to the encoding processing method by the encoding unit 131.
- the decoded video data (moving image data) is output from the video output IF (video OUT) of the receiving device 103 to, for example, a video display device (not shown) such as a display.
- The ARQ unit 206 detects lost packets (packets that could not be received) at the reception unit 201. When the ARQ unit 206 detects a lost packet, it controls the RTCP unit 207 to transmit a retransmission request message to the ARQ unit 138 of the transmission apparatus 101. The RTCP unit 207 transmits the retransmission request message requested by the ARQ unit 206 and the various setting information supplied from the reception buffer time setting unit 208 to the RTCP unit 137 of the transmission apparatus 101 as RTCP messages.
- the reception buffer time setting unit 208 sets and adjusts the reception buffer time based on the control of the integrated reception buffer time adjustment unit 114 and the like.
- The media synchronization unit 210 controls the RTP packet output timing of the reception buffer 202, the decoding processing start timing of the decoding unit 205, and the like.
- FIG. 8 is a block diagram illustrating a configuration example of the decoding unit 205 of the reception unit 113.
- the decoding unit 205 includes an RTP unit 230, an entropy decoding unit 231, an inverse quantization unit 232, and a wavelet inverse transform unit 233.
- the RTP unit 230 converts the RTP packet supplied from the FEC unit 204 into encoded data and supplies it to the entropy decoding unit 231.
- the entropy decoding unit 231 decodes the encoded data by a method corresponding to the encoding method of the entropy encoding unit 163, and generates quantized coefficient data. For example, Huffman decoding or highly efficient arithmetic decoding can be used.
- Since the entropy encoding unit 163 performs encoding for every P lines, the entropy decoding unit 231 decodes each subband independently and decodes the inside of each subband for every P lines.
- the inverse quantization unit 232 performs inverse quantization by multiplying the quantization coefficient data by the quantization step size to generate coefficient data.
- This quantization step size is normally described in, for example, the header of the encoded data supplied from the transmission apparatus 101. If the quantization step size is set for each line block in the quantization unit 162, inverse quantization is likewise performed for each line block in the inverse quantization unit 232.
- The wavelet inverse transform unit 233 performs the inverse process of the wavelet transform unit 161. That is, the wavelet inverse transform unit 233 performs filter processing (synthesis filter processing), which synthesizes the low-frequency and high-frequency components, on the coefficient data divided into a plurality of frequency bands by the wavelet transform unit 161, in both the horizontal and vertical directions.
- the wavelet inverse transformation unit 233 restores the baseband video data by the wavelet inverse transformation processing in this way, and outputs the video data from the video OUT to the outside of the receiving apparatus 103.
- the encoding unit 131 and the decoding unit 205 described above are examples, and other encoding / decoding methods may be used.
- The transmission apparatus 101 inputs video data, input from the video input IF "Video IN" via a video camera or the like, to the encoding unit 131 (FIG. 2), which performs compression encoding processing of moving image data, and encodes it.
- the encoded data generated by this encoding is subjected to RTP packetization processing by the RTP unit 164 (FIG. 3) inside the encoding unit 131, and is supplied to the FEC unit 132 (FIG. 2) as an RTP packet.
- the FEC unit 132 performs FEC redundancy encoding processing on the supplied RTP packet to generate a redundant packet.
- the FEC unit 132 supplies the generated redundant packet to the RTP unit 133 together with the RTP packet of the original data.
- the RTP unit 133 converts the redundant packet supplied from the FEC unit 132 into an RTP packet.
- the smoothing unit 134 smoothes the rate of the RTP packet supplied from the RTP unit 133 and transmits it to the network 102.
- the RTP time stamp for synchronization designated by the media synchronization unit 136 is set in the RTP unit 164 (FIG. 3) or the RTP unit 133 (FIG. 2).
- The receiving device 103 first receives the RTP packets transmitted from the plurality of transmitting devices 101 at the transmission unit 111 and distributes them to the reception units 113 corresponding to the respective transmitting devices 101. For example, the RTP packets from the transmission device 101-K are distributed to the reception unit 113-K.
- the reception unit 113 supplies the RTP packet to the reception buffer 202 via the reception unit 201 (FIG. 7) and stores it. If a lost packet is detected at this time, this is notified to the ARQ unit 206. Upon receiving the notification, the ARQ unit 206 performs a retransmission request process.
- The reception buffer 202 determines the reception buffer output time, which is the time at which an RTP packet is output from the reception buffer 202, using the reception buffer time determined by the reception buffer time setting unit 208, the time information notified from the media synchronization unit 210, and the RTP timestamp value set in each RTP packet, and outputs each RTP packet to the RTP unit 203 at that time.
- the RTP unit 203 reconstructs the RTP packet and supplies the obtained FEC redundant encoded data to the FEC unit 204.
- the FEC unit 204 recovers lost packet data through redundant encoding / decoding processing.
- the FEC unit 204 outputs the processed RTP packet to the decoding unit 205.
- the decoding unit 205 extracts encoded data from the RTP packet, performs decoding processing on the encoded data, and generates baseband video data.
- the decoded video data is supplied to the synthesis unit 115 (FIG. 1).
- the synthesizing unit 115 synthesizes the video data images from the plurality of receiving units 113 and outputs them to a video display device such as a display from the video output IF “video OUT”.
- the reference signal synchronization unit 112 (FIG. 1) synchronizes the reference signal clock between the transmission apparatus 101 and the reception apparatus 103 using IEEE 1588 PTP (Precision Time Protocol). Note that, as the frequency of the reference signal clock, for example, the pixel sampling frequency of the input video image may be used. In this case, synchronization with the video output device input from “Video IN” can also be performed.
- the receiving apparatus 103 may not include the reference signal synchronization unit 112 and may not synchronize between the transmitting apparatus 101 and the receiving apparatus 103.
- Alternatively, the reception apparatus 103 may serve as a master, and all the transmission apparatuses 101 may synchronize with the reception apparatus 103.
- In this way, the receiving apparatus 103 can synchronize with all the transmitting apparatuses 101, and as a result, all the transmitting apparatuses 101 and the receiving apparatus 103 can be synchronized with each other.
- The media synchronization unit 136 (FIG. 2) of the transmission device 101 converts the time synchronized with the sampling time of the data input from "Video IN" into the RTP timestamp frequency, and this time is added as an RTP time stamp to each RTP packet by the RTP unit 164 (FIG. 3) in the encoding unit 131 and by the RTP unit 133 (FIG. 2).
- the media synchronization unit 210 (FIG. 7) of the receiving apparatus 103 holds the reference clock time information notified from the reference signal synchronization unit 112 (FIG. 1) as the system time converted to the RTP timestamp frequency.
- The reception buffer 202 (FIG. 7) of the reception device 103 determines the reception buffer output time of each RTP packet from the RTP timestamp value of the RTP packet, the reception buffer time notified from the reception buffer time setting unit 208, and the time, expressed at the RTP timestamp frequency, supplied from the media synchronization unit 210.
- the reception buffer 202 supplies the held RTP packet to the RTP unit 203 at the reception buffer output time.
- The reception buffer output time of an RTP packet is set as follows, for example. The first packet of the encoded data is output from the reception buffer after the reception buffer time TSTIME_BUF has elapsed from its reception time, and each subsequent packet is output synchronously at a time calculated from the difference between its RTP timestamp value and the RTP timestamp value of the first packet.
- For example, when the RTP timestamp value of the first packet of the encoded data is TSPKT_init, its reception time (converted to system time at the RTP timestamp frequency) is TSSYS_init, and the RTP timestamp value of RTP packet n is TSPKT_n, the reception buffer output time TSSYS_BO_n of RTP packet n is calculated by the following equation (1).
- TSSYS_BO_n = (TSPKT_n - TSPKT_init) + TSSYS_init + TSTIME_BUF ... (1)
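A small sketch of equation (1), computing the reception buffer output time from the quantities defined above; all values are expressed in RTP timestamp ticks, and the numbers in the example are illustrative:

```python
def reception_buffer_output_time(ts_pkt_n, ts_pkt_init, ts_sys_init, ts_time_buf):
    """Equation (1): reception buffer output time of RTP packet n.

    ts_pkt_n     - RTP timestamp of packet n
    ts_pkt_init  - RTP timestamp of the first packet of the encoded data
    ts_sys_init  - reception time of the first packet, converted to system
                   time at the RTP timestamp frequency
    ts_time_buf  - reception buffer time, in the same ticks
    """
    return (ts_pkt_n - ts_pkt_init) + ts_sys_init + ts_time_buf

# A packet whose RTP timestamp is 3000 ticks after the first packet is
# released exactly 3000 ticks after the first packet's release time.
first_release = reception_buffer_output_time(1000, 1000, 500_000, 9_000)
later_release = reception_buffer_output_time(4000, 1000, 500_000, 9_000)
assert later_release - first_release == 3000
```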
- The encoding unit 131 uses, for example, a hierarchical encoding method such as that proposed in Patent Document 1, in which several lines of each picture of a moving image are wavelet transformed as one compression encoding block and the range of input data associated with each hierarchy differs.
- the encoding unit 131 (FIG. 2) of the transmission apparatus 101 performs encoding processing
- the decoding unit 205 (FIG. 7) of the receiving apparatus 103 performs decoding processing.
- the encoding unit 131 (FIG. 2) of the transmission apparatus 101 includes a rate control unit 141 as an internal processing unit.
- The rate control unit 141 performs rate control of the encoding rate such that, with the bucket size set to the encoder assumed buffer size B (bytes) and the bucket rate set to the encoding rate R (bps), the bucket does not overflow.
- The buffer time required by the receiving apparatus 103 when encoded data generated in this way is smoothed and transmitted is defined as the variable compression coding delay time.
- The variable compression encoding delay time Bt_codec (sec) can be expressed by equation (2), for example.
- The encoder assumed buffer size B is determined according to the image quality requirement specified by the user or the like. In the case of compression encoding by the VBR method, enlarging the encoder assumed buffer size B allows a larger amount of data to be used for highly complex image portions, so the image quality can be improved. That is, when the image quality requirement is high, it can be met by increasing the encoder assumed buffer size B.
- However, in that case the variable compression coding delay request time Bt_codec_req (sec) also increases, which may increase the delay.
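Equation (2) itself is not reproduced in this text; the sketch below assumes the standard buffer-drain relation Bt_codec = 8 * B / R (the time needed to send an encoder assumed buffer of B bytes at a smoothed rate of R bps) and illustrates the trade-off just described, where a larger B raises the delay request time:

```python
def variable_compression_coding_delay(buffer_size_bytes, rate_bps):
    """Assumed form of equation (2): seconds needed to drain the encoder
    assumed buffer of B bytes when smoothed and transmitted at R bps."""
    return buffer_size_bytes * 8 / rate_bps

# Doubling the encoder assumed buffer size improves image quality for complex
# scenes but doubles the variable compression coding delay request time.
print(variable_compression_coding_delay(250_000, 20_000_000))  # 0.1 s
print(variable_compression_coding_delay(500_000, 20_000_000))  # 0.2 s
```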
- Next, the RTCP processing of the QoS control processing performed between each transmission apparatus 101 and the reception apparatus 103 will be described.
- The RTCP unit 137 (FIG. 2) and the RTCP unit 207 (FIG. 7) use RTCP, described in IETF RFC 3550, to transmit and receive RTCP messages between the transmission device 101 and the reception device 103, to collect information such as the packet loss rate, round-trip propagation delay (RTT), and network jitter, and to transmit and receive control messages for QoS control processing. Examples of QoS control messages include the retransmission request message of the ARQ processing.
- the FEC unit 132 (FIG. 2) performs FEC redundant encoding in units of RTP packets of encoded data supplied from the encoding unit 131.
- the FEC unit 132 performs redundant encoding using an erasure error correction code such as a Reed-Solomon code.
- the redundancy in FEC redundancy coding is determined by, for example, a transmission quality request specified by the user or the like.
- the degree of redundancy is specified in the form of (number of original data packets, number of redundant packets).
- the user also specifies the assumed packet loss rate p of the network.
- a set of (number of original data packets, number of redundant packets) is defined as one redundant code unit (so-called FEC block).
- For example, when (number of original data packets, number of redundant packets) = (10, 5), the FEC unit 132 of the transmission apparatus 101 generates five redundant packets for every ten original data packets. That is, a total of 15 packets are transmitted in this FEC block.
- If the FEC unit 204 (FIG. 7) of the receiving apparatus 103 receives any 10 of the packets in the FEC block, it can decode the original data by the FEC decoding process.
- When the packet loss rate specified by the user is p, the number of packets in the FEC block is n, the number of original data packets is k, and the number of redundant packets is n - k, the target FEC recovery condition specified by the user is expressed by equation (3), and the number of original data packets k and the number of redundant packets n - k are determined so as to satisfy equation (3).
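Equation (3) is not reproduced in this text; a typical form of such a condition is that the probability that more than n - k of the n packets in an FEC block are lost (so that FEC recovery fails) stays at or below a user-specified target. The sketch below, under that assumption, picks the smallest number of redundant packets that satisfies the condition:

```python
from math import comb

def fec_failure_probability(n, k, p):
    """Probability that more than n - k of the n packets of an FEC block are
    lost at an independent loss rate p, i.e. fewer than k packets arrive and
    the original data cannot be recovered."""
    return sum(comb(n, i) * p**i * (1 - p) ** (n - i)
               for i in range(n - k + 1, n + 1))

def choose_redundancy(k, p, target, max_redundant=64):
    """Smallest redundant packet count n - k whose failure probability is at
    or below `target` (an assumed reading of equation (3))."""
    for r in range(max_redundant + 1):
        if fec_failure_probability(k + r, k, p) <= target:
            return r
    raise ValueError("target not reachable within max_redundant")

# 10 original data packets, 5% packet loss rate, 1e-4 residual failure target:
# this yields 5 redundant packets, i.e. the (10, 5) FEC block of the example.
print(choose_redundancy(k=10, p=0.05, target=1e-4))  # 5
```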
- Since the receiving apparatus 103 (its FEC unit 204 (FIG. 7)) recovers lost packets by FEC redundant decoding processing, the reception buffer time needs to be set longer than the time from the arrival of the first packet of an FEC block at the receiving apparatus 103 until the arrival of its last packet.
- This time, corresponding to the FEC block set according to the transmission quality request specified by the user or the like, is referred to as the “redundant coding block reception waiting time”.
- the receiving unit 201 detects a lost packet using the sequence number of the RTP packet, and the ARQ unit 206 generates a retransmission request message for the lost packet.
- the RTCP unit 207 transmits the retransmission request message to the transmission apparatus 101 and makes a retransmission request.
- When the ARQ unit 206 of the receiving apparatus 103 has made a retransmission request for a lost packet and the reception unit 201 still does not receive the retransmitted packet after the round-trip propagation time (the ARQ retransmission packet waiting time) has elapsed, another retransmission request may be made. Furthermore, retransmission requests may be repeated until the ARQ unit 206 determines that a retransmitted packet can no longer arrive in time for the reception buffer output time of the packet.
- In the ARQ processing in the transmission apparatus 101, when the RTCP unit 137 (FIG. 2) receives the retransmission request message, the ARQ unit 138 generates the RTP packet to be retransmitted (the retransmission packet), supplies it to the smoothing unit 134, and has it retransmitted.
- the recovery performance in the ARQ process depends on the ARQ retransmission packet waiting time which is a time for waiting for a retransmission packet to be retransmitted in response to a request from the ARQ unit 206 (FIG. 7) in the receiving apparatus 103.
- the recovery performance improves as the ARQ retransmission packet waiting time increases.
- However, a reception buffer time longer than the ARQ retransmission packet wait request time is then required, which may increase the delay.
- This “ARQ retransmission packet wait request time” is determined by the transmission quality request specified by the user or the like.
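A sketch of the retransmission timing just described: a lost packet is re-requested each time a round-trip time passes without the retransmission arriving, and requests stop once a retransmission could no longer arrive before the packet's reception buffer output time. The one-way-delay assumption and the numbers are illustrative:

```python
def arq_request_times(loss_detect_time, rtt, buffer_output_time):
    """Times (seconds) at which retransmission requests are issued for one
    lost packet. A request is issued, then repeated every `rtt` seconds (the
    ARQ retransmission packet waiting time), as long as a retransmission
    requested now could still arrive (one-way delay assumed to be rtt / 2)
    before the packet's reception buffer output time."""
    times, t = [], loss_detect_time
    while t + rtt / 2 <= buffer_output_time:
        times.append(t)
        t += rtt
    return times

# Loss detected at t = 0 s, RTT 20 ms, packet must leave the buffer at 65 ms:
# requests go out at 0, 20 and 40 ms; a request at 60 ms could not make it.
print(arq_request_times(0.0, 0.020, 0.065))  # [0.0, 0.02, 0.04]
```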
- the reception buffer 202 (FIG. 7) of the reception apparatus 103 also has a network jitter handling processing mechanism.
- The reception buffer 202 sets, as the reception buffer time, a time longer than the “network jitter corresponding buffer request time” determined by the transmission quality request specified by the user or the like. As a result, packets subjected to jitter less than or equal to the “network jitter corresponding buffer request time” can still be synchronized.
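Gathering the request times introduced so far, the specified reception buffer time has to cover the variable compression coding delay request time and the QoS-related waiting times. How the individual terms combine is not spelled out in this excerpt, so the sketch below simply treats them as additive; the names and example values are illustrative:

```python
from dataclasses import dataclass

@dataclass
class BufferTimeRequests:
    """Waiting-time requests (seconds) that the specified reception buffer
    time must cover; treating them as additive is an assumption here."""
    variable_compression_coding_delay: float  # Bt_codec_req, from equation (2)
    redundant_coding_block_wait: float        # first to last packet of an FEC block
    arq_retransmission_wait: float            # waiting for retransmitted packets
    network_jitter_buffer: float              # absorbs network jitter

    def specified_reception_buffer_time(self) -> float:
        return (self.variable_compression_coding_delay
                + self.redundant_coding_block_wait
                + self.arq_retransmission_wait
                + self.network_jitter_buffer)

reqs = BufferTimeRequests(0.100, 0.015, 0.040, 0.010)
print(round(reqs.specified_reception_buffer_time(), 3))  # 0.165 s
```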
- In step S121, the RTCP unit 207 of each reception unit 113 communicates with its corresponding transmission apparatus 101 and measures the network status.
- the RTCP unit 137 of each transmitting apparatus 101 also measures the network status in step S101.
- the status (network status information) related to the communication of the network 102 is observed by exchanging packets between the RTCP unit 137 and the RTCP unit 207.
- the RTCP unit 137 transmits dummy data for network status advance measurement before starting transmission of encoded data.
- the dummy data is transmitted to the RTCP unit 207 via the network 102.
- the RTCP unit 207 receives the dummy data, and performs network state measurement (preliminary measurement) from the information at the time of transmission described in the packet and the state at the time of reception.
- In step S122, the reception buffer time setting unit 208 performs the reception buffer time determination processing, applies the resulting settings in the reception unit 113, and notifies the transmission apparatus 101 of the reception buffer time.
- In step S102, the reception buffer time / processing parameter setting unit 139 of the transmission apparatus 101 performs the reception buffer time / processing parameter setting processing, receiving the notification from the reception apparatus 103 and setting the various parameters.
- In step S103 and step S123, the transmission device 101 and the reception device 103 cooperate to perform data transmission processing for transmitting RTP packets of the video data from the transmission device 101 to the reception device 103.
- the reception buffer time is adjusted so that all the reception units 113 are synchronized.
- The term “synchronization” as used here means that adjustment is performed so that packets containing data captured at the same time, input to the respective reception units 113, are output from their reception buffers 202 at the same timing.
- In step S141, the integrated reception buffer time adjustment unit 114 obtains the maximum transmission delay time, which is the longest of the transmission delays of the data transmissions handled by the respective reception units 113. The maximum transmission delay time is obtained as shown in the following equation (4).
- Maximum transmission delay time = MAX(transmission delay time 1, transmission delay time 2, ..., transmission delay time N)   ... (4)
- MAX () is a function for calculating the maximum value.
- In step S142, the integrated reception buffer time adjustment unit 114 selects a reception unit to be processed from among the unprocessed reception units 113 used for data transmission.
- In step S143, the reception buffer time setting unit 208 of the reception unit 113 selected as the processing target calculates the reception buffer time using the maximum transmission delay time, the transmission delay time of the data transmission handled by that reception unit 113, and the specified reception buffer time.
- The specified reception buffer time is the minimum reception waiting time required for the encoding processing and the QoS control processing, and is determined in advance.
- the reception buffer time is a buffer time for achieving synchronization by absorbing a difference in transmission delay between data transmissions from the respective transmission apparatuses 101.
- the reception buffer time includes a specified reception buffer time. That is, the reception buffer time is a value obtained by adjusting the specified reception buffer time according to the length of transmission delay of each data transmission (generally, a longer value).
- the reception buffer time can be calculated as in the following equation (5).
- Reception buffer time K = maximum transmission delay time - transmission delay time K + specified reception buffer time   ... (5)
- where K = 1, ..., N
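- The sketch below simply restates equations (4) and (5) in code: each data transmission’s reception buffer time is the specified reception buffer time plus the gap between that transmission’s delay and the worst-case delay, so that delay plus buffer is the same for every transmission. The function and variable names are illustrative; only the arithmetic comes from the equations above.

```python
from typing import List

def reception_buffer_times(transmission_delays_ms: List[float],
                           specified_buffer_time_ms: float) -> List[float]:
    """Equations (4) and (5): pad each data transmission's reception buffer
    so that every transmission has the same total of delay + buffer time."""
    max_delay = max(transmission_delays_ms)                      # equation (4)
    return [max_delay - delay + specified_buffer_time_ms         # equation (5)
            for delay in transmission_delays_ms]

# Example: three transmitting apparatuses with 40 ms, 25 ms and 10 ms
# transmission delays and a 20 ms specified reception buffer time.
print(reception_buffer_times([40, 25, 10], 20))   # [20, 35, 50]
# Transmission delay + reception buffer time is 60 ms for every transmission.
```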
- Suppose that the transmission apparatus processing delay 251, which is the delay time due to processing in the transmission apparatus 101, and the transmission delay 252, which is the delay time incurred when transmitting over the network 102, are as shown in the example in the upper part of the figure.
- the transmission delay 252-1 of the transmission device 101-1 is the longest. Therefore, this transmission delay 252-1 is the maximum transmission delay time.
- the reception buffer time 254-1 for data transmission of the transmission apparatus 101-1 is the same as the specified reception buffer time 253.
- For the other data transmissions, the difference in transmission delay from the maximum is added to the specified reception buffer time 253.
- the sum of the transmission device processing delay, the transmission delay, and the reception buffer time is common in each data transmission. Therefore, the delay time of the entire data transmission does not increase.
- In this way, the transmission device 101 and the reception device 103 can set parameters that improve image quality and transmission quality without increasing the delay time of the data transmission as a whole.
- The reception buffer time setting unit 208 calculates and sets, from the reception buffer time, the various delay times and waiting times such as the variable compression encoding delay time, the redundant encoding block reception waiting time, the ARQ retransmission packet waiting time, and the network jitter-corresponding buffer time.
- The set values for each process in a given transmitting apparatus 101-K (K = 1, ..., N) and the receiving unit 113-K corresponding to it, namely “variable compression encoding delay time K”, “redundant encoding block reception waiting time K”, “ARQ retransmission packet waiting time K”, and “network jitter-corresponding buffer time K”, are calculated from “reception buffer time K” by the following equation (6).
- In step S145, the reception buffer time setting unit 208 sets the “ARQ retransmission packet waiting time” in the ARQ unit 206, where it is used to determine whether to make a retransmission request.
- In step S146, the reception buffer time setting unit 208 sets the “network jitter-corresponding buffer time” in the reception buffer 202.
- In step S147, the reception buffer time setting unit 208 notifies the transmission apparatus 101 corresponding to the reception unit 113 being processed of the “variable compression encoding delay time” and the “redundant encoding block waiting time”, for example in an RTCP message via the RTCP unit 207.
- In step S148, the integrated reception buffer time adjustment unit 114 determines whether all the reception units 113 have been processed. If an unprocessed reception unit 113 used for data transmission remains, the processing returns to step S142, a new processing target is selected, and the subsequent processing is repeated for it.
- If it is determined in step S148 that all the reception units 113 have been processed, the reception buffer time determination process is terminated, the process returns to step S122 in FIG. 9, and the subsequent processes are performed.
- When the reception buffer time / processing parameter setting process is started, the reception buffer time / processing parameter setting unit 139 acquires, in step S161, the “variable compression encoding delay time” and the “redundant encoding block waiting time” included in the RTCP message supplied from the reception apparatus 103 via the RTCP unit 137.
- In step S162, the reception buffer time / processing parameter setting unit 139 sets the “variable compression encoding delay time” in the encoding unit 131.
- In step S163, the reception buffer time / processing parameter setting unit 139 sets the “redundant coding block waiting time” in the FEC unit 132.
- In step S164, the rate control unit 141 sets a rate control parameter using the supplied “variable compression encoding delay time”. For example, the rate control unit 141 sets the encoder assumed buffer size B (bytes) so as to satisfy the following expression (7).
- t_codec (sec) indicates the variable compression encoding delay time.
- R (bps) indicates the packet rate.
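- Expression (7) itself is not legible in this text. Since a buffer of B bytes drained at R bits per second contributes a delay of 8B/R seconds, one natural reading is that B is chosen so that 8B/R does not exceed t_codec; the sketch below assumes that form and should be read as an illustration, not as the exact constraint of the specification.

```python
def encoder_assumed_buffer_size(t_codec_s: float, rate_bps: float) -> int:
    """Largest encoder assumed buffer size B in bytes whose drain time at
    rate_bps stays within the variable compression encoding delay time
    (assumed reading of expression (7): 8 * B / R <= t_codec)."""
    return int(t_codec_s * rate_bps / 8)

# Example: a 50 ms variable compression encoding delay time at 20 Mbps
# allows an assumed buffer of 125 000 bytes.
print(encoder_assumed_buffer_size(0.050, 20_000_000))  # 125000
```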
- In step S165, the FEC unit 132 sets the FEC block parameters using the “redundant coding block reception waiting time”.
- The “redundant coding block waiting time” serves as the maximum value of the time required for the smoothed transmission of all the packets included in an FEC block.
- The FEC unit 132 adjusts the number of original data packets of each FEC block so that this time is equal to or less than the “redundant encoding block waiting time”. Further, the FEC unit 132 sets the number of redundant packets n-k so as to satisfy, for example, the above equation (3).
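- Combining the two constraints just described, the FEC block sizing might be sketched as follows: the number of original data packets k is capped so that the smoothed transmission of a block fits inside the waiting time, and the number of redundant packets n-k is then chosen to satisfy equation (3). The packet size, rate, and helper names are illustrative assumptions, not values from the specification.

```python
def max_original_packets(block_wait_time_s: float, rate_bps: float,
                         packet_size_bytes: int) -> int:
    """Largest number of original data packets k whose smoothed transmission
    time (k * packet bits / rate) fits within the redundant coding block
    waiting time."""
    packet_time_s = packet_size_bytes * 8 / rate_bps
    return max(1, int(block_wait_time_s / packet_time_s))

# Example: a 20 ms waiting time at 20 Mbps with 1400-byte packets allows
# roughly k = 35 original packets per FEC block; the number of redundant
# packets n - k would then be chosen to satisfy equation (3), e.g. as in
# the choose_redundancy sketch shown earlier.
print(max_original_packets(0.020, 20_000_000, 1400))  # 35
```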
- The reception buffer time / processing parameter setting unit 139 then ends the reception buffer time / processing parameter setting processing, returns the processing to step S102 in FIG. 9, and the subsequent processing is performed.
- In this way, the transmission apparatus 101 and the reception apparatus 103 can secure each delay time and waiting time as long as possible without increasing the delay, and the resulting time margin can be used to improve image quality and transmission quality.
- For example, the transmitting apparatus 101 can set the “variable compression encoding delay time” of the encoding unit 131 longer and the encoder assumed buffer size B (bytes) larger, to the extent that the delay is not increased. Further, the transmission apparatus 101 can set the “redundant encoding block waiting time” of the FEC unit 132 longer and increase the FEC block size and redundancy, again to the extent that the delay is not increased.
- the receiving apparatus 103 can set the “ARQ retransmission packet waiting time” of the ARQ unit 206 longer to the extent that the delay is not increased. In addition, the receiving apparatus 103 can set the “network jitter-corresponding buffer time” of the reception buffer 202 to be longer so as not to increase the delay.
- the transmission apparatus 101 and the reception apparatus 103 can reduce the waste in delay time by using the time margin of each process and improving the image quality and transmission quality. That is, the transmission device 101 and the reception device 103 can suppress deterioration in content quality due to various processes related to data transmission as described above.
- In the above description, the reception buffer time is adjusted only in accordance with the network conditions.
- However, the present invention is not limited to this; for example, a user or the like may input requests for image quality and transmission quality, and the reception buffer time may be adjusted based on that input.
- The reception buffer time may also be updated during data transmission.
- FIG. 13 is a block diagram showing a configuration example of the transmission system in that case.
- a transmission system 300 shown in FIG. 13 is basically the same system as the transmission system 100 in FIG. 1, but includes a reception device 303 instead of the reception device 103.
- the receiving device 303 is basically the same device as the receiving device 103, but includes an input unit 311 and an output unit 312 in addition to the configuration of the receiving device 103.
- the input unit 311 includes, for example, an arbitrary input device such as a keyboard, mouse, touch panel, or switch, or an external input terminal.
- the input unit 311 accepts an image quality request or a transmission quality request from outside the receiving device 303, for example from a user, and supplies it to the integrated reception buffer time adjustment unit 114.
- the output unit 312 includes, for example, an arbitrary output device such as a monitor or a speaker, an external output terminal, or the like, and displays a GUI image supplied from the integrated reception buffer time adjustment unit 114 or outputs audio.
- For example, guidance for inputting the image quality request and the transmission quality request, and the input results, are output.
- FIG. 14 is a diagram showing a display example of a request reception screen that is a GUI for receiving image quality requests, transmission quality requests, and the like.
- the request reception screen 321 is provided with a display unit 322 and a request unit 323.
- the request unit 323 indicates items that can be input by the user.
- the display unit 322 displays the result of information input by the user.
- An “image quality request” and a “transmission quality request” can be input to the request unit 323 as the “user request”.
- For example, the user selects “image quality request” and inputs the requested quality (for example, a PSNR value).
- the user selects “transmission quality request” and inputs the requested quality (for example, packet loss rate after QoS control).
- The user can also specify the reception buffer time. For example, when the user selects “reception buffer time” and inputs a requested time, that time is set as the longest time permitted for the “reception buffer time”.
- the display unit 322 displays various information reflecting the information input in the request unit 323 as described above.
- For example, the transmission delay and other network status information are displayed on the display unit 322 as the “network status” for each transmission device.
- other information may be displayed.
- The display unit 322 displays, as the “processing request time”, the “variable compression coding delay request time”, the “redundant coding block reception request wait time”, the “ARQ retransmission packet wait request time”, and the “network jitter-corresponding buffer request time”.
- In addition, the reception buffer time recommended for each transmission device is displayed on the display unit 322.
- the user can more easily set an image quality request, a transmission quality request, a reception buffer time, and the like.
- the receiving apparatus 303 first controls the input unit 311 and the output unit 312 in step S321 to accept an image quality request and a transmission quality request.
- the image quality request is a request for image quality of a decoded image (an image of video data output from the receiving device 303).
- the transmission quality request is a request for the packet loss rate and the like of the network 102.
- the receiving device 303 uses the received image quality request and transmission quality request for setting various parameters.
- In step S323 of FIG. 15, the reception buffer time determination process is performed using the requests accepted in step S321.
- In step S303 and step S324 of FIG. 15, reception buffer dynamic change transmission processing is performed, in which the reception buffer time is updated while data transmission is in progress.
- each process is basically performed in the same manner as described with reference to the flowchart of FIG.
- In this case, the integrated reception buffer time adjustment unit 114 obtains a provisional reception buffer time (temporary reception buffer time) for each reception unit 113 instead of using the specified reception buffer time.
- That is, the integrated reception buffer time adjustment unit 114 selects the reception unit 113 to be processed in step S341, and in step S342 calculates, from the image quality request and the transmission quality request for the data transmission of that reception unit 113, the “variable compression coding delay request time”, “redundant coding block reception wait request time”, “ARQ retransmission packet wait request time”, and “network jitter-corresponding buffer request time”.
- In step S343, the integrated reception buffer time adjustment unit 114 sets the maximum of the calculated “variable compression coding delay request time”, “redundant coding block reception wait request time”, “ARQ retransmission packet wait request time”, and “network jitter-corresponding buffer request time” as the temporary reception buffer time.
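- A minimal sketch of step S343 follows, assuming the four per-process request times have already been derived from the image quality request and the transmission quality request (how they are derived is not detailed here); the function and parameter names are illustrative.

```python
def temporary_reception_buffer_time(codec_delay_request: float,
                                    fec_block_wait_request: float,
                                    arq_retransmit_wait_request: float,
                                    jitter_buffer_request: float) -> float:
    """Step S343: the temporary reception buffer time is the largest of the
    four per-process request times derived from the image quality request
    and the transmission quality request."""
    return max(codec_delay_request, fec_block_wait_request,
               arq_retransmit_wait_request, jitter_buffer_request)

# Example: 15 ms, 20 ms, 30 ms and 10 ms request times give a 30 ms
# temporary reception buffer time.
print(temporary_reception_buffer_time(15.0, 20.0, 30.0, 10.0))  # 30.0
```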
- In step S344, the integrated reception buffer time adjustment unit 114 determines whether all the reception units 113 used for data transmission have been processed, and repeats the processing of steps S341 to S344 until they all have been.
- When all the reception units 113 have been processed, the integrated reception buffer time adjustment unit 114 advances the process to step S345 and obtains the maximum transmission delay time, as in step S141 of FIG. 10.
- Note, however, that the reception buffer time of each reception unit 113 calculated without considering the transmission delay is not the common specified reception buffer time but a provisional reception buffer time set individually, so its length may differ from unit to unit.
- The method of calculating the reception buffer time is basically the same as that in FIG. 10: except that the maximum value (the maximum transmission delay time) obtained in step S345 is applied, the processes in steps S351 to S357 are performed in the same manner as the processes in steps S142 to S148 in FIG. 10.
- The reception buffer time / processing parameter setting process by each transmission apparatus 101 is performed in the same manner as described above with reference to the flowchart of FIG. 12.
- In this way, as in the first embodiment, the time margin of each process can be used effectively to improve image quality and transmission quality, and waste in delay time can be further reduced. That is, deterioration in content quality can be further suppressed.
- In step S371, each unit of the transmission apparatus 101 performs data transmission for a certain time.
- Similarly, in step S391, each unit of the receiving device 303 performs data transmission for a certain time.
- In step S392, the RTCP unit 207 of the receiving device 303 exchanges data with the RTCP unit 137, measures the network status, and updates the network status information.
- the RTCP unit 137 of the transmission apparatus 101 measures the network status and updates the network status information in step S372.
- In step S393, the reception buffer time setting unit 208 of the reception device 303 performs the reception buffer time determination processing based on the updated network status information: the “variable compression coding delay request time”, “redundant coding block reception wait request time”, “ARQ retransmission packet wait request time”, “network jitter-corresponding buffer request time”, and so on are updated, and settings such as the “variable compression encoding delay time”, “redundant encoding block reception waiting time”, “ARQ retransmission packet waiting time”, and “network jitter-corresponding buffer time” are updated according to the updated request times, using, for example, equations (4) and (5).
- This reception buffer time determination processing is performed in the same manner as described with reference to FIGS. 16 and 17.
- In step S373, the reception buffer time / processing parameter setting unit 139 of the transmission apparatus 101 performs the reception buffer time / processing parameter setting processing. This processing is performed in the same manner as in FIG. 12.
- At this time, the reception buffer time / processing parameter setting unit 139 determines the number of original data packets k and the number of redundant packets n-k so as to satisfy equation (3), using the packet loss rate p included in the network status information measured by the RTCP unit 137 during the transmission of the encoded data.
- In step S374, the transmitting apparatus 101 determines whether to end the transmission. If it is determined that the transmission is not to be ended, the process returns to step S371 and the subsequent processes are executed. If it is determined in step S374 that the transmission is to be ended, the reception buffer dynamic change transmission process by the transmission apparatus 101 is ended.
- Similarly, in step S394, the receiving apparatus 303 determines whether to end the transmission. If it is determined that the transmission is not to be ended, the process returns to step S391 and the subsequent processes are executed. If it is determined in step S394 that the transmission is to be ended, the reception buffer dynamic change transmission process by the reception device 303 is ended.
- By doing so, the transmission apparatus 101 and the reception apparatus 303 can set the image quality and transmission quality more efficiently according to the actual situation, and waste in delay time can be further reduced. That is, the transmission device 101 and the reception device 303 can suppress deterioration in content quality.
- As described above, when moving images are transmitted from a plurality of transmission devices and synchronization processing is performed at the reception device, data transmission with a small transmission delay time can be performed even when variable compression coding processing and QoS control processing such as FEC are performed.
- In the above description, the video data to be transmitted is encoded.
- However, the present invention is not limited to this, and the video data may be transmitted without being compressed.
- A CPU (Central Processing Unit) 401 of the personal computer 400 executes various processes according to a program stored in a ROM (Read Only Memory) 402 or a program loaded from a storage unit 413 into a RAM (Random Access Memory) 403.
- the RAM 403 also appropriately stores data necessary for the CPU 401 to execute various processes.
- the CPU 401, the ROM 402, and the RAM 403 are connected to each other via a bus 404.
- An input / output interface 410 is also connected to the bus 404.
- Connected to the input / output interface 410 are an input unit 411 including a keyboard and a mouse, an output unit 412 including a display such as a CRT (Cathode Ray Tube) or an LCD (Liquid Crystal Display) and a speaker, a storage unit 413 including a hard disk and the like, and a communication unit 414 including a modem. The communication unit 414 performs communication processing via networks including the Internet.
- A drive 415 is connected to the input / output interface 410 as necessary, a removable medium 421 such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory is mounted as appropriate, and a computer program read from it is installed in the storage unit 413 as necessary.
- a program constituting the software is installed from a network or a recording medium.
- The recording medium is constituted not only by the removable medium 421 on which the program is recorded and which is distributed separately from the apparatus main body in order to deliver the program to the user, such as a magnetic disk (including a flexible disk), an optical disk (including a CD-ROM (Compact Disc - Read Only Memory) and a DVD (Digital Versatile Disc)), a magneto-optical disk (including an MD (Mini Disc)), or a semiconductor memory, but also by the ROM 402 on which the program is recorded and the hard disk included in the storage unit 413, which are provided to the user in a state of being incorporated in the apparatus main body in advance.
- The program executed by the computer may be a program whose processes are performed in time series in the order described in this specification, or a program whose processes are performed in parallel or at necessary timing, such as when called.
- The steps describing the program recorded on the recording medium include not only processes performed in time series in the described order, but also processes executed in parallel or individually without necessarily being processed in time series.
- In this specification, the term “system” refers to an entire apparatus composed of a plurality of devices (apparatuses).
- the configuration described as one device (or processing unit) may be divided and configured as a plurality of devices (or processing units).
- the configurations described above as a plurality of devices (or processing units) may be combined into a single device (or processing unit).
- a configuration other than that described above may be added to the configuration of each device (or each processing unit).
- A part of the configuration of a certain device (or processing unit) may be included in the configuration of another device (or other processing unit). That is, the embodiments of the present invention are not limited to the above-described embodiments, and various modifications can be made without departing from the gist of the present invention.
- 100 transmission system, 101 transmission device, 102 network, 103 reception device, 111 transmission unit, 112 reference signal synchronization unit, 113 reception unit, 114 integrated reception buffer time adjustment unit, 115 synthesis unit, 131 encoding unit, 132 FEC unit, 133 RTP unit, 134 smoothing unit, 135 reference signal synchronization unit, 136 media synchronization unit, 137 RTCP unit, 138 ARQ unit, 139 reception buffer time / processing parameter setting unit, 141 rate control unit, 201 receiving unit, 202 reception buffer, 203 RTP unit, 204 FEC unit, 205 decoding unit, 206 ARQ unit, 207 RTCP unit, 208 reception buffer time setting unit, 210 media synchronization unit, 300 transmission system, 303 reception device, 311 input unit, 312 output unit
Landscapes
- Engineering & Computer Science (AREA)
- Signal Processing (AREA)
- Multimedia (AREA)
- Computer Networks & Wireless Communication (AREA)
- Computer Security & Cryptography (AREA)
- Computer Hardware Design (AREA)
- Compression Or Coding Systems Of Tv Signals (AREA)
- Data Exchanges In Wide-Area Networks (AREA)
Abstract
Description
1. First Embodiment (Transmission System)
2. Second Embodiment (Transmission System)
3. Third Embodiment (Personal Computer)
[Overview of the Transmission System]
FIG. 1 is a block diagram showing a main configuration example of a transmission system to which the present invention is applied. The transmission system 100 is a system that transmits image data from a plurality of transmission devices (transmission devices 101-1 to 101-N (N is an integer of 2 or more)) to a reception device 103 via a network 102, which is a general-purpose transmission path such as the Internet or a LAN. In the following, when the individual transmission devices do not need to be distinguished from one another, they are simply referred to as the transmission device 101. Note that the network 102 includes not only cables and the like but also devices such as routers and hubs.
As shown in FIG. 2, the transmission device 101 has an encoding unit 131, an FEC (Forward Error Correction) unit 132, an RTP (Real-time Transport Protocol) unit 133, a smoothing unit 134, a reference signal synchronization unit 135, a media synchronization unit 136, an RTCP (RTP Control Protocol) unit 137, an ARQ (Auto Repeat Request) unit 138, and a reception buffer time / processing parameter setting unit 139.
Next, an example of the encoding unit 131 of the transmission device 101 will be described. FIG. 3 is a block diagram showing a configuration example of the encoding unit 131 of the transmission device 101. The encoding unit 131 performs hierarchical encoding, in which the image data is layered in descending order of importance with respect to resolution and encoded layer by layer. For example, the encoding unit 131 generates layered data layered in descending order of importance with respect to spatial resolution. Also, for example, the encoding unit 131 generates layered data layered in descending order of importance with respect to temporal resolution. Furthermore, for example, the encoding unit 131 generates layered data layered in descending order of importance with respect to SNR (Signal to Noise Ratio). The encoding unit 131 encodes the layered data generated in this way layer by layer.
Next, an example of the internal configuration of each reception unit 113 of the reception device 103 in FIG. 1 will be described. FIG. 7 is a block diagram showing a main configuration example of the reception unit 113. As shown in FIG. 7, the reception unit 113 has a receiving unit 201, a reception buffer 202, an RTP unit 203, an FEC unit 204, a decoding unit 205, an ARQ unit 206, an RTCP unit 207, a reception buffer time setting unit 208, and a media synchronization unit 210.
Next, an example of the decoding unit 205 corresponding to the example of the encoding unit 131 described above will be described. FIG. 8 is a block diagram showing a configuration example of the decoding unit 205 of the reception unit 113. In FIG. 8, the decoding unit 205 has an RTP unit 230, an entropy decoding unit 231, an inverse quantization unit 232, and an inverse wavelet transform unit 233.
Next, the processes performed in the transmission system 100 (FIG. 1) will be described. First, the transmission process of transmitting video data from each transmission device 101 to the reception device 103 via the network 102 will be described.
Next, the reference signal synchronization process for synchronizing the reference signal will be described. The reference signal synchronization unit 112 (FIG. 1) synchronizes the reference signal clock between the transmission devices 101 and the reception device 103 using IEEE 1588 PTP (Precision Time Protocol). As the frequency of the reference signal clock, for example, the pixel sampling frequency of the input video image may be used. In this case, synchronization with the video output device that supplies the input via "Video IN" can also be performed.
Based on the time notified from the reference signal synchronization unit 135, the media synchronization unit 136 (FIG. 2) of the transmission device 101 converts the time synchronized with the sampling time of the data input via "Video IN" into the RTP timestamp frequency, and the RTP unit 164 (FIG. 3) in the encoding unit 131 and the RTP unit 133 (FIG. 2) add it to each RTP packet as an RTP timestamp.
The encoding unit 131 uses, for example, the hierarchical encoding scheme proposed in Patent Document 1, in which every several lines of each picture of a moving image are wavelet-transformed as one compression encoding block, and in which the range of input data associated with each layer differs.
Next, the RTCP processing of the QoS control processing performed between each transmission device 101 and the reception device 103 will be described. The RTCP unit 137 (FIG. 2) and the RTCP unit 207 (FIG. 7) exchange RTCP messages between the transmission device 101 and the reception device 103 using RTCP described in IETF RFC 3550, collect information such as the packet loss rate, the round-trip propagation delay (RTT), and the network jitter, and exchange control messages for the QoS control processing. Examples of QoS control messages include the retransmission request message in the ARQ processing.
Next, the FEC processing of the QoS control processing will be described. The FEC unit 132 (FIG. 2) performs FEC redundant encoding in units of RTP packets of the encoded data supplied from the encoding unit 131. For example, the FEC unit 132 performs redundant encoding using an erasure error correction code such as a Reed-Solomon code.
In the ARQ processing in the reception device 103 of FIG. 1, the receiving unit 201 (FIG. 7) detects a lost packet using the sequence numbers of the RTP packets, the ARQ unit 206 generates a retransmission request message for the lost packet, and the RTCP unit 207 transmits the retransmission request message to the transmission device 101 to make a retransmission request.
The reception buffer 202 (FIG. 7) of the reception device 103 also serves as a mechanism for handling network jitter. The reception buffer 202 sets, as the reception buffer time, a time equal to or longer than the "network jitter-corresponding buffer request time" determined by the transmission quality request designated by the user or the like. This makes it possible to synchronize packets subjected to jitter of no more than the "network jitter-corresponding buffer request time".
Next, an example of the overall flow of the data transmission processing executed in the transmission system 100 will be described with reference to the flowchart of FIG. 9.
Next, an example of the flow of the reception buffer time determination processing executed in step S122 of FIG. 9 will be described with reference to the flowchart of FIG. 10.
・・・(4)
・・・(5)
ただし、K=1,・・・,N
=冗長符号化ブロック受信待ち時間K
=ARQ再送パケット待ち時間K
=ネットワークジッタ対応バッファ時間K
=受信バッファ時間K
・・・(6) ただし、K=1,・・・,N
Next, the reception buffer time / processing parameter setting processing executed in step S102 of FIG. 9 will be described with reference to the flowchart of FIG. 12.
[Configuration of the Transmission System]
In the above description, the reception buffer time is adjusted only in accordance with the network conditions; however, the present invention is not limited to this, and, for example, a user or the like may be allowed to input requests for image quality and transmission quality, and the reception buffer time may be adjusted based on that input.
An example of the flow of the data transmission processing in this case will be described with reference to the flowchart of FIG. 15.
Next, an example of the flow of the reception buffer time determination processing executed in step S323 of FIG. 15 will be described with reference to the flowcharts of FIGS. 16 and 17.
Next, an example of the flow of the reception buffer dynamic change transmission processing performed by the transmission device 101 and the reception device 303 will be described with reference to the flowchart of FIG. 18.
[Personal Computer]
The series of processes described above can be executed by hardware or by software. In this case, the apparatus may be configured, for example, as a personal computer as shown in FIG. 19.
Claims (9)
- 1. An information processing apparatus comprising: adjusting means for adjusting, in data transmission in which mutually synchronized data are transmitted from a plurality of transmission apparatuses to one reception apparatus, a reception buffer time, which is a buffer time set for each data transmission for synchronizing the respective data at the reception apparatus, using differences in transmission delay, which is the delay time occurring on the transmission path, among the data transmissions; and setting means for setting parameters of processing relating to each data transmission, using the reception buffer time adjusted by the adjusting means.
- 2. The information processing apparatus according to claim 1, wherein the adjusting means obtains the maximum value of the transmission delays and sets, as the reception buffer time, a value obtained by adding the difference between the transmission delay of each data transmission and the maximum value to a specified reception buffer time, which is a predetermined reception buffer time.
- 3. The information processing apparatus according to claim 1, wherein the processing relating to the data transmission is QoS control processing of the data transmission, and the setting means sets, as parameters of the QoS control processing, a redundant coding block reception waiting time, which is the time from the first packet of a redundant coding block until its last packet is received, a retransmission packet waiting time, which is the time to wait for a retransmission packet, and a network jitter-corresponding buffer time for absorbing network jitter.
- 4. The information processing apparatus according to claim 1, wherein the data is encoded at the transmission source, the obtained encoded data is transmitted, and the encoded data is decoded at the transmission destination, and the setting means sets, as a parameter of the processing, a variable compression encoding delay request time required when the encoded data generated under rate control in the encoding is transmitted after smoothing.
- 5. The information processing apparatus according to claim 1, further comprising accepting means for accepting an image quality request, which is a request concerning the image quality of the data, and a transmission quality request, which is a request concerning the transmission quality of the data transmission, wherein the adjusting means adjusts the reception buffer time based on the image quality request and the transmission quality request accepted by the accepting means.
- 6. The information processing apparatus according to claim 5, wherein the adjusting means sets a provisional reception buffer time based on the image quality request and the transmission quality request accepted by the accepting means, and adjusts the reception buffer time using the provisional reception buffer time.
- 7. The information processing apparatus according to claim 5, further comprising output means for displaying a GUI that assists the input of the image quality request and the transmission quality request accepted by the accepting means.
- 8. An information processing method of an information processing apparatus, wherein adjusting means of the information processing apparatus adjusts, in data transmission in which mutually synchronized data are transmitted from a plurality of transmission apparatuses to one reception apparatus, a reception buffer time, which is a buffer time set for each data transmission for synchronizing the respective data at the reception apparatus, using differences in transmission delay, which is the delay time occurring on the transmission path, among the data transmissions, and setting means of the information processing apparatus sets parameters of processing relating to each data transmission using the adjusted reception buffer time.
- 9. A program for causing a computer that performs data transmission to function as: adjusting means for adjusting, in data transmission in which mutually synchronized data are transmitted from a plurality of transmission apparatuses to one reception apparatus, a reception buffer time, which is a buffer time set for each data transmission for synchronizing the respective data at the reception apparatus, using differences in transmission delay, which is the delay time occurring on the transmission path, among the data transmissions; and setting means for setting parameters of processing relating to each data transmission using the reception buffer time adjusted by the adjusting means.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/814,273 US8964921B2 (en) | 2010-08-12 | 2011-08-03 | Information processing apparatus, method, and program |
CN201180038506.6A CN103069835B (zh) | 2010-08-12 | 2011-08-03 | 信息处理装置和方法 |
BR112013002848A BR112013002848A2 (pt) | 2010-08-12 | 2011-08-03 | aparelho e método de processamento de informação, e, programa |
EP11816352.6A EP2605459A1 (en) | 2010-08-12 | 2011-08-03 | Information processing device, method and program |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2010180946A JP5672840B2 (ja) | 2010-08-12 | 2010-08-12 | 情報処理装置および方法、並びにプログラム |
JP2010-180946 | 2010-08-12 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2012020686A1 true WO2012020686A1 (ja) | 2012-02-16 |
Family
ID=45567655
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2011/067802 WO2012020686A1 (ja) | 2010-08-12 | 2011-08-03 | 情報処理装置および方法、並びにプログラム |
Country Status (6)
Country | Link |
---|---|
US (1) | US8964921B2 (ja) |
EP (1) | EP2605459A1 (ja) |
JP (1) | JP5672840B2 (ja) |
CN (1) | CN103069835B (ja) |
BR (1) | BR112013002848A2 (ja) |
WO (1) | WO2012020686A1 (ja) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108628270A (zh) * | 2018-06-11 | 2018-10-09 | 哈尔滨工程大学 | 一种基于plc远程监控终端的优化网络控制装置与方法 |
Families Citing this family (25)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP5598155B2 (ja) * | 2010-08-12 | 2014-10-01 | ソニー株式会社 | 情報処理装置および方法、並びに送受信システム |
US9009341B2 (en) * | 2011-10-11 | 2015-04-14 | Avaya Inc. | Video bandwidth management system and method |
JP2014093655A (ja) * | 2012-11-02 | 2014-05-19 | Sony Corp | 情報処理装置、情報処理方法及びプログラム |
US10341047B2 (en) * | 2013-10-31 | 2019-07-02 | Hewlett Packard Enterprise Development Lp | Method and system for controlling the forwarding of error correction data |
US9432144B2 (en) * | 2014-09-12 | 2016-08-30 | Ciena Corporation | Precision time transfer systems and methods in optical networks |
CN105100675B (zh) * | 2015-09-11 | 2019-07-09 | Tcl集团股份有限公司 | 一种终端视频通信的质量调节方法及系统 |
US9754338B2 (en) | 2015-10-09 | 2017-09-05 | Gt Gettaxi Limited | System to facilitate a correct identification of a service provider |
CN105897759A (zh) * | 2016-06-14 | 2016-08-24 | 青岛乾元通数码科技有限公司 | 一种网络动态自适应音视频缓存方法及系统 |
US10636108B2 (en) * | 2016-09-30 | 2020-04-28 | Lyft, Inc. | Identifying matched requestors and providers |
US11574262B2 (en) | 2016-12-30 | 2023-02-07 | Lyft, Inc. | Location accuracy using local device communications |
US10554783B2 (en) | 2016-12-30 | 2020-02-04 | Lyft, Inc. | Navigation using proximity information |
EP3382918B1 (en) * | 2017-03-30 | 2022-09-28 | ADVA Optical Networking SE | System and method of clock management in a packet data network |
KR101924183B1 (ko) * | 2017-04-25 | 2018-11-30 | 주식회사 님버스 | 젠락 기능을 가진 멀티미디어 송수신 장치 |
US10521881B1 (en) | 2017-09-28 | 2019-12-31 | Apple Inc. | Error concealment for a head-mountable device |
US10594395B2 (en) | 2018-07-23 | 2020-03-17 | Ciena Corporation | Systems and methods for compensating coherent optics delay asymmetry in a packet optical network |
CN109152087B (zh) * | 2018-07-26 | 2022-02-11 | 同方电子科技有限公司 | 一种基于电台的敌我识别功能扩展方法 |
CN111083309B (zh) * | 2018-10-18 | 2022-04-01 | 北京魔门塔科技有限公司 | 一种多传感器数据的时间对齐方法及数据采集设备 |
JP7030673B2 (ja) * | 2018-11-20 | 2022-03-07 | 株式会社東芝 | 送信装置、通信装置、通信システム、送信方法、およびプログラム |
US11910452B2 (en) | 2019-05-28 | 2024-02-20 | Lyft, Inc. | Automatically connecting wireless computing devices based on recurring wireless signal detections |
JP2021013134A (ja) * | 2019-07-09 | 2021-02-04 | ソニー株式会社 | 受信装置、受信方法および送受信システム |
CN112527782B (zh) * | 2019-09-19 | 2023-09-22 | 北京京东振世信息技术有限公司 | 一种数据处理的方法和装置 |
USD997988S1 (en) | 2020-03-30 | 2023-09-05 | Lyft, Inc. | Transportation communication device |
US11887386B1 (en) | 2020-03-30 | 2024-01-30 | Lyft, Inc. | Utilizing an intelligent in-cabin media capture device in conjunction with a transportation matching system |
US11552722B2 (en) | 2020-12-10 | 2023-01-10 | Ciena Corporation | Precision time protocol using a coherent optical DSP frame |
CN117196996B (zh) * | 2023-10-17 | 2024-06-04 | 山东鸿业信息科技有限公司 | 一种数据资源的无接口交互管理方法及系统 |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2002185850A (ja) * | 2000-12-11 | 2002-06-28 | Nippon Telegr & Teleph Corp <Ntt> | 遠隔映像選択混合方法及びシステム |
JP2007311924A (ja) * | 2006-05-16 | 2007-11-29 | Sony Corp | 帯域分析装置及び方法、帯域合成装置及び方法、画像符号化装置及び方法、画像復号装置及び方法、並びにプログラム及び記録媒体 |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2007027813A (ja) * | 2005-07-12 | 2007-02-01 | Sharp Corp | 通信システム |
CN1889533B (zh) * | 2006-08-03 | 2012-06-20 | 华为技术有限公司 | 在网际协议链路上自适应传输第三代网络接口帧的方法 |
JP4356033B2 (ja) * | 2007-05-17 | 2009-11-04 | ソニー株式会社 | 画像データ処理装置および方法 |
JP5745204B2 (ja) * | 2008-07-28 | 2015-07-08 | 株式会社バンダイナムコエンターテインメント | プログラム、情報記憶媒体及びゲーム機 |
JP5178375B2 (ja) * | 2008-07-30 | 2013-04-10 | パナソニック株式会社 | デジタル放送再生装置およびデジタル放送再生方法 |
-
2010
- 2010-08-12 JP JP2010180946A patent/JP5672840B2/ja not_active Expired - Fee Related
-
2011
- 2011-08-03 CN CN201180038506.6A patent/CN103069835B/zh not_active Expired - Fee Related
- 2011-08-03 EP EP11816352.6A patent/EP2605459A1/en not_active Withdrawn
- 2011-08-03 WO PCT/JP2011/067802 patent/WO2012020686A1/ja active Application Filing
- 2011-08-03 US US13/814,273 patent/US8964921B2/en active Active
- 2011-08-03 BR BR112013002848A patent/BR112013002848A2/pt not_active IP Right Cessation
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2002185850A (ja) * | 2000-12-11 | 2002-06-28 | Nippon Telegr & Teleph Corp <Ntt> | 遠隔映像選択混合方法及びシステム |
JP2007311924A (ja) * | 2006-05-16 | 2007-11-29 | Sony Corp | 帯域分析装置及び方法、帯域合成装置及び方法、画像符号化装置及び方法、画像復号装置及び方法、並びにプログラム及び記録媒体 |
JP4371120B2 (ja) | 2006-05-16 | 2009-11-25 | ソニー株式会社 | 画像処理装置及び画像処理方法、プログラム及び記録媒体 |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108628270A (zh) * | 2018-06-11 | 2018-10-09 | 哈尔滨工程大学 | 一种基于plc远程监控终端的优化网络控制装置与方法 |
Also Published As
Publication number | Publication date |
---|---|
BR112013002848A2 (pt) | 2016-06-07 |
EP2605459A1 (en) | 2013-06-19 |
CN103069835B (zh) | 2016-01-20 |
JP5672840B2 (ja) | 2015-02-18 |
CN103069835A (zh) | 2013-04-24 |
US8964921B2 (en) | 2015-02-24 |
JP2012044253A (ja) | 2012-03-01 |
US20130136218A1 (en) | 2013-05-30 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5672840B2 (ja) | 情報処理装置および方法、並びにプログラム | |
JP5598155B2 (ja) | 情報処理装置および方法、並びに送受信システム | |
JP5493471B2 (ja) | 情報処理装置および方法 | |
EP2337306B1 (en) | Transmitting apparatus and method, and receiving apparatus and method | |
US8311122B2 (en) | Information processing apparatus and method | |
JP5397700B2 (ja) | 情報処理装置および方法 | |
US8745432B2 (en) | Delay controller, control method, and communication system | |
EP2200325B1 (en) | Information processor and method therefor | |
US20080259796A1 (en) | Method and apparatus for network-adaptive video coding | |
JP2011223360A (ja) | 送信装置、受信装置、制御方法、及び通信システム | |
WO2011162168A1 (ja) | 情報処理装置および情報処理方法 | |
WO2013154025A1 (ja) | 情報処理装置および方法、並びに、プログラム | |
WO2013154024A1 (ja) | 情報処理装置および方法、並びに、プログラム | |
JP2011147050A (ja) | 画像処理装置および方法 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 201180038506.6 Country of ref document: CN |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 11816352 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2011816352 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 13814273 Country of ref document: US |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
REG | Reference to national code |
Ref country code: BR Ref legal event code: B01A Ref document number: 112013002848 Country of ref document: BR |
|
ENP | Entry into the national phase |
Ref document number: 112013002848 Country of ref document: BR Kind code of ref document: A2 Effective date: 20130205 |