WO2014196189A1 - Data decoding method, data decoding device, and data transmission method - Google Patents
- Publication number
- WO2014196189A1 (PCT/JP2014/002936)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- data
- unit
- buffer
- packet
- decoding
- Prior art date
Links
- 238000000034 method Methods 0.000 title claims abstract description 160
- 239000000872 buffer Substances 0.000 claims abstract description 233
- 230000005540 biological transmission Effects 0.000 claims abstract description 170
- 238000012545 processing Methods 0.000 claims description 116
- 230000008707 rearrangement Effects 0.000 claims description 74
- 239000012634 fragment Substances 0.000 claims description 36
- 238000004891 communication Methods 0.000 claims description 34
- 238000010586 diagram Methods 0.000 description 47
- 238000012986 modification Methods 0.000 description 32
- 230000004048 modification Effects 0.000 description 32
- 230000008569 process Effects 0.000 description 23
- 230000005236 sound signal Effects 0.000 description 23
- 238000003860 storage Methods 0.000 description 18
- 230000003287 optical effect Effects 0.000 description 16
- 238000000605 extraction Methods 0.000 description 10
- 238000006243 chemical reaction Methods 0.000 description 8
- 238000013500 data storage Methods 0.000 description 8
- 230000002452 interceptive effect Effects 0.000 description 6
- 230000003139 buffering effect Effects 0.000 description 5
- 238000009826 distribution Methods 0.000 description 5
- 230000006870 function Effects 0.000 description 5
- 230000001360 synchronised effect Effects 0.000 description 5
- 230000001174 ascending effect Effects 0.000 description 4
- 230000006399 behavior Effects 0.000 description 4
- 238000004364 calculation method Methods 0.000 description 4
- 238000005516 engineering process Methods 0.000 description 4
- 239000010410 layer Substances 0.000 description 4
- 239000004065 semiconductor Substances 0.000 description 4
- 238000001228 spectrum Methods 0.000 description 4
- 230000006835 compression Effects 0.000 description 3
- 238000007906 compression Methods 0.000 description 3
- 239000000470 constituent Substances 0.000 description 3
- 230000000694 effects Effects 0.000 description 3
- 239000000284 extract Substances 0.000 description 3
- 238000007726 management method Methods 0.000 description 3
- 238000012546 transfer Methods 0.000 description 3
- 238000007796 conventional method Methods 0.000 description 2
- 238000013075 data extraction Methods 0.000 description 2
- 238000001914 filtration Methods 0.000 description 2
- 230000010354 integration Effects 0.000 description 2
- 239000004973 liquid crystal related substance Substances 0.000 description 2
- 239000000463 material Substances 0.000 description 2
- 230000002093 peripheral effect Effects 0.000 description 2
- 239000012536 storage buffer Substances 0.000 description 2
- 230000001413 cellular effect Effects 0.000 description 1
- 230000003111 delayed effect Effects 0.000 description 1
- 230000006866 deterioration Effects 0.000 description 1
- 230000001678 irradiating effect Effects 0.000 description 1
- 238000004519 manufacturing process Methods 0.000 description 1
- 239000000203 mixture Substances 0.000 description 1
- 238000010295 mobile communication Methods 0.000 description 1
- 239000013307 optical fiber Substances 0.000 description 1
- 238000013139 quantization Methods 0.000 description 1
- 230000002441 reversible effect Effects 0.000 description 1
- 238000005070 sampling Methods 0.000 description 1
- 239000002356 single layer Substances 0.000 description 1
- 230000009466 transformation Effects 0.000 description 1
Images
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/438—Interfacing the downstream path of the transmission network originating from a server, e.g. retrieving encoded video stream packets from an IP network
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
- H04N19/42—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals characterised by implementation details or hardware specially adapted for video compression or decompression, e.g. dedicated software implementation
- H04N19/423—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals characterised by implementation details or hardware specially adapted for video compression or decompression, e.g. dedicated software implementation characterised by memory arrangements
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
- H04N19/44—Decoders specially adapted therefor, e.g. video decoders which are asymmetric with respect to the encoder
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/44—Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
- H04N21/44004—Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs involving video buffer management, e.g. video decoder buffer or video display buffer
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/60—Network structure or processes for video distribution between server and client or between remote clients; Control signalling between clients, server and network components; Transmission of management data between server and client, e.g. sending from server to client commands for recording incoming content stream; Communication details between server and client
- H04N21/63—Control signaling related to video distribution between client, server and network components; Network processes for video distribution between server and clients or between remote clients, e.g. transmitting basic layer and enhancement layers over different transmission paths, setting up a peer-to-peer communication via Internet between remote STB's; Communication protocols; Addressing
- H04N21/631—Multimode Transmission, e.g. transmitting basic layers and enhancement layers of the content over different transmission paths or transmitting with different error corrections, different keys or with different transmission protocols
Definitions
- the present disclosure relates to a data decoding method, a data decoding device, and a data transmission method.
- the encoded data is generated by encoding content including video data and audio data based on a moving image encoding standard such as HEVC (High Efficiency Video Coding).
- the predetermined transmission format includes, for example, MPEG-2 TS (Moving Picture Experts Group-2 Transport Stream) or MMT (MPEG Media Transport) (see Non-Patent Document 1).
- Patent Literature 1 discloses a technique for transmitting encoded media data for each packet in accordance with MMT.
- a data decoding method includes a reception step (S110) of receiving, for each packet, a plurality of encoded streams included in encoded data and transmitted using each of a plurality of transmission paths, a storage step (S120) of storing a plurality of packets of the received encoded streams in a first buffer (120a), and a rearrangement step of rearranging the plurality of packets stored in the first buffer in decoding order.
- a data transmission method includes a step (S220) of generating a flag indicating whether or not a plurality of packets constituting a plurality of encoded streams included in encoded data need to be rearranged, a step of transmitting the corresponding encoded stream in units of packets using each of a plurality of transmission paths, and a step (S240) of transmitting the flag using at least one of the plurality of transmission paths.
- the encoded data can be appropriately decoded.
- FIG. 1 is a block diagram showing an example of the configuration of the data decoding apparatus according to the first embodiment.
- FIG. 2 is a diagram illustrating an example of a data structure of an MMT stream according to the first embodiment.
- FIG. 3 is a diagram illustrating an example of the data structure of the MPU according to the first embodiment.
- FIG. 4 is a diagram illustrating an example of the data flow of the MMT packet according to the first embodiment.
- FIG. 5 is a diagram illustrating an example of an MMT stream transmitted through each of a plurality of transmission paths according to Embodiment 1.
- FIG. 6 is a diagram illustrating an example of a state in which a plurality of MMT packets according to Embodiment 1 are rearranged.
- FIG. 7 is a flowchart showing an example of the operation of the data decoding apparatus according to the first embodiment.
- FIG. 8 is a flowchart showing an example of a DTS determination method in the data decoding apparatus according to the first embodiment.
- FIG. 9 is a diagram illustrating an example of an operation mode of the rearrangement unit according to the first embodiment.
- FIG. 10 is a block diagram illustrating an example of the configuration of the data transmission apparatus according to the first embodiment.
- FIG. 11 is a flowchart illustrating an example of the operation of the data transmission apparatus according to the first embodiment.
- FIG. 12 is a diagram illustrating an example of a data structure of an MMT stream according to the first modification of the first embodiment.
- FIG. 13 is a diagram illustrating an example of the data flow of the MMT packet according to the second modification of the first embodiment.
- FIG. 14 is a diagram illustrating an example of the data flow of the MMT packet according to the third modification of the first embodiment.
- FIG. 15 is a flowchart illustrating an example of a data decoding method according to the fourth modification of the first embodiment.
- FIG. 16 is a flowchart illustrating an example of a data transmission method according to the fourth modification of the first embodiment.
- FIG. 17 is an overall configuration diagram of a content supply system that implements a content distribution service.
- FIG. 18 is an overall configuration diagram of a digital broadcasting system.
- FIG. 19 is a block diagram illustrating a configuration example of a television.
- FIG. 20 is a block diagram illustrating a configuration example of an information reproducing / recording unit that reads and writes information from and on a recording medium that is an optical disk.
- FIG. 21 is a diagram illustrating a structure example of a recording medium that is an optical disk.
- FIG. 22A illustrates an example of a mobile phone.
- FIG. 22B is a block diagram illustrating a configuration example of a mobile phone.
- FIG. 23 is a diagram showing a structure of multiplexed data.
- FIG. 24 is a diagram schematically showing how each stream is multiplexed in the multiplexed data.
- FIG. 25 is a diagram showing in more detail how the video stream is stored in the PES packet sequence.
- FIG. 26 is a diagram illustrating the structure of TS packets and source packets in multiplexed data.
- FIG. 27 is a diagram illustrating a data structure of the PMT.
- FIG. 28 is a diagram illustrating an internal configuration of multiplexed data information.
- FIG. 29 shows the internal structure of stream attribute information.
- FIG. 30 shows steps for identifying video data.
- FIG. 31 is a block diagram illustrating a configuration example of an integrated circuit that realizes the moving picture coding method and the moving picture decoding method according to each embodiment.
- FIG. 32 is a diagram showing a configuration for switching the drive frequency.
- FIG. 33 is a diagram showing steps for identifying video data and switching between driving frequencies.
- FIG. 34 is a diagram showing an example of a look-up table in which video data standards are associated with drive frequencies.
- FIG. 35A is a diagram illustrating an example of a configuration for sharing a module of a signal processing unit.
- FIG. 35B is a diagram illustrating another example of a configuration for sharing a module of a signal processing unit.
- MMT defines the format of data constituting the MMT package or the packetization method at the time of data transmission, and the data of a plurality of assets constituting the MMT package can be multiplexed and transmitted. Specifically, a plurality of different streams are generated by multiplexing data of a plurality of assets, and the stream is transmitted for each packet using each of the plurality of transmission paths.
- the system decoder model (System Target Decoder, STD) is a model that guarantees, by providing buffers of predetermined sizes, that each of a plurality of access units included in the received encoded data is decoded at its corresponding decoding time without causing buffer overflow or underflow.
- received packets are not necessarily in the decoding order.
- a packet that is later in decoding order may be received first.
- the previously received packet must be held in the buffer until it can be decoded, and the buffer may overflow.
- the buffer may underflow.
- the conventional technique therefore cannot guarantee decoding of the encoded data while reducing the occurrence of buffer overflow or underflow.
- a data decoding method receives a plurality of encoded streams that are included in encoded data and transmitted using each of a plurality of transmission paths.
- the plurality of packets stored in the first buffer are rearranged in the decoding order and decoded, so that the packet can be appropriately decoded. For example, by rearranging packets in order of decoding, packets can be decoded at an appropriate timing, so that occurrence of buffer overflow or underflow can be suppressed.
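The rearrangement described above can be sketched in a few lines; the (decoding_order, payload) tuples below are a hypothetical simplification, since an actual implementation would obtain the decoding order by analyzing MMT packet header information:

```python
# Sketch: rearrange received packets into decoding order before
# handing them to the decoder. The (decoding_order, payload) tuples
# are a hypothetical stand-in for parsed MMT packet headers.

def rearrange(received_packets):
    """Return the packets sorted by their decoding-order index."""
    return sorted(received_packets, key=lambda pkt: pkt[0])

# Packets may arrive out of decoding order when sent over two paths.
arrived = [(2, "AU2"), (0, "AU0"), (3, "AU3"), (1, "AU1")]
in_decoding_order = [payload for _, payload in rearrange(arrived)]
```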
- each of the plurality of packets is associated with one of one or more assets constituting the encoded data, and each of the one or more first buffers is associated with an asset.
- each of the plurality of packets may be distributed to the corresponding asset and stored in the corresponding first buffer.
- since the one or more first buffers are associated with assets, the data of one asset is stored in one first buffer. Therefore, buffer management can be performed easily, processing such as packet rearrangement can be sped up, and delay can be reduced. Further, since the processing is sped up, occurrence of buffer overflow can be reduced.
- each of the plurality of packets includes a packet identifier (packet_id) indicating a corresponding asset.
- the packet may be distributed by acquiring the packet identifier from the packet.
- one of a plurality of modes related to the timing of reading the packet from the first buffer may further be selected, the packet may be read from the first buffer and stored in a second buffer according to the selected mode, and the packet stored in the second buffer may be decoded in the decoding step (S160).
- a packet can be decoded based on a mode in which the delay can be further reduced.
- the plurality of modes may include: a first mode (MPU mode) in which the packet can be read from the first buffer after the entire packet is stored in the first buffer; a second mode (Fragment mode) in which a target division unit, which is one of a plurality of division units constituting the packet, can be read from the first buffer after the target division unit is stored in the first buffer; and a third mode (Media unit mode) in which a part of the target division unit can be read from the first buffer before storage of the target division unit in the first buffer is completed.
- when the second mode or the third mode is selected, data can be read before storage of the packet is completed, so that processing can be sped up. In addition, occurrence of buffer overflow can be suppressed.
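The read-timing difference between the three modes can be illustrated with a small sketch; the `readable_bytes` helper and the byte counts are hypothetical illustrations, not part of the MMT specification:

```python
# Sketch of the three read-timing modes for the first buffer.
# "stored" is how many bytes of the current packet (MPU) have
# arrived so far; all sizes are hypothetical illustrations.

MPU_MODE, FRAGMENT_MODE, MEDIA_UNIT_MODE = range(3)

def readable_bytes(mode, stored, packet_size, unit_size):
    """Bytes that may be read from the first buffer right now."""
    if mode == MPU_MODE:
        # First mode: the whole packet must be stored first.
        return stored if stored >= packet_size else 0
    if mode == FRAGMENT_MODE:
        # Second mode: only complete division units may be read.
        return (stored // unit_size) * unit_size
    # Third mode: data may be read as it arrives.
    return stored

# A 4000-byte MPU made of 1000-byte division units, 2500 bytes received:
assert readable_bytes(MPU_MODE, 2500, 4000, 1000) == 0
assert readable_bytes(FRAGMENT_MODE, 2500, 4000, 1000) == 2000
assert readable_bytes(MEDIA_UNIT_MODE, 2500, 4000, 1000) == 2500
```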
- the plurality of division units may be access units or NAL units.
- a mode flag indicating the selection mode may be acquired from the encoded data, and the selection mode may be selected based on the acquired mode flag.
- the receiving side can decode the packet in the mode assumed at the time of transmission.
- the data decoding method may further include a time acquisition step (S142) of acquiring first time information for determining the decoding time of each of the plurality of packets, and a time calculation step (S143) of calculating, from the first time information, second time information indicating the decoding time of each of a plurality of division units constituting the packet; in the rearranging step (S150), the packet may be stored in the second buffer for each division unit with reference to the second time information.
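A minimal sketch of the time calculation step, assuming (hypothetically) a constant frame interval; actual MP4/MMT metadata carries per-sample durations:

```python
from fractions import Fraction

# Sketch of the time calculation step (S143): derive second time
# information (a DTS per access unit) from first time information
# (the absolute DTS of the packet's first access unit). The constant
# frame interval is a simplifying assumption made here.

def access_unit_dts_list(first_dts, num_units, frame_rate):
    """Decoding times of num_units consecutive access units."""
    interval = Fraction(1, frame_rate)
    return [first_dts + i * interval for i in range(num_units)]

# 4 access units at 60 fps, first DTS at 10 seconds:
dts = access_unit_dts_list(Fraction(10), 4, 60)
```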
- the plurality of packets may be transmitted through any one of the plurality of transmission paths, and the packets in the transmission paths may be transmitted in decoding order.
- this allows each encoded stream to be decoded independently. For example, it is possible to decode and reproduce only the first encoded stream transmitted through the first transmission path.
- the packet may be an MPU (Media Processing Unit).
- the plurality of transmission paths may include broadcasting and communication.
- the encoded data can be transmitted using two physically different media: broadcasting and communication.
- the data decoding device receives, for each packet, a plurality of encoded streams included in the encoded data and transmitted using each of the plurality of transmission paths.
- a plurality of packets stored in the first buffer are rearranged in the decoding order and decoded.
- accordingly, the packets can be appropriately decoded. For example, by rearranging packets in decoding order, packets can be decoded at an appropriate timing, so that occurrence of buffer overflow or underflow can be suppressed.
- a data transmission method includes a step (S220) of generating a flag indicating whether or not a plurality of packets constituting a plurality of encoded streams included in encoded data need to be rearranged, a step of transmitting the corresponding encoded stream in units of packets using each of a plurality of transmission paths, and a step (S240) of transmitting the flag using at least one of the plurality of transmission paths.
- the processing amount can be reduced.
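The flag generation in step S220 can be sketched as a simple order check; the list of decoding-order indices is a hypothetical representation of the planned transmission schedule:

```python
# Sketch of generating the rearrangement flag (S220): if the packets,
# in the order they will be transmitted, are already in decoding
# order, the receiving side can skip rearrangement and reduce its
# processing amount. The index lists are hypothetical.

def needs_rearrangement(transmission_order):
    """True if any packet precedes one with a smaller decoding index."""
    return any(a > b for a, b in zip(transmission_order,
                                     transmission_order[1:]))

assert needs_rearrangement([0, 2, 1, 3]) is True
assert needs_rearrangement([0, 1, 2, 3]) is False
```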
- FIG. 1 is a block diagram showing a configuration of data decoding apparatus 100 according to the present embodiment.
- the data decoding apparatus 100 decodes encoded data including a plurality of encoded streams transmitted using a plurality of transmission paths.
- the data decoding device 100 includes a filter unit 110, an MMT buffer unit 120, a decoding order acquisition unit 130, a time acquisition unit 140, a rearrangement unit 150, and an encoded data storage unit 160. And a decoding unit 170.
- the filter unit 110 receives the encoded stream for each packet and filters the received packet. Specifically, the filter unit 110 distributes the received packet for each asset.
- the filter unit 110 includes a receiving unit 111 and a storage unit 112.
- the reception unit 111 receives, for each packet, a plurality of encoded streams included in the encoded data and transmitted using each of the plurality of transmission paths. That is, the receiving unit 111 receives a corresponding encoded stream for each packet from each of the plurality of transmission paths.
- the plurality of encoded streams included in the encoded data are transmitted for each packet via the corresponding transmission path in a one-to-one correspondence with the plurality of transmission paths.
- each of the encoded streams is an independently decodable stream, and specifically, an MMT stream composed of a plurality of MMT packets.
- each of the plurality of packets is associated with one or more assets constituting the encoded data.
- each of the plurality of packets includes a packet identifier (packet_id) indicating a corresponding asset.
- an asset is a data entity including data having the same transport characteristics, and is one of video data, audio data, and the like, for example.
- an asset corresponds to an encoded stream of AV data.
- each hierarchical stream corresponds to a different asset. Details of the MMT packet, MMT stream, asset, and packet identifier will be described later with reference to FIG. 2.
- the storage unit 112 stores a plurality of packets of a plurality of received encoded streams in the first buffer 120a.
- the number of first buffers 120a is one or more, and one asset is associated with each of the one or more first buffers 120a.
- the storage unit 112 distributes each of the plurality of packets to the corresponding asset and stores it in the corresponding first buffer 120a. For example, the storage unit 112 distributes packets by acquiring a packet identifier from the packet.
- the MMT buffer unit 120 includes one or more first buffers 120a. Specifically, one or more assets are associated with the one or more first buffers 120a on a one-to-one basis. For example, the MMT buffer unit 120 includes the same number of first buffers 120a as the number of assets constituting the encoded stream. For example, when the encoded stream includes video data and audio data, the MMT buffer unit 120 includes two first buffers 120a: a buffer for storing video data and a buffer for storing audio data.
- the first buffer 120a is an MMT packet input buffer.
- the payload data of the MMT packet is rearranged in the decoding order.
- the access unit data is rearranged in the decoding order.
- the rearranged access unit data is stored in the corresponding second buffer 160a in the decoding order in accordance with the predetermined timing.
- the decoding order acquisition unit 130 acquires information indicating the decoding order of the packets from each of the plurality of packets. For example, the decoding order acquisition unit 130 acquires the decoding order of payloads by analyzing packet header information. Specifically, the decoding order acquisition unit 130 acquires the decoding order of an access unit included in the payload or a unit (for example, a NAL unit) obtained by dividing the access unit.
- the time acquisition unit 140 acquires first time information for determining a decoding time (DTS: Decode Time Stamp) or a presentation time (PTS: Presentation Time Stamp) of each of a plurality of packets. Specifically, the time acquisition unit 140 acquires the first time information by acquiring and analyzing configuration information (CI: Composition Information) included in the header part of the MMT stream.
- the first time information is, for example, the absolute value of the top DTS or PTS of the access unit included in the packet.
- the rearrangement unit 150 rearranges the plurality of packets stored in the first buffer 120a in the decoding order. For example, the rearrangement unit 150 rearranges a plurality of packets in the decoding order in the storage area of the first buffer 120a. Alternatively, the rearrangement unit 150 may rearrange a plurality of packets in the decoding order by outputting the packets from the first buffer 120a in the decoding order.
- the rearrangement unit 150 selects one of a plurality of modes related to the timing of reading the packet from the first buffer 120a, reads the packet from the first buffer 120a according to the selected mode that is the selected mode, Store in the buffer 160a. Specifically, rearrangement unit 150 acquires a mode flag indicating the selection mode from the encoded data, and selects the selection mode based on the acquired mode flag.
- the plurality of modes include, for example, an MPU mode, a Fragment mode, and a Media unit mode. Details of these modes will be described later with reference to FIG. 9.
- the rearrangement unit 150 calculates second time information indicating each decoding time of the division unit constituting the packet based on the first time information, for example.
- the rearrangement unit 150 refers to the second time information and stores the packet in the second buffer 160a for each division unit.
- the division unit is, for example, an access unit or a NAL unit.
- the encoded data storage unit 160 includes one or more second buffers 160a. Specifically, one or more assets are associated one-to-one with one or more second buffers 160a. For example, the encoded data storage unit 160 includes the same number of second buffers 160a as the number of a plurality of assets constituting the encoded stream.
- the decoding unit 170 decodes a plurality of packets rearranged in the decoding order. For example, the decoding unit 170 decodes a packet stored in the encoded data storage unit 160. Specifically, the decoding unit 170 decodes a packet in units of access units or NAL units based on a moving image coding standard such as HEVC.
- the decoded data (video data, audio data, etc.) generated by the decoding is output to a display, a speaker, etc., for example.
- FIG. 2 is a diagram showing a data structure of MMT stream 200 according to the present embodiment.
- the MMT stream 200 is one of one or more streams that constitute one MMT package.
- the MMT package corresponds to one piece of encoded data.
- the MMT package corresponds to one broadcast program content.
- One MMT package is divided into a plurality of MMT streams 200 and transmitted via a plurality of transmission paths.
- the encoded data is AV data encoded based on a moving image encoding standard such as HEVC.
- the encoded data includes video data, audio data, metadata attached to these, still images, files, and the like.
- Encoded data is converted to MP4 data according to the MP4 file format, which is one of the system multiplexing standards. The MMT packet is then generated by packetizing the MP4 data, and the MMT stream 200 is generated by multiplexing a plurality of MMT packets. Note that the MP4 data need not be in a complete format; it suffices that it includes at least sample data, which is the storage unit of encoded data in MP4.
- the MMT stream 200 includes a plurality of MMT packets 210.
- the MMT stream 200 is a packet sequence of a plurality of MMT packets 210.
- the MMT packet 210 is data in which one or more MPUs (Media Processing Units) are packetized.
- one MPU or a unit obtained by dividing an MPU constitutes one MMT packet 210.
- the MPU is a data unit defined in the MMT standard, for example, MP4 data.
- the MMT packet 210 is associated with one or more assets constituting the encoded data.
- the asset is, for example, video data, audio data, or metadata.
- the MMT packet 210 includes a header 220 and a payload 230.
- the payload 230 includes one or more access units 231.
- the header 220 is attached information regarding the payload 230.
- the header 220 includes a packet identifier 221 and time information 222.
- the packet identifier 221 is an identification number (packet_id) indicating an asset corresponding to the packet among one or more assets constituting the encoded data. Specifically, the packet identifier 221 indicates the asset to which the payload 230 corresponds. For example, in FIG. 2, the packet identifier 221 is indicated by “ID1” and “ID2”.
- the packet_id is identification information unique to each asset constituting the MMT package.
- the time information 222 is time information for determining the presentation time (PTS) or the decoding time (DTS) of the access unit 231.
- the time information 222 is a relative value of PTS or DTS with respect to a predetermined time. A specific method for determining the PTS or DTS of the access unit 231 using the time information 222 will be described later.
- the payload 230 includes one or more access units 231 included in the corresponding asset among the one or more assets.
- the access unit 231 includes, for example, one or more NAL units.
- the payload 230 includes a plurality of access units 231 included in one GOP (Group Of Pictures).
- the payload 230 may include only one NAL unit or a unit obtained by dividing the NAL unit.
- the MMT stream 200 includes correspondence information 240 and time offset information 250 as service information.
- the correspondence information 240 includes, for example, information (for example, MPT (MMT Package Table)) indicating a correspondence relationship between a plurality of transmission paths such as broadcasting and communication and a plurality of packages.
- the correspondence information 240 includes information indicating a correspondence relationship between a plurality of assets such as video data and audio data and an identification number (packet_id).
- the time offset information 250 is time information for determining the PTS or DTS of the access unit 231.
- the time offset information 250 indicates an absolute PTS or DTS of a predetermined access unit.
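Under the structures above, determining an access unit's DTS reduces to adding the relative value carried in the time information 222 to the absolute base carried in the time offset information 250. A minimal sketch (the argument names and 90 kHz clock units are illustrative assumptions):

```python
# Sketch: determining an access unit's absolute DTS from the service
# information described above. Argument names and the 90 kHz clock
# units are illustrative assumptions, not fields of the standard.

def absolute_dts(time_offset_info, time_info):
    """time_offset_info: absolute DTS of a predetermined access unit.
    time_info: DTS of this access unit relative to that unit."""
    return time_offset_info + time_info

assert absolute_dts(90000, 3003) == 93003  # e.g. 90 kHz clock ticks
```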
- FIG. 3 is a diagram showing the data structure of the MPU according to the present embodiment. As described above, in this embodiment, one MPU constitutes one MMT packet.
- the MPU 300 includes a plurality of boxes. Specifically, the MPU 300 includes an ftyp / syp box 310, an mmpu box 320, and N (N is an integer of 1 or more) box sets 330. Each of the N box sets 330 includes a moof box 331 and an mdat box 332.
- the ftyp / syp box 310 includes information indicating the file type of the MPU 300, for example.
- the mmpu box 320 includes the sequence number of the MPU 300 and the packet_id.
- the moof box 331 includes MP4 header information.
- the moof box 331 includes time information such as a presentation time or a decoding time of a sample included in the mdat box 332.
- the moof box 331 includes a traf box and a tfdt box (not shown).
- the mdat box 332 includes content data such as video and audio.
- the mdat box 332 includes a sample or a subsample, specifically, an access unit or a NAL unit.
- FIG. 4 is a diagram showing an example of the data flow of the MMT packet according to the present embodiment.
- the plurality of transmission paths are physically different transmission paths.
- the plurality of transmission paths are connected to different interfaces provided in the data decoding apparatus 100, for example, different connectors or connection ports.
- the plurality of transmission paths include, for example, broadcasting and communication. Broadcasting includes terrestrial broadcasting, satellite broadcasting, CATV, and the like.
- the communication includes wireless communication such as the Internet and wired communication such as optical fiber communication.
- one television program content is transmitted using two transmission paths of terrestrial broadcasting and Internet communication.
- 60 fps basic layer data is transmitted via terrestrial broadcasting
- 60 fps extension layer data is transmitted via Internet communication.
- when receiving only terrestrial broadcast data, the receiving side can reproduce the TV program content at a frame rate of 60 fps.
- when receiving both the terrestrial broadcast data and the Internet communication data, the receiving side can reproduce higher-definition television program content at a frame rate of 120 fps.
- FIG. 4 shows a data flow when the i-th MMT packet (MMTs (i)) of the s-th MMT stream arrives.
- the filter unit 110 filters the MMT stream for each MMT packet based on packet_id. Specifically, when the receiving unit 111 receives an MMT packet, the filter unit 110 acquires packet_id from the header of the MMT packet. Then, the storage unit 112 distributes the MMT packet to the corresponding asset based on the acquired packet_id. For example, when the packet_id of the received MMT packet is “m”, the storage unit 112 stores the MMT packet in the MMTBm 121.
- MMTBm 121 is the first buffer corresponding to the asset whose packet_id is “m”.
- the MMTBn 122 is a first buffer corresponding to an asset whose packet_id is “n”.
- Each of the MMTBm 121 and the MMTBn 122 is a buffer for storing MMT payload data, and is an example of one or more first buffers 120a (MMTB) provided in the MMT buffer unit 120 illustrated in FIG.
- m, n, and k indicate the m-th, n-th, and k-th assets, respectively.
- m indicates video data
- n indicates audio data
- k indicates other data such as metadata.
- Other data includes, for example, configuration information.
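The distribution of received packets into per-asset first buffers described above can be sketched as follows. This is a minimal illustration under stated assumptions; the packet and buffer representations (dicts keyed by packet_id) are hypothetical simplifications, not the actual MMT format.

```python
from collections import defaultdict, deque

def store_packets(packets):
    """Sketch of the storage unit (S120): distribute MMT packets
    into per-asset first buffers (MMTB) keyed by packet_id.

    packets: iterable of dicts with 'packet_id' and 'payload' keys.
    """
    mmtb = defaultdict(deque)  # one first buffer (MMTB) per packet_id
    for pkt in packets:
        # The storage unit reads packet_id from the MMT packet header
        # and stores the packet in the corresponding MMTB.
        mmtb[pkt["packet_id"]].append(pkt["payload"])
    return mmtb

buffers = store_packets([
    {"packet_id": "m", "payload": "video-AU0"},
    {"packet_id": "n", "payload": "audio-AU0"},
    {"packet_id": "m", "payload": "video-AU1"},
])
```

Here the buffer keyed by "m" plays the role of MMTBm 121 (video) and the one keyed by "n" the role of MMTBn 122 (audio).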
- the plurality of MMT packets stored in MMTBm 121 or MMTBn 122 are rearranged in the decoding order and output to picture buffer 161 or audio buffer 162.
- Each of the picture buffer 161 and the audio buffer 162 is an example of the second buffer 160a shown in FIG.
- the picture buffer 161 is a video stream storage buffer (EB: Elementary stream Buffer).
- the picture buffer 161 is a CPB (Coded Picture Buffer) in MPEG-4 AVC or HEVC.
- the picture buffer 161 is a VBV (Video Buffer Verifier) buffer in MPEG-2 Video.
- the audio buffer 162 is an audio stream storage buffer.
- a plurality of access units constituting the same asset are input to the picture buffer 161 in the decoding order and output in the decoding order.
- a plurality of access units constituting the same video data are input to the picture buffer 161 in the decoding order and output in the decoding order.
- the picture buffer 161 outputs the stored access unit to the image decoding unit 171 based on the DTS.
- FIG. 4 shows an example in which the jth access unit AUm (j) corresponding to the mth asset is output. The same applies to the audio buffer 162.
- the image decoding unit 171 decodes the access unit input from the picture buffer 161. For example, the image decoding unit 171 decodes the access unit based on a moving image coding standard such as HEVC, and stores the decoded video data in the reference buffer 181. The image decoding unit 171 outputs video data decoded based on the PTS.
- the reference buffer 181 is a buffer for storing a reference picture in video data, that is, a decoded picture.
- the reference buffer 181 is a DPB (Decoded Picture Buffer) in MPEG-4 AVC or HEVC.
- the reference buffer 181 is a re-order buffer in MPEG-2 Video.
- the audio decoding unit 172 decodes the access unit input from the audio buffer 162. For example, the audio decoding unit 172 decodes the access unit based on audio encoding standards such as MPEG-2 AAC and MPEG-4 AAC, and outputs the decoded audio data in synchronization with the video data.
- the image decoding unit 171 and the audio decoding unit 172 are each included in the decoding unit 170 shown in FIG.
- when an MMT packet of the m-th asset, that is, an MMT packet corresponding to video data, is input, it passes through the MMTBm 121, the picture buffer 161, the image decoding unit 171, and the reference buffer 181 in order, and is output as video data.
- similarly, when an MMT packet of the n-th asset (audio data) is input, it passes through the MMTBn 122, the audio buffer 162, and the audio decoding unit 172 in order, and is output as audio data.
- when an MMT packet of the k-th asset (other data such as metadata) is input, it is subjected to the necessary processing.
- FIG. 5 is a diagram illustrating an example of an MMT stream transmitted through each of a plurality of transmission paths according to the present embodiment.
- FIG. 6 is a diagram showing an example of how a plurality of MMT packets according to the present embodiment are rearranged.
- “x” of “MPUxy” indicates packet_id
- “y” indicates the sequence number of the MPU in the same asset.
- “MPU10” indicates an MPU whose packet_id is “1” and whose sequence number is “0”.
- the ascending order of sequence numbers corresponds to the decoding order.
- an MMT stream including MMT packets of a plurality of assets is transmitted on the transmission path # 0.
- the MMT stream transmitted through the transmission path # 0 includes an MMT packet (such as MPU10) whose packet_id is “1” and an MMT packet (such as MPU20) whose packet_id is “2”.
- the MMT stream transmitted through the transmission path # 1 includes an MMT packet (MPU 12 or the like) whose packet_id is “1”.
- a plurality of MMT packets constituting the MMT package are transmitted through any one of a plurality of transmission paths.
- the MMT packet is transmitted through only one of a plurality of transmission paths, and is not transmitted through two or more transmission paths.
- the MPU 10 is transmitted only through the transmission path # 0 and is not transmitted through the transmission path # 1 and other transmission paths.
- packets in the transmission path are transmitted in decoding order.
- MMT packets are transmitted in ascending order of sequence numbers for each packet_id in the transmission path. Specifically, focusing on the MMT packets whose packet_id is “1” shown in FIG. 5, the MPU 10, the MPU 11, the MPU 14, and the MPU 15 are transmitted in ascending order of sequence numbers.
- an MMT stream transmitted through a transmission path can be decoded independently of other MMT streams transmitted through other transmission paths.
- the filter unit 110 filters, for example, only packets whose packet_id is “1”. As a result, as shown in FIG. 6A, the MMTB stores only the packets whose packet_id is “1” in the order of arrival.
- MMT packets constituting the same asset, that is, MMT packets having the same packet_id, may be transmitted through a plurality of transmission paths. Therefore, even for MMT packets of the same asset, the arrival order at the system decoder does not necessarily match the decoding order. For example, as shown in FIG. 6A, the MPU 14 with a larger sequence number arrives earlier than MPUs with smaller sequence numbers.
- the rearrangement unit 150 rearranges the MMT packets so that they are in the decoding order, that is, the ascending order of the sequence numbers.
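The rearrangement step above can be sketched as a sort by MPU sequence number (a minimal illustration; the tuple representation of an MPU is an assumption for this sketch):

```python
def rearrange(mpus):
    """Sketch of the rearrangement unit: reorder MPUs of one asset
    into decoding order, i.e. ascending sequence number.

    mpus: list of (sequence_number, data) tuples in arrival order.
    """
    return sorted(mpus, key=lambda mpu: mpu[0])

# Arrival order mixing transmission path #0 (seq 0, 1, 4)
# and transmission path #1 (seq 2, 3), as in FIG. 6A:
arrived = [(0, "MPU10"), (1, "MPU11"), (4, "MPU14"), (2, "MPU12"), (3, "MPU13")]
decoding_order = rearrange(arrived)
```

In practice the rearrangement unit 150 works incrementally as packets arrive rather than sorting a complete list, but the ordering criterion is the same.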
- FIG. 7 is a flowchart showing an example of the operation of the data decoding apparatus 100 according to the present embodiment.
- the MMT reference clock is NTP (Network Time Protocol), and DTS and PTS are also set based on NTP.
- NTP does not have to be a value acquired from an NTP server, and may be substituted with a value uniquely set by a broadcasting station, for example.
- the data decoding apparatus 100 acquires configuration information (S100). Specifically, the data decoding apparatus 100 acquires and analyzes the configuration information of the plurality of transmission paths that transmit the MMT package, thereby acquiring, as configuration information, the information necessary for reception from each transmission path and information indicating the correspondence between assets and packet identifiers. For example, the information required for reception is a network ID in the case of broadcasting and a URL (Uniform Resource Locator) in the case of communication.
- the configuration information is included in, for example, headers at the top of a plurality of MMT streams.
- the configuration information includes correspondence information 240 and time offset information 250 shown in FIG.
- the time acquisition unit 140 acquires time offset information 250 included in the configuration information. After receiving the configuration information, reception of MMT assets is started.
- the receiving unit 111 receives an MMT packet from one of a plurality of transmission paths (S110). Specifically, the reception unit 111 acquires MMT packets transmitted through each of the plurality of transmission paths in the order of arrival.
- the storage unit 112 distributes the received MMT packets based on packet_id and stores them in the MMTB (S120). In other words, the storage unit 112 filters the MMT packets, sorts them by asset, and stores each in the MMTB corresponding to its asset.
- the decoding order acquisition unit 130 acquires the decoding order of the access units by analyzing the header information of the MMT packet (S130). Specifically, the decoding order acquisition unit 130 determines the MPU decoding order based on a parameter indicating the MPU sequence number stored in the header of the MMT packet. Alternatively, the decoding order acquisition unit 130 may acquire the decoding order of units (subsamples) obtained by dividing the access unit.
- the rearrangement unit 150 calculates the DTS for each access unit (S140).
- the rearrangement unit 150 calculates a DTS for each subsample when the access unit is decoded in units of subsamples.
- a specific example of the DTS calculation method will be described later with reference to FIG.
- the rearrangement unit 150 rearranges the payload data of the received MMT packets so that they are in decoding order, separates them in units of access units, and stores them in the second buffer 160a (for example, a picture buffer) (S150).
- the timing of reading (pulling) the access unit from the first buffer 120a (MMTB) into the second buffer 160a is defined by one of a plurality of predefined modes (the selected mode).
- the decoding unit 170 decodes the access unit based on the DTS (S160). Specifically, the encoded data storage unit 160 extracts the target access unit from the second buffer 160a based on the DTS of the access unit and outputs it to the decoding unit 170. For example, the encoded data storage unit 160 extracts the access unit or subsample from the second buffer 160a according to the access unit or subsample decoding time calculated by the rearrangement unit 150 and outputs it to the decoding unit 170. Then, the decoding unit 170 decodes the input target access unit.
- Whether the access unit is decoded in units of subsamples may be set by the data decoding apparatus 100, or may be set by MMT configuration information or information included in the MMT message.
- when packet reception is completed (Yes in S170), the data decoding apparatus 100 ends the decoding process. If packet reception is not completed (No in S170), the process from packet reception (S110) is repeated.
- PTSm(i), which is the presentation time of the i-th access unit in the display order in the m-th MPU (MMT packet), is calculated using the following (Equation 1).
- PTSm(i) = PTSm(0) + deltaPTS(i) (Equation 1)
- PTSm (0) is the presentation time of the first access unit in the display order of the mth MPU.
- DeltaPTS (i) is a difference value between PTSm (i) and PTSm (0).
- DTSm(i), which is the decoding time of the i-th access unit in the decoding order in the m-th MPU (MMT packet), is calculated using the following (Equation 2).
- DTSm(i) = DTSm(0) + deltaDTS(i) (Equation 2)
- DTSm (0) is the decoding time of the access unit that is the head in the decoding order of the mth MPU.
- DeltaDTS (i) is a difference value between DTSm (i) and DTSm (0).
- the presentation time PTSm (0) and the decoding time DTSm (0) of the head access unit are examples of first time information included in configuration information (CI) acquired separately from video data, audio data, and the like.
- PTSm (0) and DTSm (0) correspond to the time offset information 250 shown in FIG.
- DeltaPTS (i) and deltaDTS (i) are obtained by analyzing a traf box included in a moof box, which is the header information of MP4.
- deltaPTS (i) and deltaDTS (i) correspond to the time information 222 shown in FIG.
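(Equation 1) and (Equation 2) above can be illustrated directly; the tick values below are purely illustrative, not values from the specification:

```python
def pts(pts_m0, delta_pts_i):
    """PTSm(i) = PTSm(0) + deltaPTS(i)  -- (Equation 1)."""
    return pts_m0 + delta_pts_i

def dts(dts_m0, delta_dts_i):
    """DTSm(i) = DTSm(0) + deltaDTS(i)  -- (Equation 2)."""
    return dts_m0 + delta_dts_i

# The first access unit of the MPU decodes at 90000 and is presented
# at 93000 (from the time offset information in the configuration
# information); the third AU (i = 2) carries header offsets
# deltaDTS(2) = deltaPTS(2) = 6000 (from the moof/traf box).
dts_2 = dts(90000, 6000)
pts_2 = pts(93000, 6000)
```

The absolute base times come from the configuration information (time offset information 250), while the per-access-unit deltas come from the MP4 header information (time information 222).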
- FIG. 8 is a flowchart showing an example of a DTS determination method in the data decoding apparatus according to the present embodiment.
- the rearrangement unit 150 acquires the DTS of the access unit that is the head in the decoding order in the MPU (S141). For example, the rearrangement unit 150 acquires DTSm (0) by acquiring the first time information extracted from the configuration information by the time acquisition unit 140.
- the rearrangement unit 150 acquires difference information (deltaDTS(i)) from the header of the target access unit (S142). For example, the rearrangement unit 150 analyzes the traf box in the moof box, which is MP4 header information, and acquires the difference information. If the moof box is not included in the MPU, the DTS or PTS for each access unit may be determined based on the header information of the MMT packet or MMT payload, time information transmitted by an MMT message, or the like.
- the rearrangement unit 150 calculates the DTS of the target access unit (S143). Specifically, the rearrangement unit 150 calculates DTSm(i) as shown in (Equation 2) by adding the acquired DTSm(0) and deltaDTS(i).
- the PTS and DTS for each access unit may be signaled in the header of an RTP (Real-time Transport Protocol) packet. Since the MMT packet can have the same packet structure as RTP, they can also be signaled in the header of the MMT packet or the header information of the MMT payload. Similarly, when the MMT payload is transmitted by MPEG-2 TS, the PTS and DTS for each access unit may be signaled in the header of a PES (Packetized Elementary Stream) packet. Thereby, the rearrangement unit 150 can acquire the PTS and DTS for every access unit by analyzing the header of the RTP packet or the header of the PES packet.
- PTSm (0) may be calculated.
- the DTS or PTS of the first access unit of the second and subsequent MPUs may not be included in the configuration information.
- the rearrangement unit 150 uses the information indicating the absolute value of the difference time between the DTS (or PTS) of the first sample of the plurality of division units constituting the MPU and the DTS (or PTS) of the first sample of the first MPU. Based on this, the DTS (or PTS) of the head access unit of each division unit may be calculated. Once the DTS is calculated, the PTS can be acquired by analyzing the information in the division unit.
- the division unit is, for example, a Track Fragment, which is a fragment unit in MP4. As information indicating the absolute value of the difference time, the tfdt box indicates the absolute value of the difference time between the DTS of the first sample of the first MPU of the same asset and the DTS of the first access unit of the Track Fragment in which the tfdt box is stored.
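The derivation above can be sketched as follows; the tick values and the list-of-offsets representation of the tfdt-style difference times are illustrative assumptions:

```python
def fragment_dts(first_mpu_dts, tfdt_offset):
    """Sketch: the DTS of the first access unit of a division unit
    (Track Fragment) equals the DTS of the first sample of the
    asset's first MPU plus the fragment's difference time."""
    return first_mpu_dts + tfdt_offset

first_mpu_dts = 90000
# Difference times (e.g. carried by tfdt boxes) for fragments 0, 1, 2:
offsets = [0, 3000, 6000]
fragment_start_dts = [fragment_dts(first_mpu_dts, off) for off in offsets]
```

Once each fragment's head DTS is known this way, the PTS of the remaining access units can be derived from the information inside the division unit, as the text notes.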
- FIG. 9 is a diagram illustrating an example of an operation mode of the rearrangement unit 150 according to the present embodiment.
- the rearrangement unit 150 can execute processing according to one of three operation modes.
- the first mode is a mode in which the packet can be read from the first buffer 120a after the entire packet is stored in the first buffer 120a.
- the first mode is an MPU mode.
- the rearrangement unit 150 waits until all the data of the MPU is ready, and then reads the target packet from the MMTB. For example, the rearrangement unit 150 reads packets from the MMTB in units of access units, and stores the read access units in the second buffer 160a.
- the second mode is a mode in which the packet can be read from the first buffer 120a after the target division unit, which is one of a plurality of division units constituting the packet, is stored in the first buffer 120a.
- the second mode is a Fragment mode.
- the rearrangement unit 150 waits until all the data of the Fragment, which is a division unit of the MPU, is ready, and then reads the target Fragment from the MMTB.
- the rearrangement unit 150 stores the read Fragment in the second buffer 160a.
- Fragment is a sample (corresponding to an access unit) in MP4 or a subsample which is a unit obtained by dividing the sample.
- the third mode is a mode in which a part of the target division unit can be read from the first buffer 120a before the packet is completely stored in the first buffer 120a.
- the third mode is a Media Unit mode.
- in the Media Unit mode, the rearrangement unit 150 reads a part of the MPU or Fragment from the MMTB without waiting until all the data of the MPU or Fragment is ready.
- the rearrangement unit 150 stores a part of the read MPU or Fragment in the second buffer 160a.
- the data stored in the second buffer 160a is output to the decoding unit 170 in units of access units based on the DTS.
- an MPEG-4 AVC or HEVC NAL unit, or an HEVC decoding unit, may be used as a subsample and output from the second buffer 160a to the decoding unit 170 at the decoding time of the subsample.
- in the case of a fixed frame rate, the decoding time of a subsample can be calculated by dividing the frame interval by the number of subsamples (or subsamples including slice data) or decoding units in the access unit. In the case of a variable frame rate, the decoding time of a subsample may be determined based on additional information such as Picture Timing SEI.
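The fixed-frame-rate case above can be sketched as an even split of the frame interval among the decoding units (tick values illustrative; integer division is an assumption of this sketch):

```python
def subsample_dts(au_dts, frame_interval, num_units, k):
    """Sketch: decoding time of the k-th decoding unit (subsample)
    in an access unit, for a fixed frame rate: the frame interval
    is divided evenly among the num_units decoding units."""
    return au_dts + k * frame_interval // num_units

au_dts = 90000
frame_interval = 3000  # e.g. 90 kHz ticks at 30 fps
times = [subsample_dts(au_dts, frame_interval, 3, k) for k in range(3)]
```

For a variable frame rate this even split does not apply, which is why the text falls back to additional information such as Picture Timing SEI.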
- the rearrangement unit 150 can apply the above three modes independently of the MMT packetization unit.
- reading (extraction) of data from the MMTB can be performed continuously in sub-sample units or at a predetermined bit rate.
- when the decoding unit 170 performs decoding in units of subsamples, extraction from the second buffer 160a is also performed in units of subsamples.
- when the decoding unit 170 performs decoding in units of access units, extraction of data from the MMTB can be performed in units of access units. The same applies when packetization is performed using a protocol such as RTP or TS.
- the buffering time of received data can be reduced by using the Fragment mode or the Media Unit mode, so that processing can be performed with low delay. That is, the decoding process can be speeded up.
- when data of the same asset is transmitted through a single transmission path, the reception order (arrival order or storage order) of the MMT packets is the same as the decoding order. For this reason, the rearrangement unit 150 does not need to rearrange the MMT packets. For example, on the transmission side, a flag indicating whether or not rearrangement is necessary may be included in the configuration information or the MMT message and transmitted. Thereby, the rearrangement unit 150 operates so as not to perform the rearrangement when the rearrangement is unnecessary.
- when only broadcasting is used as the transmission path and data of the same asset is transmitted through a single channel, the rearrangement unit 150 need not rearrange the MMT packets. However, even when only broadcasting is used, the rearrangement unit 150 rearranges the MMT packets in the decoding order when data of the same asset is transmitted through a plurality of broadcast channels.
- the reception order of MMT packets and the decoding order may not be equal in a communication path such as an IP network.
- the rearrangement unit 150 rearranges the MMT packets in the decoding order.
- the rearrangement unit 150 may select only the Fragment mode or the Media Unit mode, which can be processed in units of access units, in order to be consistent with the MPEG-2 TS system target decoder.
- the rearrangement unit 150 can use two types of modes that define the extraction rate from the first buffer 120a.
- the first mode is the leak model mode.
- in the leak model mode, the data extraction rate from the first buffer 120a to the second buffer 160a is, for example, a value obtained by multiplying the upper limit value of the bit rate in the profile or level defined in the video or audio encoding method stored in the MPU by a coefficient. As a result, consistency with the MPEG-2 TS system target decoder can be maintained.
- the second mode is a vbv_delay or HRD (Hypothetical Reference Decoder) model mode.
- video data can be extracted at a rate defined by the vbv_delay model in MPEG-2 video, or the HRD model in MPEG-4 AVC or HEVC.
- the rearrangement unit 150 stops extraction when the buffer occupancy of the second buffer 160a is higher than a predetermined value, for example, 100%. The rearrangement unit 150 also stops extraction when the buffer occupancy of the first buffer 120a is lower than a predetermined value, for example, when it is empty (0%).
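The leak-model extraction rule above can be sketched as a single transfer step (a minimal illustration; the byte counts and the per-step rate are illustrative assumptions, not values from the specification):

```python
def leak_step(first_buf, second_buf, second_cap, rate):
    """Sketch of one leak-model step: move up to `rate` bytes from the
    first buffer (MMTB) to the second buffer, stopping when the second
    buffer is full or the first buffer is empty.
    Returns the updated (first_buf, second_buf) occupancies."""
    if first_buf == 0 or second_buf >= second_cap:
        return first_buf, second_buf  # extraction stopped
    moved = min(rate, first_buf, second_cap - second_buf)
    return first_buf - moved, second_buf + moved

first_buf, second_buf = 5000, 900
first_buf, second_buf = leak_step(first_buf, second_buf,
                                  second_cap=1000, rate=400)
```

Only 100 bytes move in this step because the second buffer reaches its capacity, after which extraction stalls until the decoder drains it, matching the stop conditions in the text.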
- which of the two modes is used, and the extraction rate, can be indicated by information that can be acquired before decoding of the MMT content starts, such as MMT message information.
- when the MMT packet is transmitted by a TS, it can be indicated by an MPEG-2 system descriptor.
- the size of the first buffer 120a is, for example, a value obtained by multiplying the upper limit value of the size of the encoded picture buffer in the profile or level defined in the encoding method of the video data or audio data stored in the MPU by a coefficient.
- the coefficient is determined based on the ratio of MPU header information (moof box) to the sum of access unit sizes in the MPU. Note that the size of the first buffer 120a corresponding to the audio data can be separately defined according to the number of channels and the like.
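The sizing rule above can be sketched as follows; the choice of coefficient form (1 plus the moof-to-access-unit size ratio) and all byte counts are illustrative assumptions for this sketch:

```python
def mmtb_size(cpb_limit, moof_bytes, au_bytes):
    """Sketch: size the first buffer (MMTB) as the coded picture
    buffer upper limit of the codec profile/level, scaled by a
    coefficient reflecting MPU header (moof) overhead relative to
    the sum of access unit sizes in the MPU."""
    coefficient = 1 + moof_bytes / au_bytes  # header overhead ratio
    return round(cpb_limit * coefficient)

# e.g. a 1,000,000-byte CPB limit with 5% moof overhead:
size = mmtb_size(1_000_000, moof_bytes=50_000, au_bytes=1_000_000)
```

The coefficient grows the buffer just enough to hold the MP4 header data that the CPB limit alone does not account for.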
- the size of the first buffer 120a can be indicated by information that can be acquired before decoding of the MMT content starts, such as MMT message information.
- when the MMT packet is transmitted by a TS, it can be indicated by an MPEG-2 system descriptor.
- a different buffer size may be set in each of the three operation modes (MPU mode, Fragment mode, Media Unit mode) described above.
- a different buffer size may be set as the size of the first buffer 120a depending on whether the decoding processing unit is an access unit unit or a sub-sample unit. For example, the size of the first buffer 120a may be smaller when decoding in units of subsamples than when decoding in units of access units.
- a third buffer may be provided between the first buffer 120a and the second buffer 160a.
- the third buffer is, for example, an MB (Multiplexing Buffer).
- the rearrangement unit 150 controls the extraction of data from the third buffer to the second buffer 160a. As a result, it is possible to maintain consistency with the MPEG-2 TS system target decoder.
- FIG. 10 is a block diagram showing a configuration of data transmission apparatus 400 according to the present embodiment.
- the data transmission device 400 includes a flag generation unit 410, an encoding unit 420, a multiplexing unit 430, and a transmission unit 440.
- the flag generation unit 410 generates a flag indicating whether or not it is necessary to rearrange the plurality of packets constituting the plurality of encoded streams included in the encoded data. For example, the flag generation unit 410 generates the flag according to the number of transmission paths used for transmission of the assets of the MMT package. Specifically, when data of the same asset is transmitted through a single transmission path, rearrangement is not necessary, and therefore the flag generation unit 410 generates a flag indicating that rearrangement is unnecessary. Conversely, when data of the same asset is transmitted through a plurality of transmission paths, rearrangement is necessary, and therefore the flag generation unit 410 generates a flag indicating that rearrangement is necessary.
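The rule above reduces to a single condition on the number of transmission paths; the boolean representation of the flag is an assumption of this sketch (the actual flag encoding is not specified here):

```python
def rearrangement_flag(num_paths_for_asset):
    """Sketch of the flag generation unit's rule: rearrangement on
    the receiving side is needed only when data of the same asset is
    split across more than one transmission path."""
    return num_paths_for_asset > 1  # True: rearrangement necessary

flag_single = rearrangement_flag(1)  # single path: reception order = decoding order
flag_multi = rearrangement_flag(2)   # e.g. broadcasting + communication
```

On the receiving side, the rearrangement unit 150 would skip reordering whenever this flag indicates it is unnecessary.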
- the flag generation unit 410 may generate a mode flag for selecting an operation mode of the rearrangement unit 150 of the data decoding device 100. Further, the flag generation unit 410 may generate information for determining the size of a buffer provided in the data decoding device 100 and the like.
- the encoding unit 420 generates encoded data by encoding content including video data and audio data based on a moving image encoding standard such as HEVC.
- the multiplexing unit 430 generates a plurality of encoded streams by multiplexing the encoded data for each packet. For example, the multiplexing unit 430 converts the encoded data into MP4 data according to the MP4 file format. Then, the multiplexing unit 430 generates an MMT packet by packetizing the MP4 data. Furthermore, the multiplexing unit 430 generates a plurality of MMT streams by multiplexing a plurality of MMT packets.
- the number of MMT streams to be generated depends on the number of transmission paths used by the transmission unit 440 for transmission.
- the multiplexing unit 430 generates the encoded streams so as to correspond one-to-one with the plurality of transmission paths.
- the multiplexing unit 430 stores the flag generated by the flag generation unit 410 in the header of the encoded stream.
- the multiplexing unit 430 stores a flag in configuration information or an MMT message.
- the transmission unit 440 transmits the corresponding encoded stream in units of packets using each of the plurality of transmission paths, and transmits the flag using at least one of the plurality of transmission paths. Specifically, the transmission unit 440 associates the transmission paths with the encoded streams generated by the multiplexing unit 430 on a one-to-one basis and transmits each encoded stream using its corresponding transmission path.
- the transmission unit 440 may separately transmit configuration information such as a flag before the MMT stream.
- the transmission unit 440 transmits configuration information including a flag using at least one of a plurality of transmission paths.
- the multiplexing unit 430 constructs the MPU or Fragment so that its size is at least equal to or smaller than the buffer size. Specifically, the multiplexing unit 430 constructs the MPU so that the maximum size of the MPU does not exceed the total size of the first buffers 120a (MMTB).
- the multiplexing unit 430 may limit multiplexing so as not to multiplex a plurality of assets on the same transmission path. That is, the multiplexing unit 430 may transmit one asset through only one transmission path.
- the unit of packetization may be limited to MPU only. That is, packetization using MFU (Movie Fragment Unit), described later, may be prohibited.
- MFU packetization may be prohibited in the profile.
- the profile information may be signaled in the configuration information or an MMT message. Alternatively, information indicating that the unit of packetization is restricted may be signaled.
- the multiplexing unit 430 may make the unit of MPU the same for each asset constituting the MMT package. For example, when the MMT package includes a video asset and an audio asset, the playback time lengths of their MPUs are made the same. Specifically, the multiplexing unit 430 may determine the unit of MPU so that the DTS of the access unit at the head in the display (playback) order in the i-th MPU of audio is immediately before or after the DTS of the access unit at the head in the display order in the i-th MPU of video.
- FIG. 11 is a flowchart showing an example of the operation of the data transmission device 400 according to the present embodiment.
- the encoding unit 420 encodes content including video data and the like based on a predetermined moving image encoding standard (S210).
- the flag generation unit 410 generates a flag indicating whether or not rearrangement is necessary (S220). As described above, the flag generation unit 410 may generate other information such as a mode flag.
- the multiplexing unit 430 generates a plurality of encoded streams by multiplexing the flag and the encoded data for each packet (S230). Then, the transmission unit 440 transmits a corresponding encoded stream in units of packets using a plurality of transmission paths (S240). If the flag is not included in the encoded stream, the transmission unit 440 transmits the flag using at least one of the plurality of transmission paths.
- the data transmitting apparatus 400 may transmit information for controlling the behavior of the system decoder according to information unique to the transmission path, or information for instructing the behavior of the system decoder.
- the information unique to the transmission path is, for example, end-to-end delay or jitter.
- the data decoding apparatus 100 may further include a buffer necessary for compensating for these.
- the data transmission device 400 transmits, for example, delay or jitter information as initialization information such as configuration information.
- the data transmission device 400 may transmit information such as delay as auxiliary information that can be transmitted as an MMT stream such as an MMT message.
- the data transmission device 400 may periodically transmit information such as a delay.
- the data transmission device 400 may determine and transmit a necessary buffering amount at the start of decoding.
- the data decoding device 100 acquires information such as the delay, jitter, or buffering amount transmitted from the data transmission device 400, and can determine the buffer size and the buffering amount at the start of decoding based on the acquired information.
- the end-to-end delay in broadcasting is constant and there is no jitter.
- the header of the MMT packet includes transmission time information of the first bit of the packet. Assume that the data decoding apparatus 100 receives a packet transmitted at time S1 at time T1 in broadcasting and at time T2 in communication.
- the difference between time T2 and time T1 indicates the delay time of communication relative to broadcasting.
- when the data arrives earlier in communication than in broadcasting, the delay time is a negative value.
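The relative delay measurement above can be sketched directly; the time values are illustrative:

```python
def communication_delay(t_broadcast, t_communication):
    """Sketch: a packet whose header records transmission time S1 is
    received at t_broadcast (T1) over broadcasting and at
    t_communication (T2) over communication; T2 - T1 is the delay of
    communication relative to broadcasting (negative if communication
    arrives first)."""
    return t_communication - t_broadcast

delay_late = communication_delay(100, 130)  # communication 30 behind broadcast
delay_early = communication_delay(100, 95)  # communication 5 ahead of broadcast
```

A receiver could use this measured delay, together with the discard criteria described next, to decide whether to wait for or drop packets from the slower path.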
- the data transmission apparatus 400 may include, in initialization information or an MMT message, information indicating whether to discard packets of a transmission path with a large delay, or information (such as a delay time) for determining whether to discard them, and transmit it. For example, when the delay time is set to 0, data received by broadcasting can be decoded without the decoding time being delayed by transmission delay or jitter in communication.
- end-to-end delay and jitter of the transmission path may be compensated by providing a separate buffer in the previous stage of the data decoding apparatus 100.
- the encoded data includes a plurality of encoded streams, and the encoded streams transmitted using each of the plurality of transmission paths are separated packet by packet.
- the plurality of packets stored in the first buffer are rearranged in the decoding order and decoded, so that the packet can be appropriately decoded. For example, by rearranging packets in order of decoding, packets can be decoded at an appropriate timing, so that occurrence of buffer overflow or underflow can be reduced.
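- the rearrangement described above can be sketched as follows (a simplified model assuming each received packet carries an explicit decoding-order index; in MMT this order is derived from packet header fields such as sequence numbers):

```python
import heapq

# Minimal model of the first buffer plus the rearrangement unit: packets
# from several transmission paths arrive out of order, are held in the
# buffer, and are handed to the decoder in decoding order.

def rearrange(packets):
    """Yield payloads sorted by their decoding-order index."""
    heap = [(p["decode_order"], p["payload"]) for p in packets]
    heapq.heapify(heap)  # the first buffer, keyed by decoding order
    while heap:
        _, payload = heapq.heappop(heap)
        yield payload

received = [  # interleaved arrival from broadcasting and communication
    {"decode_order": 2, "payload": "P2"},
    {"decode_order": 0, "payload": "P0"},
    {"decode_order": 1, "payload": "P1"},
]
print(list(rearrange(received)))  # ['P0', 'P1', 'P2']
```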
- Modification 1: In the above-described embodiment, the case where the MMT packet is packetized in units of MPUs has been described; in this modification, the case where the MMT packet is packetized in units of divisions obtained by dividing the MPU is described. Specifically, the MMT packet according to this modification is packetized in units of MFUs, that is, fragments of the MPU (MPU Fragments).
- FIG. 12 is a diagram showing an example of the data structure of the MMT stream according to this modification.
- each MPU is fragmented into three MFUs. Then, by interleaving a plurality of MFUs corresponding to a plurality of assets, MFUs corresponding to different assets are multiplexed into one MMT stream. For example, as shown in FIG. 12, an MFU 21 corresponding to audio data is arranged next to the MFU 11 corresponding to video data.
- the delay time can be shortened compared to interleaving in MPU units.
- in the Fragment mode and the Media Unit mode shown in FIG. 9, the delay on the multiplexing side, that is, the data transmission side, can be reduced, which is preferable.
- multiplexing in Fragment units can reduce the delay time required for multiplexing on the transmission side and the delay time required for decoding on the reception side.
- the encoded data can be packetized in units of MPU or MFU.
- it is preferable that the unit of packetization be the same for each transmission path. This is because the delay time required for packetization on the transmission side and the delay required for decoding on the reception side depend on the unit of packetization.
- for example, when packetizing in units of samples (in units of access units) on the broadcasting side, the communication side also packetizes in units of samples.
- however, when the unit of packetization is larger than the MTU (Maximum Transmission Unit) of the lower transmission layer, the packetization unit may be divided. That is, the unit of packetization may differ for each transmission path.
- the unit of packetization is MPU, sample (access unit), subsample (NAL unit), or the like.
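- the MTU-based division mentioned above can be sketched as follows (a generic fragmentation helper; the actual fragment header syntax is defined by the transport protocol in use):

```python
# Sketch of dividing a packetization unit that exceeds the MTU of the
# lower transmission layer into MTU-sized fragments.

def fragment(payload, mtu):
    """Split a payload into chunks no larger than mtu bytes."""
    return [payload[i:i + mtu] for i in range(0, len(payload), mtu)]

unit = bytes(3000)           # e.g. one sample larger than a 1500-byte MTU
parts = fragment(unit, 1500)
print([len(p) for p in parts])  # [1500, 1500]
```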
- the decoding order acquisition unit 130 of the data decoding device 100 determines the decoding order of the access units or subsamples stored in the MFU by using a parameter, included in the header of the MMT packet, that indicates the decoding order of the MPU Fragment within the MPU.
- for example, the parameter indicating the decoding order of the MPU Fragment is the index number of the movie fragment to which the MFU belongs, the index number of the sample in the movie fragment, or offset information indicating the storage position within the sample.
- the decoding order of access units and subsamples can be obtained from packet header information and the like even when a protocol such as RTP is used.
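- as a sketch of this ordering rule (the field names are illustrative, not the normative MMT header syntax), the decoding order of MFUs can be obtained by comparing the three parameters lexicographically:

```python
# Illustrative sketch: the decoding order of data stored in an MFU is
# determined by (movie-fragment index, sample index, offset in sample),
# compared lexicographically. Field names here are hypothetical; the
# actual MMT packet header syntax defines the real fields.

def decoding_order_key(mfu_header):
    return (
        mfu_header["movie_fragment_index"],  # fragment within the MPU
        mfu_header["sample_index"],          # sample within the fragment
        mfu_header["offset"],                # position within the sample
    )

mfus = [
    {"movie_fragment_index": 0, "sample_index": 1, "offset": 0,   "id": "B"},
    {"movie_fragment_index": 0, "sample_index": 0, "offset": 128, "id": "A2"},
    {"movie_fragment_index": 0, "sample_index": 0, "offset": 0,   "id": "A1"},
]
ordered = sorted(mfus, key=decoding_order_key)
print([m["id"] for m in ordered])  # ['A1', 'A2', 'B']
```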
- Modification 2: In the above embodiment, an example in which data is output from the first buffer 120a to the second buffer 160a has been described. In the present modification, however, the second buffer 160a is not provided, and the decoding unit 170 directly decodes the data output from the first buffer 120a.
- FIG. 13 is a diagram illustrating an example of the data flow of the MMT packet according to the present modification.
- the data decoding apparatus 500 does not include the second buffer 160a.
- the access unit output from the MMTBm 121 is directly decoded by the image decoding unit 171.
- the access unit output from the MMTBn 122 is directly decoded by the audio decoding unit 172.
- the data decoding device 500 according to this modification is a simplified version of the data decoding device 100 according to the first embodiment, that is, a simplified system decoder. Specifically, in the data decoding apparatus 500 according to the present modification, only the minimum buffer size required for synchronized playback between assets constituting the MMT package is defined.
- the size of the first buffer 120a is defined according to the level in the moving picture coding system.
- the MPU or Fragment size is set to be equal to or smaller than the buffer size, respectively.
- the decoding unit 170 starts decoding.
- the time acquisition unit 140 acquires the initial buffer occupancy necessary at the start of decoding of each asset from the encoded stream of audio or video or auxiliary information transmitted separately.
- the rearrangement unit 150 extracts data from the first buffer 120a to the decoding unit 170 at the time when data equal to or greater than the initial buffer occupancy has been stored in the first buffers 120a corresponding to all assets. Then, the decoding unit 170 decodes the input data. Note that the rearrangement unit 150 may instead extract data from the buffer to the decoding unit 170 when the buffer of any asset becomes full (occupancy is 100%).
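- the two start conditions above can be sketched as follows (a simplified model with per-asset occupancy values in bytes; the names are illustrative):

```python
# Sketch of the decode-start conditions described above: decoding may
# begin either when every asset's first buffer holds at least its initial
# buffer occupancy, or when any asset's buffer is completely full.

def can_start_decoding(buffers):
    """buffers: per-asset dicts with 'occupancy', 'initial', 'size' (bytes)."""
    all_reached = all(b["occupancy"] >= b["initial"] for b in buffers)
    any_full = any(b["occupancy"] >= b["size"] for b in buffers)
    return all_reached or any_full

assets = [
    {"occupancy": 600, "initial": 500, "size": 1000},  # video buffer
    {"occupancy": 200, "initial": 300, "size": 400},   # audio buffer
]
print(can_start_decoding(assets))  # False: audio below initial occupancy
assets[1]["occupancy"] = 400       # audio buffer becomes full
print(can_start_decoding(assets))  # True: any-buffer-full condition met
```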
- the system decoder simple method according to this modification can be applied at least when underflow does not occur.
- the simple method can be applied when the transmission rate of the MMT package is sufficiently larger than the sum of the bit rates of the assets constituting the MMT package.
- the simple method can be applied to a case where the MMT package stored in a recording medium such as a hard disk or an optical disk is reproduced while being continuously read at a sufficiently high transfer rate.
- the simplified system decoder according to this modification can also be applied when the transmission rate or jitter of the transmission path varies greatly and it is difficult to apply a system decoder whose operation is defined under a constant transmission rate or jitter.
- Modification 3: In the above embodiment, an example in which MMT packets are transmitted by MMT in both broadcasting and communication has been described. In this modification, it is assumed that, in the case of broadcasting, MMT packets are transmitted using MPEG-2 TS.
- FIG. 14 is a diagram illustrating an example of the data flow of the MMT packet according to the present modification.
- the data decoding device 600 (system decoder) according to the present modification is newly provided with a filter unit 610 and an input buffer 615, compared to the data decoding device 100 shown in FIG.
- the filter unit 610 filters TS packets that transmit the MMT package based on the packet identifier (PID) of the transport stream. Specifically, the filter unit 610 acquires the PID by analyzing the header of the TS packet included in the transport stream. Then, the filter unit 610 stores the TS packet corresponding to the PID indicating the MMT stream in the input buffer 615. At this time, the input buffer 615 stores the payload of the TS packet.
- the input buffer 615 separates the data of the MMT package from the stored payload and outputs it to the filter unit 110. Since the processing after the filter unit 110 is the same as that of the first embodiment, the description thereof is omitted.
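- the PID filtering performed by the filter unit 610 can be sketched as follows (simplified: fixed 188-byte TS packets without adaptation fields are assumed; a real demultiplexer must also honor sync bytes, adaptation_field_control, and continuity counters):

```python
# Sketch of the filter unit 610: select TS packets whose PID matches the
# MMT stream and collect their payloads into the input buffer.

TS_PACKET_SIZE = 188

def extract_mmt_payloads(ts_bytes, mmt_pid):
    """Collect payloads of TS packets carrying the given PID."""
    payloads = []
    for i in range(0, len(ts_bytes), TS_PACKET_SIZE):
        packet = ts_bytes[i:i + TS_PACKET_SIZE]
        # PID = low 5 bits of byte 1 plus all of byte 2 (13 bits total)
        pid = ((packet[1] & 0x1F) << 8) | packet[2]
        if pid == mmt_pid:
            payloads.append(packet[4:])  # skip the 4-byte TS header
    return b"".join(payloads)

# Build two toy TS packets: one with PID 0x0100 (the MMT stream here, an
# arbitrary choice for illustration) and one with another PID.
def ts_packet(pid, fill):
    header = bytes([0x47, (pid >> 8) & 0x1F, pid & 0xFF, 0x10])
    return header + bytes([fill]) * (TS_PACKET_SIZE - 4)

stream = ts_packet(0x0100, 0xAA) + ts_packet(0x0101, 0xBB)
print(len(extract_mmt_payloads(stream, 0x0100)))  # 184
```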
- the data decoding apparatus 600 can use the PCR (Program Clock Reference) of the MPEG-2 system as a reference clock instead of the NTP-based reference clock used in MMT. Furthermore, the data decoding apparatus 600 can resynchronize based on the PCR of TS packets having the PCR_PID.
- FIG. 15 is a flowchart showing an example of the data decoding method according to this modification.
- a plurality of encoded streams included in the encoded data and transmitted using each of the plurality of transmission paths is received for each packet (S310).
- the plurality of packets of the plurality of received encoded streams are stored in the first buffer (S320).
- the plurality of packets stored in the first buffer are rearranged in the decoding order (S330).
- the plurality of packets rearranged in the decoding order are decoded (S340).
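- steps S310 to S340 can be sketched end to end as follows (the decoding step is stubbed for illustration; the names are hypothetical):

```python
# End-to-end sketch of steps S310-S340: receive packets from several
# transmission paths (S310), store them in a first buffer (S320),
# rearrange them into decoding order (S330), and decode (S340).

def decode_method(paths):
    first_buffer = []                       # S320: store received packets
    for path in paths:                      # S310: receive packet by packet
        for packet in path:
            first_buffer.append(packet)
    first_buffer.sort(key=lambda p: p["decode_order"])   # S330: rearrange
    return [p["payload"].lower() for p in first_buffer]  # S340: stub decode

broadcast = [{"decode_order": 0, "payload": "I-FRAME"}]
communication = [{"decode_order": 2, "payload": "B-FRAME"},
                 {"decode_order": 1, "payload": "P-FRAME"}]
print(decode_method([broadcast, communication]))
# ['i-frame', 'p-frame', 'b-frame']
```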
- the data transmission method according to this modification may include a plurality of steps as shown in FIG. 16.
- FIG. 16 is a flowchart illustrating an example of a data transmission method according to this modification.
- a flag indicating whether or not it is necessary to rearrange a plurality of packets constituting a plurality of encoded streams included in the encoded data is generated (S410).
- a corresponding encoded stream is transmitted in units of packets using each of the plurality of transmission paths, and a flag is transmitted using at least one of the plurality of transmission paths (S420).
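- steps S410 and S420 can be sketched as follows (the condition used to set the flag and the container structure are illustrative assumptions, not the normative signaling):

```python
# Sketch of steps S410-S420: the sender generates a flag indicating
# whether the receiver needs to rearrange packets (here assumed to be
# the case whenever more than one transmission path is used, since
# relative path delays can reorder packets at the receiver), then sends
# each stream over its path along with the flag.

def build_transmission(streams_by_path):
    # S410: generate the rearrangement flag.
    rearrangement_flag = len(streams_by_path) > 1
    # S420: attach the flag to the transmitted data (illustrative container).
    return {"flag": rearrangement_flag, "paths": streams_by_path}

tx = build_transmission({"broadcast": ["pkt0", "pkt2"],
                         "communication": ["pkt1"]})
print(tx["flag"])  # True
```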
- Embodiment 2: The processing described in each of the above embodiments can be implemented easily in an independent computer system by recording, on a storage medium, a program for realizing the configuration of the data transmission method (moving picture encoding method (image encoding method)) or the data decoding method (moving picture decoding method (image decoding method)) described in each of the above embodiments.
- the storage medium may be any medium that can record a program, such as a magnetic disk, an optical disk, a magneto-optical disk, an IC card, and a semiconductor memory.
- the system includes an image encoding / decoding apparatus composed of a data transmission device (image encoding device) using the data transmission method (image encoding method) and a data decoding device (image decoding device) using the data decoding method (image decoding method).
- FIG. 17 is a diagram showing an overall configuration of a content supply system ex100 that realizes a content distribution service.
- a communication service providing area is divided into desired sizes, and base stations ex106, ex107, ex108, ex109, and ex110, which are fixed wireless stations, are installed in each cell.
- in the content supply system ex100, devices such as a computer ex111, a PDA (Personal Digital Assistant) ex112, a camera ex113, a mobile phone ex114, and a game machine ex115 are connected to the Internet ex101 via an Internet service provider ex102, a telephone network ex104, and the base stations ex106 to ex110.
- each device may be directly connected to the telephone network ex104 without going through the base stations ex106 to ex110, which are fixed wireless stations.
- the devices may be directly connected to each other via short-range wireless or the like.
- the camera ex113 is a device that can shoot moving images such as a digital video camera
- the camera ex116 is a device that can shoot still images and movies such as a digital camera.
- the mobile phone ex114 may be a mobile phone of any system, such as a GSM (registered trademark) (Global System for Mobile Communications) system, a CDMA (Code Division Multiple Access) system, a W-CDMA (Wideband-Code Division Multiple Access) system, an LTE (Long Term Evolution) system, or an HSPA (High Speed Packet Access) system, or a PHS (Personal Handyphone System) or the like.
- the camera ex113 and the like are connected to the streaming server ex103 through the base station ex109 and the telephone network ex104, thereby enabling live distribution and the like.
- in live distribution, content shot by the user using the camera ex113 (for example, live music video) is encoded as described in each of the above embodiments (that is, the camera functions as an image encoding device according to one aspect of the present disclosure) and transmitted to the streaming server ex103.
- the streaming server ex103 streams the transmitted content data to clients that request it. Examples of the client include the computer ex111, the PDA ex112, the camera ex113, the mobile phone ex114, and the game machine ex115, which can decode the encoded data.
- Each device that has received the distributed data decodes and reproduces the received data (that is, functions as an image decoding device according to one aspect of the present disclosure).
- the captured data may be encoded by the camera ex113 or by the streaming server ex103 that performs the data transmission processing, or the encoding processing may be shared between them.
- similarly, the decoding processing of the distributed data may be performed by the client or by the streaming server ex103, or may be shared between them.
- still images and / or moving image data captured by the camera ex116 may be transmitted to the streaming server ex103 via the computer ex111.
- the encoding process in this case may be performed by any of the camera ex116, the computer ex111, and the streaming server ex103, or may be performed in a shared manner.
- these encoding / decoding processes are generally performed in the computer ex111 and the LSI ex500 included in each device.
- the LSI ex500 may be configured as a single chip or a plurality of chips.
- alternatively, moving image encoding / decoding software may be incorporated into some kind of recording medium (a CD-ROM, flexible disk, hard disk, or the like) readable by the computer ex111 or other devices, and the encoding / decoding processing may be performed using that software.
- moving image data acquired by the camera may be transmitted.
- the moving image data at this time is data encoded by the LSI ex500 included in the mobile phone ex114.
- the streaming server ex103 may be a plurality of servers or a plurality of computers, and may process, record, and distribute data in a distributed manner.
- the encoded data can be received and reproduced by the client.
- in this way, the information transmitted by the user can be received, decoded, and reproduced by the client in real time, so that personal broadcasting can be realized even by a user who has no special rights or facilities.
- the digital broadcasting system ex200 can also incorporate at least the moving picture encoding device (image encoding device) or the moving picture decoding device (image decoding device) of each of the above embodiments.
- specifically, in the broadcasting station ex201, multiplexed data obtained by multiplexing music data and the like onto video data is transmitted to a communication or broadcasting satellite ex202 via radio waves.
- This video data is data encoded by the moving image encoding method described in the above embodiments (that is, data encoded by the image encoding apparatus according to one aspect of the present disclosure).
- the broadcasting satellite ex202 transmits a radio wave for broadcasting, and this radio wave is received by a home antenna ex204 capable of receiving satellite broadcasting.
- the received multiplexed data is decoded and reproduced by an apparatus such as the television (receiver) ex300 or the set top box (STB) ex217 (that is, functions as an image decoding apparatus according to one embodiment of the present disclosure).
- the moving picture decoding apparatus or moving picture encoding apparatus described in each of the above embodiments can also be mounted in a reader / recorder ex218 that reads and decodes multiplexed data recorded on a recording medium ex215 such as a DVD or a BD, or that encodes a video signal onto the recording medium ex215 and, in some cases, multiplexes it with a music signal and writes the result. In this case, the reproduced video signal is displayed on the monitor ex219, and the video signal can be reproduced by another device or system using the recording medium ex215 on which the multiplexed data is recorded.
- a moving picture decoding apparatus may be mounted in a set-top box ex217 connected to a cable ex203 for cable television or an antenna ex204 for satellite / terrestrial broadcasting and displayed on the monitor ex219 of the television.
- the moving picture decoding apparatus may be incorporated in the television instead of the set top box.
- FIG. 19 is a diagram illustrating a television (receiver) ex300 that uses the video decoding method and the video encoding method described in each of the above embodiments.
- the television ex300 includes: a tuner that obtains or outputs, via the antenna ex204 or the cable ex203 that receives the broadcast, multiplexed data in which audio data is multiplexed with video data; a modulation / demodulation unit ex302 that demodulates the received multiplexed data or modulates multiplexed data to be transmitted to the outside; and a multiplexing / demultiplexing unit ex303 that demultiplexes the demodulated multiplexed data into video data and audio data, or multiplexes video data and audio data encoded by the signal processing unit ex306.
- the television ex300 further includes: a signal processing unit ex306 having an audio signal processing unit ex304 and a video signal processing unit ex305 that decode audio data and video data or encode the respective information (the video signal processing unit ex305 functions as the image encoding device or the image decoding device according to one embodiment of the present disclosure); and an output unit ex309 having a speaker ex307 that outputs the decoded audio signal and a display unit ex308, such as a display, that displays the decoded video signal.
- the television ex300 includes an interface unit ex317 including an operation input unit ex312 that receives an input of a user operation.
- the television ex300 includes a control unit ex310 that performs overall control of each unit, and a power supply circuit unit ex311 that supplies power to each unit.
- in addition to the operation input unit ex312, the interface unit ex317 may include a bridge unit ex313 connected to an external device such as the reader / recorder ex218, a slot for attaching a recording medium ex216 such as an SD card, a driver ex315 for connecting to an external recording medium such as a hard disk, a modem ex316 for connecting to a telephone network, and the like.
- the recording medium ex216 can electrically record information by means of a nonvolatile / volatile semiconductor memory element stored therein.
- Each part of the television ex300 is connected to each other via a synchronous bus.
- the television ex300 receives a user operation from the remote controller ex220 or the like, and demultiplexes the multiplexed data demodulated by the modulation / demodulation unit ex302 by the multiplexing / demultiplexing unit ex303 based on the control of the control unit ex310 having a CPU or the like. Furthermore, in the television ex300, the separated audio data is decoded by the audio signal processing unit ex304, and the separated video data is decoded by the video signal processing unit ex305 using the decoding method described in each of the above embodiments.
- the decoded audio signal and video signal are output from the output unit ex309 to the outside. At the time of output, these signals may be temporarily stored in the buffers ex318, ex319, etc. so that the audio signal and the video signal are reproduced in synchronization. Also, the television ex300 may read multiplexed data from recording media ex215 and ex216 such as a magnetic / optical disk and an SD card, not from broadcasting. Next, a configuration in which the television ex300 encodes an audio signal or a video signal and transmits the signal to the outside or to a recording medium will be described.
- the television ex300 receives a user operation from the remote controller ex220 or the like, and, based on the control of the control unit ex310, encodes an audio signal with the audio signal processing unit ex304 and encodes a video signal with the video signal processing unit ex305 using the encoding method described in each of the above embodiments.
- the encoded audio signal and video signal are multiplexed by the multiplexing / demultiplexing unit ex303 and output to the outside. When multiplexing, these signals may be temporarily stored in the buffers ex320, ex321, etc. so that the audio signal and the video signal are synchronized.
- a plurality of buffers ex318, ex319, ex320, and ex321 may be provided as illustrated, or one or more buffers may be shared. Further, in addition to the illustrated examples, data may also be stored in a buffer, for example between the modulation / demodulation unit ex302 and the multiplexing / demultiplexing unit ex303, as a cushion that prevents system overflow and underflow.
- the television ex300 may also have a configuration for receiving AV input from a microphone and a camera, and may perform encoding processing on the data acquired from them.
- although the television ex300 has been described as being capable of the above-described encoding processing, multiplexing, and external output, it may instead be configured to perform only the above-described reception, decoding processing, and external output, without these processings.
- the decoding processing or the encoding processing may be performed by either the television ex300 or the reader / recorder ex218, or the television ex300 and the reader / recorder ex218 may share the processing with each other.
- FIG. 20 shows a configuration of the information reproducing / recording unit ex400 when data is read from or written to an optical disk.
- the information reproducing / recording unit ex400 includes elements ex401, ex402, ex403, ex404, ex405, ex406, and ex407 described below.
- the optical head ex401 irradiates a laser spot on the recording surface of the recording medium ex215 that is an optical disk to write information, and detects information reflected from the recording surface of the recording medium ex215 to read the information.
- the modulation recording unit ex402 electrically drives a semiconductor laser built in the optical head ex401 and modulates the laser beam according to the recording data.
- the reproduction demodulation unit ex403 amplifies a reproduction signal obtained by electrically detecting, with a photodetector built into the optical head ex401, the light reflected from the recording surface, separates and demodulates the signal component recorded on the recording medium ex215, and reproduces the necessary information.
- the buffer ex404 temporarily holds information to be recorded on the recording medium ex215 and information reproduced from the recording medium ex215.
- the disk motor ex405 rotates the recording medium ex215.
- the servo control unit ex406 moves the optical head ex401 to a predetermined information track while controlling the rotational drive of the disk motor ex405, and performs a laser spot tracking process.
- the system control unit ex407 controls the entire information reproduction / recording unit ex400.
- for example, the system control unit ex407 uses various types of information held in the buffer ex404, generates and adds new information as necessary, and records and reproduces information through the optical head ex401 while operating the modulation recording unit ex402, the reproduction demodulation unit ex403, and the servo control unit ex406 in a coordinated manner.
- the system control unit ex407 includes, for example, a microprocessor, and executes these processes by executing a read / write program.
- although the optical head ex401 has been described as irradiating a laser spot, it may be configured to perform higher-density recording using near-field light.
- FIG. 21 shows a schematic diagram of a recording medium ex215 that is an optical disk.
- guide grooves (grooves) are formed in a spiral shape on the recording surface of the recording medium ex215.
- address information indicating the absolute position on the disc is recorded in advance on the information track ex230 by changing the shape of the groove.
- This address information includes information for specifying the position of the recording block ex231 that is a unit for recording data, and the recording block is specified by reproducing the information track ex230 and reading the address information in a recording or reproducing apparatus.
- the recording medium ex215 includes a data recording area ex233, an inner peripheral area ex232, and an outer peripheral area ex234.
- the area used for recording user data is the data recording area ex233; the inner peripheral area ex232 and the outer peripheral area ex234, arranged on the inner or outer circumference of the data recording area ex233, are used for specific purposes other than recording user data.
- the information reproducing / recording unit ex400 reads / writes encoded audio data, video data, or multiplexed data obtained by multiplexing these data with respect to the data recording area ex233 of the recording medium ex215.
- although an optical disk such as a single-layer DVD or BD has been described as an example, the recording medium is not limited to such disks; it may be an optical disc with a multi-dimensional recording / reproducing structure, such as one that records information using light of different wavelengths at the same place on the disc or that records layers of different information from various angles.
- the car ex210 having the antenna ex205 can receive data from the satellite ex202 and the like, and the moving image can be reproduced on a display device such as the car navigation ex211 that the car ex210 has.
- the configuration of the car navigation ex211 may be, for example, a configuration in which a GPS receiving unit is added in the configuration illustrated in FIG. 19, and the same may be considered for the computer ex111, the mobile phone ex114, and the like.
- FIG. 22A is a diagram showing the mobile phone ex114 using the moving picture decoding method and the moving picture encoding method described in the above embodiment.
- the mobile phone ex114 includes an antenna ex350 for transmitting and receiving radio waves to and from the base station ex110, a camera unit ex365 capable of capturing video and still images, and a display unit ex358 such as a liquid crystal display for displaying data obtained by decoding the video captured by the camera unit ex365, the video received by the antenna ex350, and the like.
- the mobile phone ex114 further includes: a main body unit having an operation key unit ex366; an audio output unit ex357 such as a speaker for outputting audio; an audio input unit ex356 such as a microphone for inputting audio; a memory unit ex367 for storing encoded data or decoded data such as captured video, still images, recorded audio, received video, still images, and mail; and a slot unit ex364 serving as an interface with a recording medium that stores data in the same manner.
- in the mobile phone ex114, a power supply circuit unit ex361, an operation input control unit ex362, a video signal processing unit ex355, a camera interface unit ex363, an LCD (Liquid Crystal Display) control unit ex359, a modulation / demodulation unit ex352, a multiplexing / demultiplexing unit ex353, an audio signal processing unit ex354, a slot unit ex364, and a memory unit ex367 are connected to one another via a bus ex370, under a main control unit ex360 that comprehensively controls each unit of the main body including the display unit ex358 and the operation key unit ex366.
- the power supply circuit unit ex361 starts up the mobile phone ex114 in an operable state by supplying power from the battery pack to each unit.
- the cellular phone ex114 converts the audio signal collected by the audio input unit ex356 in the voice call mode into a digital audio signal by the audio signal processing unit ex354 based on the control of the main control unit ex360 having a CPU, a ROM, a RAM, and the like. Then, this is subjected to spectrum spread processing by the modulation / demodulation unit ex352, digital-analog conversion processing and frequency conversion processing are performed by the transmission / reception unit ex351, and then transmitted via the antenna ex350.
- the mobile phone ex114 also amplifies the data received via the antenna ex350 in the voice call mode, performs frequency conversion processing and analog-to-digital conversion processing, performs spectrum despreading processing in the modulation / demodulation unit ex352, converts the result into an analog audio signal in the audio signal processing unit ex354, and then outputs it from the audio output unit ex357.
- further, in the data communication mode, the text data of an e-mail input by operating the operation key unit ex366 of the main body is sent to the main control unit ex360 via the operation input control unit ex362.
- the main control unit ex360 performs spread spectrum processing on the text data in the modulation / demodulation unit ex352, performs digital analog conversion processing and frequency conversion processing in the transmission / reception unit ex351, and then transmits the text data to the base station ex110 via the antenna ex350.
- when an e-mail is received, substantially the reverse processing is performed on the received data, and the result is output to the display unit ex358.
- the video signal processing unit ex355 compresses and encodes the video signal supplied from the camera unit ex365 by the moving image encoding method described in each of the above embodiments (that is, functions as an image encoding apparatus according to an aspect of the present disclosure), and sends the encoded video data to the multiplexing / demultiplexing unit ex353.
- at the same time, the audio signal processing unit ex354 encodes the audio signal picked up by the audio input unit ex356 while the camera unit ex365 captures video, still images, and the like, and sends the encoded audio data to the multiplexing / demultiplexing unit ex353.
- the multiplexing / demultiplexing unit ex353 multiplexes the encoded video data supplied from the video signal processing unit ex355 and the encoded audio data supplied from the audio signal processing unit ex354 by a predetermined method; the resulting multiplexed data is subjected to spread spectrum processing by the modulation / demodulation unit (modulation / demodulation circuit unit) ex352, subjected to digital-to-analog conversion processing and frequency conversion processing by the transmission / reception unit ex351, and then transmitted via the antenna ex350.
- when decoding received multiplexed data, the multiplexing / demultiplexing unit ex353 demultiplexes the multiplexed data into a video data bit stream and an audio data bit stream, supplies the encoded video data to the video signal processing unit ex355 via the synchronization bus ex370, and supplies the encoded audio data to the audio signal processing unit ex354.
- the video signal processing unit ex355 decodes the video signal using a moving picture decoding method corresponding to the moving picture encoding method described in each of the above embodiments (that is, functions as an image decoding apparatus according to an aspect of the present disclosure), and video and still images included in, for example, a moving image file linked to a home page are displayed on the display unit ex358 via the LCD control unit ex359.
- the audio signal processing unit ex354 decodes the audio signal, and the audio is output from the audio output unit ex357.
- like the television ex300, a terminal such as the mobile phone ex114 can take three implementation forms: a transmission / reception terminal having both an encoder and a decoder, a transmission terminal having only an encoder, and a reception terminal having only a decoder.
- in the digital broadcasting system ex200, it has been described that multiplexed data in which music data or the like is multiplexed with video data is received and transmitted; however, the data may be data in which character data related to the video is multiplexed in addition to the audio data, or it may be the video data itself rather than multiplexed data.
- as described above, the moving picture encoding method or the moving picture decoding method shown in each of the above embodiments can be used in any of the above-described devices and systems, and by doing so, the effects described in each of the above embodiments can be obtained.
- the multiplexed data obtained by multiplexing audio data and the like with video data is configured to include identification information indicating which standard the video data conforms to.
- identification information indicating which standard the video data conforms to.
- FIG. 23 is a diagram showing a structure of multiplexed data.
- multiplexed data is obtained by multiplexing one or more of a video stream, an audio stream, a presentation graphics stream (PG), and an interactive graphics stream.
- the video stream indicates the main video and sub-video of the movie
- the audio stream indicates the main audio portion of the movie and the sub-audio to be mixed with the main audio
- the presentation graphics stream indicates the subtitles of the movie.
- the main video indicates a normal video displayed on the screen
- the sub-video is a video displayed on a small screen in the main video.
- the interactive graphics stream indicates an interactive screen created by arranging GUI components on the screen.
- the video stream is encoded by the moving picture encoding method or apparatus shown in each of the above embodiments, or by a moving picture encoding method or apparatus conforming to a conventional standard such as MPEG-2, MPEG4-AVC, or VC-1.
- the audio stream is encoded by a method such as Dolby AC-3, Dolby Digital Plus, MLP, DTS, DTS-HD, or linear PCM.
- Each stream included in the multiplexed data is identified by a PID. For example, 0x1011 is assigned to the video stream used for the movie image, 0x1100 to 0x111F to the audio streams, 0x1200 to 0x121F to the presentation graphics, 0x1400 to 0x141F to the interactive graphics streams, 0x1B00 to 0x1B1F to video streams used for the sub-picture, and 0x1A00 to 0x1A1F to audio streams used for the sub-audio to be mixed with the main audio.
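As an illustrative sketch only (the function name and category labels are not from the specification), the PID assignments listed above can be expressed as a small classifier:

```python
def classify_pid(pid: int) -> str:
    """Classify an elementary stream by the PID ranges listed above."""
    if pid == 0x1011:
        return "main video"
    if 0x1100 <= pid <= 0x111F:
        return "audio"
    if 0x1200 <= pid <= 0x121F:
        return "presentation graphics"
    if 0x1400 <= pid <= 0x141F:
        return "interactive graphics"
    if 0x1B00 <= pid <= 0x1B1F:
        return "sub-picture video"
    if 0x1A00 <= pid <= 0x1A1F:
        return "sub-audio"
    return "other"
```

A demultiplexer would apply such a check to the PID of each incoming packet to route it to the appropriate decoder.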
- FIG. 24 is a diagram schematically showing how multiplexed data is multiplexed.
- a video stream ex235 composed of a plurality of video frames and an audio stream ex238 composed of a plurality of audio frames are converted into PES packet sequences ex236 and ex239, respectively, and converted into TS packets ex237 and ex240.
- the data of the presentation graphics stream ex241 and interactive graphics ex244 are converted into PES packet sequences ex242 and ex245, respectively, and further converted into TS packets ex243 and ex246.
- the multiplexed data ex247 is configured by multiplexing these TS packets into one stream.
- FIG. 25 shows in more detail how the video stream is stored in the PES packet sequence.
- the first row in FIG. 25 shows a video frame sequence of the video stream.
- the second level shows a PES packet sequence.
- the video stream is divided into a plurality of pictures (I pictures, B pictures, and P pictures), which are the Video Presentation Units, and each picture is stored in the payload of a PES packet.
- Each PES packet has a PES header, and a PTS (Presentation Time-Stamp) that is a display time of a picture and a DTS (Decoding Time-Stamp) that is a decoding time of a picture are stored in the PES header.
- FIG. 26 shows the format of the TS packet that is finally written in the multiplexed data.
- the TS packet is a 188-byte fixed-length packet composed of a 4-byte TS header having information such as a PID for identifying a stream and a 184-byte TS payload for storing data.
- the PES packet is divided and stored in the TS payload.
- a 4-byte TP_Extra_Header is further added to each TS packet to form a 192-byte source packet, which is written in the multiplexed data.
- information such as an ATS (Arrival_Time_Stamp) is described in the TP_Extra_Header.
- the ATS indicates the transfer start time of the TS packet to the PID filter of the decoder.
- Source packets are arranged in the multiplexed data as shown in the lower part of FIG. 26, and the number that increments from the head of the multiplexed data is called the SPN (source packet number).
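The 192-byte source packet layout described above can be sketched as follows. Note that placing the ATS in the low 30 bits of the TP_Extra_Header is an assumption of this example, not something stated in the text:

```python
import struct

SOURCE_PACKET_SIZE = 192  # 4-byte TP_Extra_Header + 188-byte TS packet

def parse_source_packet(buf: bytes, spn: int) -> dict:
    """Split one 192-byte source packet into its TP_Extra_Header and TS
    packet, and pull the 13-bit PID out of the 4-byte TS header. spn is
    simply the packet's index counted from the head of the multiplexed data."""
    assert len(buf) == SOURCE_PACKET_SIZE
    tp_extra = struct.unpack(">I", buf[:4])[0]
    ats = tp_extra & 0x3FFFFFFF          # assumed: ATS in the low 30 bits
    ts = buf[4:]                         # the 188-byte TS packet
    assert ts[0] == 0x47                 # TS sync byte
    pid = ((ts[1] & 0x1F) << 8) | ts[2]  # 13-bit PID from the TS header
    return {"spn": spn, "ats": ats, "pid": pid, "payload": ts[4:]}
```

A system target decoder would use the ATS to time the transfer of each TS packet into its PID filter.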
- TS packets included in the multiplexed data include PAT (Program Association Table), PMT (Program Map Table), PCR (Program Clock Reference), and the like in addition to each stream such as video / audio / caption.
- PAT indicates what the PID of the PMT used in the multiplexed data is, and the PID of the PAT itself is registered as 0.
- the PMT has the PID of each stream such as video / audio / subtitles included in the multiplexed data and the attribute information of the stream corresponding to each PID, and has various descriptors related to the multiplexed data.
- the descriptor includes copy control information for instructing permission / non-permission of copying of multiplexed data.
- the PCR contains STC time information corresponding to the ATS at which the PCR packet is transferred to the decoder, in order to synchronize the ATC (Arrival Time Clock), which is the time axis of ATSs, with the STC (System Time Clock), which is the time axis of PTSs and DTSs.
- FIG. 27 is a diagram for explaining the data structure of the PMT in detail.
- a PMT header describing the length of data included in the PMT is arranged at the head of the PMT.
- a plurality of descriptors related to multiplexed data are arranged.
- the copy control information and the like are described as descriptors.
- a plurality of pieces of stream information regarding each stream included in the multiplexed data are arranged.
- the stream information includes stream descriptors each describing a stream type for identifying the compression codec of the stream, a stream PID, and stream attribute information (frame rate, aspect ratio, etc.).
- the multiplexed data is recorded together with the multiplexed data information file.
- the multiplexed data information file is management information of multiplexed data, has a one-to-one correspondence with the multiplexed data, and includes multiplexed data information, stream attribute information, and an entry map.
- the multiplexed data information includes a system rate, a reproduction start time, and a reproduction end time as shown in FIG.
- the system rate indicates a maximum transfer rate of multiplexed data to a PID filter of a system target decoder described later.
- the intervals between ATSs included in the multiplexed data are set so as not to exceed the system rate.
- the playback start time is the PTS of the first video frame of the multiplexed data
- the playback end time is set by adding the playback interval for one frame to the PTS of the video frame at the end of the multiplexed data.
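The start/end rule above can be illustrated with a short calculation; assuming PTS values in 90 kHz clock ticks (a common convention in MPEG systems, not stated in the text):

```python
def playback_times(first_pts: int, last_pts: int, frame_rate: float):
    """Playback start = PTS of the first video frame; playback end = PTS of
    the last video frame plus one frame interval, per the description above.
    PTS values are assumed to be in 90 kHz clock ticks."""
    start = first_pts
    end = last_pts + round(90000 / frame_rate)  # one frame interval
    return start, end
```

For a 30 fps stream whose last frame has PTS 177000, the end time is 177000 + 3000 = 180000.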
- attribute information about each stream included in the multiplexed data is registered for each PID.
- the attribute information has different information for each video stream, audio stream, presentation graphics stream, and interactive graphics stream.
- the video stream attribute information includes information such as the compression codec used to compress the video stream, the resolution of the individual picture data constituting the video stream, the aspect ratio, and the frame rate.
- the audio stream attribute information includes information such as the compression codec used to compress the audio stream, the number of channels included in the audio stream, the supported language, and the sampling frequency. These pieces of information are used to initialize the decoder before playback by the player.
- in the present embodiment, the stream type included in the PMT is used.
- when the multiplexed data is recorded on a recording medium, the video stream attribute information included in the multiplexed data information is used.
- specifically, unique information indicating that the video data is generated by the moving picture encoding method or apparatus shown in each of the above embodiments is set in the stream type included in the PMT or in the video stream attribute information.
- FIG. 30 shows steps of the moving picture decoding method according to the present embodiment.
- in step exS100, the stream type included in the PMT or the video stream attribute information included in the multiplexed data information is acquired from the multiplexed data.
- in step exS101, it is determined whether or not the stream type or the video stream attribute information indicates multiplexed data generated by the moving picture encoding method or apparatus described in each of the above embodiments.
- when it does, in step exS102, decoding is performed by the moving picture decoding method shown in each of the above embodiments.
- when the stream type or the video stream attribute information indicates conformance to a conventional standard, decoding is performed by a moving picture decoding method compliant with that conventional standard.
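The branch in steps exS100 to exS102 can be sketched as follows; the string labels and function name are hypothetical, chosen only for illustration:

```python
def select_decoder(stream_type: str) -> str:
    """Pick a decoding method from the stream type / attribute information.

    exS100: the caller acquires stream_type from the PMT or the multiplexed
    data information. exS101: check whether it indicates data generated by
    the encoding method of the embodiments. exS102: choose that decoder,
    otherwise fall back to a conventional-standard decoder."""
    if stream_type == "embodiment":           # exS101: identification matched
        return "embodiment-decoder"           # exS102
    return "legacy-decoder (MPEG-2 / MPEG4-AVC / VC-1)"
```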
- FIG. 31 shows a configuration of an LSI ex500 that is made into one chip.
- the LSI ex500 includes elements ex501, ex502, ex503, ex504, ex505, ex506, ex507, ex508, and ex509 described below, and each element is connected via a bus ex510.
- the power supply circuit unit ex505 supplies power to each unit when the power is on, starting each unit up into an operable state.
- under the control of the control unit ex501, which includes the CPU ex502, the memory controller ex503, the stream controller ex504, the drive frequency control unit ex512, and the like, the LSI ex500 receives an AV signal input from the microphone ex117, the camera ex113, and the like via the AV I/O ex509.
- the input AV signal is temporarily stored in an external memory ex511 such as SDRAM.
- the stored data is divided into portions as appropriate according to the processing amount and processing speed and sent to the signal processing unit ex507, and the signal processing unit ex507 encodes the audio signal and/or the video signal.
- the encoding process of the video signal is the encoding process described in the above embodiments.
- the signal processing unit ex507 further performs processing such as multiplexing the encoded audio data and the encoded video data as the case may be, and outputs the result from the stream I/O ex506 to the outside.
- the output multiplexed data is transmitted to the base station ex107 or written to the recording medium ex215. Note that the data should be temporarily stored in the buffer ex508 at the time of multiplexing so that they are synchronized.
- although the memory ex511 has been described as being external to the LSI ex500, it may instead be included in the LSI ex500.
- the number of buffers ex508 is not limited to one, and a plurality of buffers may be provided.
- the LSI ex500 may be made into one chip or a plurality of chips.
- control unit ex501 includes the CPU ex502, the memory controller ex503, the stream controller ex504, the drive frequency control unit ex512, and the like, but the configuration of the control unit ex501 is not limited to this configuration.
- the signal processing unit ex507 may further include a CPU.
- the CPU ex502 may include the signal processing unit ex507 or, for example, an audio signal processing unit that is a part of the signal processing unit ex507.
- in that case, the control unit ex501 includes the CPU ex502 having the signal processing unit ex507 or a part thereof.
- although the integrated circuit is referred to here as an LSI, it may also be called an IC, a system LSI, a super LSI, or an ultra LSI depending on the degree of integration.
- the method of circuit integration is not limited to LSI, and implementation with a dedicated circuit or a general-purpose processor is also possible.
- an FPGA (Field Programmable Gate Array) that can be programmed after LSI manufacturing, or a reconfigurable processor in which the connections and settings of circuit cells inside the LSI can be reconfigured, may also be used.
- such a programmable logic device can typically execute the moving picture encoding method or the moving picture decoding method described in each of the above embodiments by loading or reading, from a memory or the like, a program that constitutes software or firmware.
- FIG. 32 shows a configuration ex800 in the present embodiment.
- the drive frequency switching unit ex803 sets the drive frequency high when the video data is generated by the moving image encoding method or apparatus described in the above embodiments.
- the decoding processing unit ex801 that executes the moving picture decoding method described in each of the above embodiments is instructed to decode the video data.
- when the video data conforms to a conventional standard, the drive frequency is set lower than when the video data is generated by the moving picture encoding method or apparatus shown in each of the above embodiments, and the decoding processing unit ex802 compliant with the conventional standard is instructed to decode the video data.
- the drive frequency switching unit ex803 includes the CPU ex502 and the drive frequency control unit ex512 in FIG.
- the decoding processing unit ex801 that executes the moving picture decoding method shown in each of the above embodiments and the decoding processing unit ex802 that complies with the conventional standard correspond to the signal processing unit ex507 in FIG.
- the CPU ex502 identifies which standard the video data conforms to.
- the drive frequency control unit ex512 sets the drive frequency.
- the signal processing unit ex507 decodes the video data.
- for identification of the video data, for example, it is conceivable to use the identification information described in the third embodiment.
- the identification information is not limited to that described in Embodiment 3; any information that can identify which standard the video data conforms to may be used. For example, when it is possible to identify which standard the video data conforms to based on an external signal that identifies whether the video data is used for a television or for a disk, the identification may be performed based on such an external signal. The selection of the drive frequency in the CPU ex502 may also be performed based on, for example, a lookup table in which video data standards and drive frequencies are associated with each other, as shown in FIG. 34. By storing the lookup table in the buffer ex508 or in the internal memory of the LSI, the CPU ex502 can select the drive frequency by referring to it.
- FIG. 33 shows steps for executing the method of the present embodiment.
- the signal processing unit ex507 acquires identification information from the multiplexed data.
- the CPU ex502 identifies whether the video data is generated by the encoding method or apparatus described in each of the above embodiments based on the identification information.
- the CPU ex502 sends a signal for setting the drive frequency high to the drive frequency control unit ex512. Then, the drive frequency control unit ex512 sets a high drive frequency.
- otherwise, in step exS203, the CPU ex502 sends a signal for setting the drive frequency low to the drive frequency control unit ex512, and the drive frequency control unit ex512 sets a drive frequency lower than when the video data is generated by the encoding method or apparatus described in each of the above embodiments.
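A minimal sketch of the lookup-table-driven frequency selection in steps exS200 to exS203; the table entries and frequency values are illustrative assumptions, not taken from the document:

```python
# Hypothetical lookup table associating a video-data standard with a drive
# frequency, in the spirit of the table of FIG. 34 (values are illustrative).
DRIVE_FREQ_MHZ = {
    "embodiment": 500,   # data generated by the embodiment encoder: high
    "MPEG-2": 350,       # conventional standards: low
    "MPEG4-AVC": 350,
    "VC-1": 350,
}

def set_drive_frequency(identification: str) -> int:
    """Identify the standard from the identification information and return
    a high drive frequency for embodiment-generated data and a low one
    otherwise (unknown standards fall back to the lowest table value)."""
    return DRIVE_FREQ_MHZ.get(identification, min(DRIVE_FREQ_MHZ.values()))
```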
- the power saving effect can be further enhanced by changing the voltage applied to the LSI ex500 or the device including the LSI ex500 in conjunction with the switching of the driving frequency. For example, when the drive frequency is set low, it is conceivable that the voltage applied to the LSI ex500 or the device including the LSI ex500 is set low as compared with the case where the drive frequency is set high.
- the method of setting the drive frequency is not limited to setting a high drive frequency when the processing amount at the time of decoding is large and a low drive frequency when it is small.
- for example, when the amount of processing for decoding video data compliant with the MPEG4-AVC standard is larger than the amount of processing for decoding video data generated by the moving picture encoding method or apparatus described in each of the above embodiments, the setting of the drive frequency may be reversed from the case described above.
- furthermore, the method of setting the drive frequency is not limited to a configuration in which the drive frequency is lowered.
- for example, when the identification information indicates that the video data conforms to a conventional standard such as MPEG-2, MPEG4-AVC, or VC-1, it is conceivable to set the voltage applied to the LSI ex500 or the apparatus including the LSI ex500 high.
- as another example, when the identification information indicates that the video data conforms to a conventional standard, the driving of the CPU ex502 may be temporarily stopped because there is a margin in processing. Even when the identification information indicates that the video data is generated by the moving picture encoding method or apparatus described in each of the above embodiments, the driving of the CPU ex502 may be temporarily stopped if there is a margin in processing; in this case, it is conceivable to set the stop time shorter than when the video data conforms to a conventional standard such as MPEG-2, MPEG4-AVC, or VC-1.
- a plurality of video data that conforms to different standards may be input to the above-described devices and systems such as a television and a mobile phone.
- the signal processing unit ex507 of the LSI ex500 needs to support a plurality of standards in order to be able to decode even when a plurality of video data complying with different standards is input.
- however, if signal processing units ex507 corresponding to the respective standards are used individually, the circuit scale of the LSI ex500 becomes large and the cost increases.
- to address this, a decoding processing unit for executing the moving picture decoding method shown in each of the above embodiments and a decoding processing unit compliant with a standard such as MPEG-2, MPEG4-AVC, or VC-1 are configured to be partly shared.
- An example of this configuration is shown as ex900 in FIG. 35A.
- the moving picture decoding method shown in each of the above embodiments and a moving picture decoding method compliant with the MPEG4-AVC standard share some processing content, such as entropy decoding, inverse quantization, deblocking filtering, and motion compensation.
- for the common processing content, the decoding processing unit ex902 corresponding to the MPEG4-AVC standard is shared, and for the other processing content specific to one aspect of the present disclosure and not supported by the MPEG4-AVC standard, a dedicated decoding processing unit ex901 is used; such a configuration is conceivable.
- for example, the dedicated decoding processing unit ex901 is used for packet rearrangement, and a shared decoding processing unit is used for any or all of the other processes such as entropy decoding, deblocking filtering, and motion compensation.
- conversely, for the shared processing content, the decoding processing unit for executing the moving picture decoding method described in each of the above embodiments may be shared, and for processing content specific to the MPEG4-AVC standard, a dedicated decoding processing unit may be used.
- ex1000 in FIG. 35B shows another example in which processing is partially shared.
- a dedicated decoding processing unit ex1001 corresponding to the processing content specific to one aspect of the present disclosure
- a dedicated decoding processing unit ex1002 corresponding to the processing content specific to another conventional standard
- a common decoding processing unit ex1003 corresponding to the processing content common to the moving picture decoding method according to one aspect of the present disclosure and the moving picture decoding methods of other conventional standards.
- the dedicated decoding processing units ex1001 and ex1002 are not necessarily specialized for one aspect of the present disclosure or for processing content specific to other conventional standards, and may be capable of executing other general-purpose processing.
- the configuration of the present embodiment can be implemented by LSI ex500.
- the processing content common to the moving image decoding method according to one aspect of the present disclosure and the moving image decoding method of the conventional standard is shared by the decoding processing unit, thereby reducing the circuit scale of the LSI, In addition, the cost can be reduced.
- the packet identifier (packet_id) is signaled in the header of the MMT packet.
- the payload of the MMT packet can also be transmitted using a protocol such as MPEG2-TS or RTP.
- in that case, the packet identifier may be signaled separately in the packet header of the protocol, or a payload format for storing MMT in the protocol may be defined and the packet identifier signaled in that payload header.
- each MMT packet can also be filtered using a package identifier (package_id) for identifying the package.
- the first buffers 120a and the second buffers 160a only need to correspond one-to-one with assets, and the number of assets may differ from the numbers of first buffers 120a and second buffers 160a.
- for example, the numbers of first buffers 120a and second buffers 160a may be smaller than the number of assets.
- specifically, no second buffer 160a needs to be allocated to an asset that does not require one.
- as information for the reordering, an index number indicating the position in decoding order of the payload data stored in the first buffer (MMTB) may be used, and an access unit may be input from the MMTB to a picture buffer or the like based on the index number.
- even when the same asset is transmitted through one transmission path, the arrival order at the system decoder may differ from the decoding order; the rearrangement according to the above embodiment can also be applied in this case.
- identification information indicating a model of a system decoder to which the transmitted MMT stream is compliant may be stored in auxiliary information such as configuration information or an MMT message.
- the model of the system decoder is, for example, the basic method shown in the first embodiment or the simple method shown in the second modification of the first embodiment.
- identification information indicating the model can also be included in reproduction assistance information such as SDP (Session Description Protocol).
- alternatively, the identification information indicating the model may be stored in a descriptor defined by the MPEG-2 system and transmitted as part of section data such as the PAT (Program Association Table) or the PMT (Program Map Table).
- a playback mode in which the signaled MMT package conforms to the behavior of the system decoder may be included in the auxiliary information.
- examples of playback modes include an MPU mode, a Fragment mode, and a Media Unit mode.
- when the MMT package is used as pre-stored content, for example when it is downloaded and then played back, it can be played back even if it does not conform to the system decoder. In particular, in an environment with a margin in memory or processing speed, such as a PC, playback is possible even if the MMT package does not conform to the system decoder. Therefore, a mode that does not guarantee conformance to the system decoder model may be provided and indicated in the auxiliary information.
- an MMT profile corresponding to such conformance may also be defined, and the profile information may be separately included in the auxiliary information.
- an MP4 brand may be separately defined and included in the ftyp/styp box.
- one stream can be configured by combining data of different assets.
- each of the first asset and the second asset is composed of five MPUs, and the decoding time of each MPU and the parameters related to the system decoder match between the two assets.
- the parameters are, for example, the buffer occupation amount at the start of decoding, the peak rate at the time of encoding, and the like.
- the stream after replacement can be made to conform to the behavior of the system decoder.
- one MMT packet may be transmitted through a plurality of transmission paths. That is, the same data of the same asset may be transmitted through different transmission paths.
- for example, the data received first is treated as valid, and the data received later (redundant data) is discarded.
- alternatively, for example, the data received on the broadcast side may be discarded.
- whether or not redundant data is transmitted may be signaled by auxiliary information such as configuration information.
- on the receiving side, it is determined whether or not redundant data is transmitted, and if it is, whether or not the redundant data has been received.
- system decoder is not limited to MMT.
- the system decoder can also be applied to other formats in which data of the same package (corresponding to a program in the MPEG-2 system) is packetized and transmitted using one or more transmission paths.
- the minimum unit for packetization is, for example, an access unit or a unit obtained by dividing the access unit.
- each component constituting the data decoding device 100 according to the above embodiment (the filter unit 110, the MMT buffer unit 120, the decoding order acquisition unit 130, the time acquisition unit 140, the rearrangement unit 150, the encoded data storage unit 160, and the decoding unit 170) may be realized by software such as a program executed on a computer having a CPU (Central Processing Unit), a RAM, a ROM (Read Only Memory), a communication interface, an I/O port, a hard disk, a display, and the like, or may be realized by hardware such as an electronic circuit.
- the present disclosure can be used as a data decoding method, a data decoding device, a data transmission method, and the like, and can be applied to, for example, a recorder, a television, a tablet terminal device, or a mobile phone.
Description
The inventor found that the following problems arise with the conventional data decoding methods and the like described in the "Background Art" section.
[Data decoding device]
First, an overview of the data decoding device (system decoder) according to the present embodiment will be described with reference to FIG. 1. FIG. 1 is a block diagram showing the configuration of the data decoding device 100 according to the present embodiment.
Next, the data structure of the MMT stream according to the present embodiment will be described with reference to FIG. 2. FIG. 2 is a diagram showing the data structure of the MMT stream 200 according to the present embodiment.
Next, the MPU according to the present embodiment will be described. FIG. 3 is a diagram showing the data structure of the MPU according to the present embodiment. As described above, in the present embodiment, one MPU constitutes one MMT packet.
Next, the data flow of MMT packets when one MMT package is transmitted using a plurality of transmission paths will be described with reference to FIG. 4. FIG. 4 is a diagram showing an example of the flow of MMT packet data according to the present embodiment.
Next, a specific example of rearranging MMT packets according to the present embodiment will be described with reference to FIGS. 5 and 6. FIG. 5 is a diagram showing an example of MMT streams transmitted through each of a plurality of transmission paths according to the present embodiment. FIG. 6 is a diagram showing an example of how a plurality of MMT packets are rearranged according to the present embodiment.
Next, an example of the operation of the data decoding device 100 (system decoder) according to the present embodiment will be described with reference to FIG. 7. FIG. 7 is a flowchart showing an example of the operation of the data decoding device 100 according to the present embodiment.
Next, an example of a method (S140 in FIG. 7) for determining the PTS (presentation time) and DTS (decoding time) of each access unit will be described.
PTSm(0) is the presentation time of the access unit that comes first in presentation order within the m-th MPU, and deltaPTS(i) is the difference between PTSm(i) and PTSm(0).
DTSm(0) is the decoding time of the access unit that comes first in decoding order within the m-th MPU, and deltaDTS(i) is the difference between DTSm(i) and DTSm(0).
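Given those definitions, the PTS and DTS of each access unit follow by adding the stored deltas to the head values. A minimal sketch (time units are whatever clock the stream uses):

```python
def access_unit_times(pts0: int, dts0: int, delta_pts: list, delta_dts: list):
    """Per the definitions above: PTSm(i) = PTSm(0) + deltaPTS(i) and
    DTSm(i) = DTSm(0) + deltaDTS(i) for the i-th access unit of the m-th MPU.
    deltaPTS(0) and deltaDTS(0) are 0 by definition."""
    pts = [pts0 + d for d in delta_pts]
    dts = [dts0 + d for d in delta_dts]
    return pts, dts
```

This is why only the head times and the per-access-unit deltas need to be signaled to recover every access unit's timing.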
Next, the operation modes of the rearrangement unit 150 according to the present embodiment will be described with reference to FIG. 9; specifically, the modes relating to the timing of reading data from the first buffer 120a of the MMT buffer unit 120. FIG. 9 is a diagram showing an example of the operation modes of the rearrangement unit 150 according to the present embodiment.
Next, the extraction rate from the first buffer 120a and its size will be described.
Next, the size of the first buffer 120a will be described.
Here, a data transmission device that generates and transmits the encoded data according to the present embodiment will be described with reference to FIG. 10. FIG. 10 is a block diagram showing the configuration of the data transmission device 400 according to the present embodiment.
Next, an example of the operation of the data transmission device 400 according to the present embodiment will be described with reference to FIG. 11. FIG. 11 is a flowchart showing an example of the operation of the data transmission device 400 according to the present embodiment.
Next, the processing of the data transmission device 400 and the data decoding device 100 when using a transmission path with large fluctuations, such as end-to-end delay or jitter, will be described.
As described above, the data decoding method according to the present embodiment includes: a reception step of receiving, packet by packet, encoded streams that are included in encoded data and are transmitted using each of a plurality of transmission paths; a storage step of storing the plurality of packets of the received encoded streams in the first buffer 120a; a rearrangement step of rearranging the plurality of packets stored in the first buffer 120a in decoding order; and a decoding step of decoding the plurality of packets rearranged in decoding order.
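A minimal sketch of those four steps, with decoding stubbed out and a heap standing in for the first buffer (the class and function names are illustrative, not from the embodiment):

```python
from dataclasses import dataclass, field
import heapq

@dataclass(order=True)
class Packet:
    decode_order: int                    # position in decoding order
    data: bytes = field(compare=False)   # payload, excluded from ordering

def decode_streams(received_packets):
    """Receive packets (possibly out of order, from several transmission
    paths), store them in a first buffer, rearrange them into decoding
    order, then decode one by one (decoding is stubbed as collecting data)."""
    first_buffer = []
    for pkt in received_packets:          # reception + storage steps
        heapq.heappush(first_buffer, pkt)
    decoded = []
    while first_buffer:                   # rearrangement step
        pkt = heapq.heappop(first_buffer)
        decoded.append(pkt.data)          # decoding step, stubbed
    return decoded
```

The heap pop order realizes the rearrangement: regardless of arrival order, packets leave the buffer in decoding order.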
In the above embodiment, the case where MMT packets are packetized in MPU units was described; in this modification, the case where MMT packets are packetized in units obtained by dividing an MPU will be described. Specifically, the MMT packets according to this modification are packetized in units (MPU Fragments) obtained by fragmenting an MPU using MFUs.
In the above embodiment, an example in which data is output from the first buffer 120a to the second buffer 160a was described; in this modification, the second buffer 160a is not provided, and the decoding unit 170 decodes data output directly from the first buffer 120a.
In the above embodiment, an example in which MMT packets are transmitted by MMT in both broadcasting and communication was described; however, in the case of broadcasting, for example, it is assumed that MMT packets are transmitted using MPEG-2 TS.
The above embodiments show specific embodiments of the data decoding method and the data transmission method according to the present disclosure, but the present disclosure is not limited thereto. For example, the data decoding method according to this modification may include a plurality of steps as shown in FIG. 15. FIG. 15 is a flowchart showing an example of the data decoding method according to this modification.
By recording a program for realizing the configuration of the data transmission method (moving picture encoding method (image encoding method)) or the data decoding method (moving picture decoding method (image decoding method)) shown in each of the above embodiments on a storage medium, the processing shown in each of the above embodiments can be easily implemented in an independent computer system. The storage medium may be any medium capable of recording a program, such as a magnetic disk, an optical disc, a magneto-optical disk, an IC card, or a semiconductor memory.
It is also possible to generate video data by switching, as necessary, between the moving picture encoding method or apparatus shown in each of the above embodiments and a moving picture encoding method or apparatus conforming to a different standard such as MPEG-2, MPEG4-AVC, or VC-1.
The moving picture encoding method and apparatus and the moving picture decoding method and apparatus shown in each of the above embodiments are typically realized as an LSI, which is an integrated circuit. As an example, FIG. 31 shows the configuration of the LSI ex500 made into one chip. The LSI ex500 includes the elements ex501 to ex509 described below, and the elements are connected via a bus ex510. The power supply circuit unit ex505 supplies power to each unit when the power is on, starting each unit up into an operable state.
When decoding video data generated by the moving picture encoding method or apparatus shown in each of the above embodiments, the amount of processing is expected to be larger than when decoding video data conforming to a conventional standard such as MPEG-2, MPEG4-AVC, or VC-1. Therefore, in the LSI ex500, a drive frequency higher than the drive frequency of the CPU ex502 used when decoding video data conforming to a conventional standard needs to be set. However, raising the drive frequency raises power consumption.
A plurality of video data conforming to different standards may be input to the above-described devices and systems such as televisions and mobile phones. In order to enable decoding even when a plurality of video data conforming to different standards is input, the signal processing unit ex507 of the LSI ex500 needs to support a plurality of standards. However, if signal processing units ex507 corresponding to the respective standards are used individually, the circuit scale of the LSI ex500 becomes large and the cost increases.
The data decoding method, data decoding device, and data transmission method according to one or more aspects have been described above based on the embodiments, but the present disclosure is not limited to these embodiments. Forms obtained by applying various modifications conceivable by those skilled in the art to the present embodiments, and forms constructed by combining components of different embodiments, are also included within the scope of the present disclosure, as long as they do not depart from the gist of the present disclosure.
110, 610 Filter unit
111 Reception unit
112 Storage unit
120 MMT buffer unit
120a First buffer
121 MMTBm
122 MMTBn
130 Decoding order acquisition unit
140 Time acquisition unit
150 Rearrangement unit
160 Encoded data storage unit
160a Second buffer
161 Picture buffer
162 Audio buffer
170 Decoding unit
171 Image decoding unit
172 Audio decoding unit
181 Reference buffer
200 MMT stream
210 MMT packet
220 Header
221 Packet identifier
222 Time information
230 Payload
231 Access unit
240 Correspondence information
250 Time offset information
300 MPU
310 ftyp/styp box
320 mmpu box
330 Box set
331 moof box
332 mdat box
400 Data transmission device
410 Flag generation unit
420 Encoding unit
430 Multiplexing unit
440 Transmission unit
615 Input buffer
Claims (13)
- A data decoding method comprising: a reception step (S110) of receiving, packet by packet, encoded streams that are included in encoded data and are transmitted using each of a plurality of transmission paths; a storage step (S120) of storing a plurality of packets of the received encoded streams in a first buffer (120a); a rearrangement step (S150) of rearranging the plurality of packets stored in the first buffer in decoding order; and a decoding step (S160) of decoding the plurality of packets rearranged in decoding order.
- The data decoding method according to claim 1, wherein each of the plurality of packets is associated with one of one or more assets constituting the encoded data, each of one or more first buffers is associated with an asset, and in the storage step (S120), each of the plurality of packets is distributed to its corresponding asset and stored in the corresponding first buffer.
- The data decoding method according to claim 2, wherein each of the plurality of packets includes a packet identifier (packet_id) indicating the corresponding asset, and in the storage step (S120), the packets are distributed by obtaining the packet identifier from each packet.
- The data decoding method according to any one of claims 1 to 3, wherein in the rearrangement step (S150), one of a plurality of modes relating to the timing of reading packets from the first buffer is further selected, and a packet is read from the first buffer and stored in a second buffer according to the selected mode, and in the decoding step (S160), the packet stored in the second buffer is decoded.
- The data decoding method according to claim 4, wherein the plurality of modes include: a first mode (MPU mode) in which a packet can be read from the first buffer after the packet has been stored in the first buffer; a second mode (Fragment mode) in which a target division unit, which is one of a plurality of division units constituting the packet, can be read from the first buffer after the target division unit has been stored in the first buffer; and a third mode (Media unit mode) in which part of the target division unit can be read from the first buffer before storage of the target division unit of the packet in the first buffer is completed.
- The data decoding method according to claim 5, wherein the plurality of division units are access units or NAL units.
- The data decoding method according to any one of claims 4 to 6, wherein in the rearrangement step (S150), a mode flag indicating the selected mode is obtained from the encoded data, and the selected mode is selected based on the obtained mode flag.
- The data decoding method according to any one of claims 1 to 3, further comprising: a time acquisition step (S142) of acquiring first time information for determining a decoding time of each of the plurality of packets; and a time calculation step (S143) of calculating, based on the first time information, second time information indicating a decoding time of each of a plurality of division units constituting the packet, wherein in the rearrangement step (S150), the packet is stored in a second buffer division unit by division unit with reference to the second time information.
- The data decoding method according to any one of claims 1 to 8, wherein each of the plurality of packets is transmitted on one of the plurality of transmission paths, and packets within a transmission path are transmitted in decoding order.
- The data decoding method according to any one of claims 1 to 9, wherein the packet is an MPU (Media Processing Unit).
- The data decoding method according to any one of claims 1 to 10, wherein the plurality of transmission paths include broadcasting and communication.
- A data decoding device comprising: a reception unit (110) that receives, packet by packet, encoded streams that are included in encoded data and are transmitted using each of a plurality of transmission paths; a buffer (120) for storing a plurality of packets of the encoded streams received by the reception unit; a rearrangement unit (150) that rearranges the plurality of packets stored in the buffer in decoding order; and a decoding unit (170) that decodes the plurality of packets rearranged by the rearrangement unit.
- A data transmission method comprising: a step (S220) of generating a flag indicating whether or not rearrangement of a plurality of packets constituting a plurality of encoded streams included in encoded data is necessary; and a step (S240) of transmitting each corresponding encoded stream packet by packet using each of a plurality of transmission paths, and transmitting the flag using at least one of the plurality of transmission paths.
Priority Applications (6)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201811477957.XA CN109889888B (zh) | 2013-06-05 | 2014-06-03 | 再现方法、装置及生成方法、装置 |
JP2015501964A JP5788622B2 (ja) | 2013-06-05 | 2014-06-03 | 再生方法、再生装置および生成方法、生成装置 |
EP14808058.3A EP3007454A4 (en) | 2013-06-05 | 2014-06-03 | DATA DECODING METHOD, DATA DECODING DEVICE AND DATA TRANSMISSION METHOD |
EP20153166.2A EP3697100A1 (en) | 2013-06-05 | 2014-06-03 | Data decoding method, data decoding apparatus, and data transmitting method |
CN201480019590.0A CN105075281B (zh) | 2013-06-05 | 2014-06-03 | 数据解码方法、数据解码装置及数据发送方法 |
US14/943,060 US11070828B2 (en) | 2013-06-05 | 2015-11-17 | Method for decoding data, data decoding device, and method for transmitting data |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201361831173P | 2013-06-05 | 2013-06-05 | |
US61/831,173 | 2013-06-05 | ||
JP2014107476 | 2014-05-23 | ||
JP2014-107476 | 2014-05-23 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/943,060 Continuation US11070828B2 (en) | 2013-06-05 | 2015-11-17 | Method for decoding data, data decoding device, and method for transmitting data |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2014196189A1 (ja) | 2014-12-11 |
Family
ID=52007846
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2014/002936 WO2014196189A1 (ja) | 2014-06-03 | Data decoding method, data decoding device, and data transmission method |
Country Status (5)
Country | Link |
---|---|
US (1) | US11070828B2 (ja) |
EP (2) | EP3697100A1 (ja) |
JP (4) | JP5788622B2 (ja) |
CN (2) | CN109889888B (ja) |
WO (1) | WO2014196189A1 (ja) |
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2015226305A (ja) * | 2014-05-30 | 2015-12-14 | Mitsubishi Electric Corp | Encoding device |
WO2016139909A1 (ja) * | 2015-03-02 | 2016-09-09 | NEC Corporation | Decoding device, reception device, transmission device, transmission/reception system, decoding method, and storage medium storing a decoding program |
CN107211184A (zh) * | 2015-02-02 | 2017-09-26 | Hitachi Maxell, Ltd. | Broadcast receiving device, broadcast receiving method, and content output method |
JP2018509060A (ja) * | 2015-02-13 | 2018-03-29 | Samsung Electronics Co., Ltd. | Method and apparatus for converting an MMTP stream into an MPEG-2 TS |
US10194196B2 (en) | 2015-03-02 | 2019-01-29 | Nec Corporation | Decoding device, reception device, transmission device, transmission/reception system, decoding method, and storage medium having decoding program stored therein |
JP2019213223A (ja) * | 2015-04-02 | 2019-12-12 | Panasonic Intellectual Property Corporation of America | Reception method and reception device |
JP2019220974A (ja) * | 2019-08-22 | 2019-12-26 | Mitsubishi Electric Corp | Decoding device |
JP2020145682A (ja) * | 2020-04-15 | 2020-09-10 | Keiko Matsuno | Signal processing device |
Families Citing this family (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP3697100A1 (en) * | 2013-06-05 | 2020-08-19 | Sun Patent Trust | Data decoding method, data decoding apparatus, and data transmitting method |
WO2015007754A1 (en) | 2013-07-15 | 2015-01-22 | Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. | Low delay concept in multi-layered video coding |
CN113038188B (zh) | 2015-03-31 | 2023-06-06 | Panasonic Intellectual Property Corporation of America | Transmission method, reception method, transmission device, and reception device |
EP3280150A1 (en) | 2015-03-31 | 2018-02-07 | Panasonic Intellectual Property Corporation of America | Transmission method, reception method, transmission device and reception device |
CN108023758B (zh) * | 2016-11-04 | 2020-06-02 | Huawei Technologies Co., Ltd. | Method and network device for processing packets in a hybrid access network |
US11979340B2 (en) | 2017-02-12 | 2024-05-07 | Mellanox Technologies, Ltd. | Direct data placement |
US11190462B2 (en) * | 2017-02-12 | 2021-11-30 | Mellanox Technologies, Ltd. | Direct packet placement |
US20200013431A1 (en) * | 2017-03-24 | 2020-01-09 | Sony Corporation | Information processing apparatus, information recording medium, information processing method, and program |
US11252464B2 (en) | 2017-06-14 | 2022-02-15 | Mellanox Technologies, Ltd. | Regrouping of video data in host memory |
WO2019152300A1 (en) * | 2018-02-05 | 2019-08-08 | D&M Holdings Inc. | System and method for synchronizing networked rendering devices |
US10826838B2 (en) * | 2019-01-29 | 2020-11-03 | Microsoft Technology Licensing, Llc | Synchronized jitter buffers to handle codec switches |
CN111246284B (zh) * | 2020-03-09 | 2021-05-25 | Shenzhen Skyworth-RGB Electronic Co., Ltd. | Video stream playing method, system, terminal, and storage medium |
CN111628996A (zh) * | 2020-05-26 | 2020-09-04 | Nantong Vocational University | Electronic data communication method and system based on the Internet of Things |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2009239444A (ja) * | 2008-03-26 | 2009-10-15 | Mitsubishi Electric Corp | Packet order control method, reception device, transmission device, and communication system |
WO2013115121A1 (ja) * | 2012-01-31 | 2013-08-08 | Sharp Kabushiki Kaisha | Generation device, playback device, data structure, generation method, playback method, control program, and recording medium |
US8638818B2 (en) | 2010-04-20 | 2014-01-28 | Samsung Electronics Co., Ltd | Interface apparatus and method for transmitting and receiving media data |
Family Cites Families (34)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9008812B2 (en) * | 2008-06-19 | 2015-04-14 | Sirius Xm Radio Inc. | Method and apparatus for using selected content tracks from two or more program channels to automatically generate a blended mix channel for playback to a user upon selection of a corresponding preset button on a user interface |
JP3888642B2 (ja) * | 2001-10-05 | 2007-03-07 | Alpine Electronics, Inc. | Multimedia information providing method and apparatus |
CN1199403C (zh) * | 2002-05-15 | 2005-04-27 | Huawei Technologies Co., Ltd. | Method for implementing the playing of prompt tones and signal tones in a communication network |
US7542435B2 (en) * | 2004-05-12 | 2009-06-02 | Nokia Corporation | Buffer level signaling for rate adaptation in multimedia streaming |
US7882412B2 (en) * | 2004-10-05 | 2011-02-01 | Sanjiv Nanda | Enhanced block acknowledgement |
CN1969475B (zh) * | 2005-03-25 | 2012-07-04 | 桥扬科技有限公司 | Method and apparatus for a cellular broadcast and communication system |
JP4876504B2 (ja) * | 2005-09-27 | 2012-02-15 | Hitachi, Ltd. | Playback apparatus and playback method |
WO2007043235A1 (ja) * | 2005-10-07 | 2007-04-19 | Matsushita Electric Industrial Co., Ltd. | Stream playback control device |
CN101390388B (zh) * | 2006-02-27 | 2010-09-15 | Matsushita Electric Industrial Co., Ltd. | Playback device, portable telephone, and playback method |
US7992175B2 (en) * | 2006-05-15 | 2011-08-02 | The Directv Group, Inc. | Methods and apparatus to provide content on demand in content broadcast systems |
CN101422039B (zh) * | 2006-06-16 | 2011-04-06 | Samsung Electronics Co., Ltd. | Transport stream generating apparatus that generates a transport stream in which additional data is stuffed into the payload area of packets, digital broadcast transmitting/receiving apparatus that transmits/receives the transport stream, and methods thereof |
JP5025225B2 (ja) * | 2006-10-31 | 2012-09-12 | Toshiba Corp | Communication device, communication device control method, and control program |
KR101407634B1 (ko) * | 2007-09-20 | 2014-06-13 | 삼성전자주식회사 | 다중 채널의 영상을 동시에 재생하는 장치 및 방법 |
US8111971B2 (en) * | 2007-12-05 | 2012-02-07 | Cisco Technology, Inc. | Systems and methods of reducing media stream delay through independent decoder clocks |
WO2009116972A1 (en) * | 2008-03-20 | 2009-09-24 | Thomson Licensing | System and method for processing priority transport stream data in real time in a multi-channel broadcast multimedia system |
US8369413B2 (en) * | 2009-04-23 | 2013-02-05 | Mediatek Inc. | Transport stream processing system and related method thereof |
US8355433B2 (en) * | 2009-08-18 | 2013-01-15 | Netflix, Inc. | Encoding video streams for adaptive video streaming |
CN103098462A (zh) * | 2010-08-06 | 2013-05-08 | Panasonic Corporation | Encoding method, display device, and decoding method |
JP2012095053A (ja) * | 2010-10-26 | 2012-05-17 | Toshiba Corp | Stream transmission system, transmission device, reception device, stream transmission method, and program |
WO2012121572A2 (ko) * | 2011-03-10 | 2012-09-13 | Electronics and Telecommunications Research Institute | Transmission apparatus and method, and reception apparatus and method for providing a program-associated stereoscopic broadcasting service |
KR101803970B1 (ko) * | 2011-03-16 | 2017-12-28 | Samsung Electronics Co., Ltd. | Apparatus and method for composing content |
US8996719B2 (en) * | 2011-04-03 | 2015-03-31 | Jeremiah Condon | System and method of adaptive transport of multimedia data |
JPWO2013011696A1 (ja) * | 2011-07-21 | 2015-02-23 | Panasonic Corporation | Transmission device, reception/playback device, transmission method, and reception/playback method |
KR101946861B1 (ko) * | 2011-09-21 | 2019-02-13 | Samsung Electronics Co., Ltd. | Method and apparatus for synchronizing media data of a multimedia broadcasting service |
KR20130040090A (ko) * | 2011-10-13 | 2013-04-23 | Samsung Electronics Co., Ltd. | Apparatus and method for transmitting multimedia data over a hybrid network |
KR20130040132A (ko) * | 2011-10-13 | 2013-04-23 | Electronics and Telecommunications Research Institute | Media-codec-independent method for transmitting media data over heterogeneous IP networks |
US9319721B2 (en) * | 2011-10-13 | 2016-04-19 | Electronics And Telecommunications Research Institute | Method of configuring and transmitting an MMT transport packet |
WO2013077670A1 (ko) * | 2011-11-23 | 2013-05-30 | Electronics and Telecommunications Research Institute | Method and apparatus for a streaming service providing scalability and view information |
US20140079368A1 (en) * | 2012-03-12 | 2014-03-20 | Panasonic Corporation | Display device and transmission device |
CN102983938A (zh) * | 2012-11-13 | 2013-03-20 | Unit 72671 of the Chinese People's Liberation Army | QR-code-based feedback-free one-way data transmission method and apparatus |
JP6122626B2 (ja) * | 2012-12-06 | 2017-04-26 | Japan Broadcasting Corporation | Decoding device and program |
JP5641090B2 (ja) * | 2013-03-14 | 2014-12-17 | Sony Corporation | Transmission device, transmission method, reception device, and reception method |
JP6237547B2 (ja) * | 2013-03-14 | 2017-11-29 | Sony Corporation | Transmission device, transmission method, reception device, and reception method |
EP3697100A1 (en) | 2013-06-05 | 2020-08-19 | Sun Patent Trust | Data decoding method, data decoding apparatus, and data transmitting method |
-
2014
- 2014-06-03 EP EP20153166.2A patent/EP3697100A1/en not_active Ceased
- 2014-06-03 CN CN201811477957.XA patent/CN109889888B/zh active Active
- 2014-06-03 JP JP2015501964A patent/JP5788622B2/ja active Active
- 2014-06-03 WO PCT/JP2014/002936 patent/WO2014196189A1/ja active Application Filing
- 2014-06-03 CN CN201480019590.0A patent/CN105075281B/zh active Active
- 2014-06-03 EP EP14808058.3A patent/EP3007454A4/en not_active Ceased
-
2015
- 2015-07-28 JP JP2015148193A patent/JP6026599B2/ja active Active
- 2015-11-17 US US14/943,060 patent/US11070828B2/en active Active
-
2016
- 2016-10-12 JP JP2016200548A patent/JP6280182B2/ja active Active
-
2018
- 2018-01-18 JP JP2018006551A patent/JP6431217B2/ja active Active
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2009239444A (ja) * | 2008-03-26 | 2009-10-15 | Mitsubishi Electric Corp | Packet order control method, reception device, transmission device, and communication system |
US8638818B2 (en) | 2010-04-20 | 2014-01-28 | Samsung Electronics Co., Ltd | Interface apparatus and method for transmitting and receiving media data |
WO2013115121A1 (ja) * | 2012-01-31 | 2013-08-08 | Sharp Kabushiki Kaisha | Generation device, playback device, data structure, generation method, playback method, control program, and recording medium |
Non-Patent Citations (3)
Title |
---|
"MPEG media transport (MMT), ISOIIEC DIS 23008-1", INFORMATION TECHNOLOGY - HIGH EFFICIENCY CODING AND MEDIA DELIVERY IN HETEROGENEOUS ENVIRONMENT |
"Study of ISO/IEC CD 23008-1 MPEG Media Transport", ISO/IEC JTC1/SC29/WG11 MPEG/N13089, October 2012 (2012-10-01), Shanghai, China, pages 108 - 110,41-42,9-10 * |
See also references of EP3007454A4 |
Cited By (23)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2015226305A (ja) * | 2014-05-30 | 2015-12-14 | Mitsubishi Electric Corp | Encoding device |
US11405679B2 (en) | 2015-02-02 | 2022-08-02 | Maxell, Ltd. | Broadcast receiving apparatus, broadcast receiving method, and contents outputting method |
CN107211184A (zh) * | 2015-02-02 | 2017-09-26 | Hitachi Maxell, Ltd. | Broadcast receiving device, broadcast receiving method, and content output method |
CN107211184B (zh) * | 2015-02-02 | 2020-10-20 | Maxell, Ltd. | Broadcast receiving device, broadcast receiving method, and content output method |
EP3255895A4 (en) * | 2015-02-02 | 2018-08-29 | Maxell, Ltd. | Broadcast receiver, broadcast receiving method and content output method |
EP4164231A1 (en) * | 2015-02-02 | 2023-04-12 | Maxell, Ltd. | Broadcast receiving apparatus, broadcast receiving method, and contents outputting method |
US11871071B2 (en) | 2015-02-02 | 2024-01-09 | Maxell, Ltd. | Broadcast receiving apparatus, broadcast receiving method, and contents outputting method |
JP2018509060A (ja) * | 2015-02-13 | 2018-03-29 | Samsung Electronics Co., Ltd. | Method and apparatus for converting an MMTP stream into an MPEG-2 TS |
KR20190125543A (ko) | 2015-03-02 | 2019-11-06 | NEC Corporation | Decoding device, reception device, transmission device, transmission/reception system, decoding method, and storage medium storing a decoding program |
US10491944B2 (en) | 2015-03-02 | 2019-11-26 | Nec Corporation | Decoding device, reception device, transmission device, transmission/reception system, decoding method, and storage medium having decoding program stored therein |
US11128911B2 (en) | 2015-03-02 | 2021-09-21 | Nec Corporation | Decoding device, reception device, transmission device, transmission/reception system, decoding method, and storage medium having decoding program stored therein |
KR20190125542A (ko) | 2015-03-02 | 2019-11-06 | NEC Corporation | Decoding device, reception device, transmission device, transmission/reception system, decoding method, and storage medium storing a decoding program |
US10631037B2 (en) | 2015-03-02 | 2020-04-21 | Nec Corporation | Decoding device, reception device, transmission device, transmission/reception system, decoding method, and storage medium having decoding program stored therein |
KR102137350B1 (ko) * | 2015-03-02 | 2020-07-23 | NEC Corporation | Decoding device, reception device, transmission device, transmission/reception system, decoding method, and storage medium storing a decoding program |
KR102137349B1 (ko) * | 2015-03-02 | 2020-07-23 | NEC Corporation | Decoding device, reception device, transmission device, transmission/reception system, decoding method, and storage medium storing a decoding program |
US10194196B2 (en) | 2015-03-02 | 2019-01-29 | Nec Corporation | Decoding device, reception device, transmission device, transmission/reception system, decoding method, and storage medium having decoding program stored therein |
JPWO2016139909A1 (ja) * | 2015-03-02 | 2017-11-30 | NEC Corporation | Decoding device, reception device, transmission device, transmission/reception system, decoding method, and decoding program |
WO2016139909A1 (ja) * | 2015-03-02 | 2016-09-09 | NEC Corporation | Decoding device, reception device, transmission device, transmission/reception system, decoding method, and storage medium storing a decoding program |
JP2019213223A (ja) * | 2015-04-02 | 2019-12-12 | Panasonic Intellectual Property Corporation of America | Reception method and reception device |
JP7067653B2 (ja) | 2019-08-22 | 2022-05-16 | Mitsubishi Electric Corp | Decoding device |
JP2021121108A (ja) * | 2019-08-22 | 2021-08-19 | Mitsubishi Electric Corp | Decoding device |
JP2019220974A (ja) * | 2019-08-22 | 2019-12-26 | Mitsubishi Electric Corp | Decoding device |
JP2020145682A (ja) * | 2020-04-15 | 2020-09-10 | Keiko Matsuno | Signal processing device |
Also Published As
Publication number | Publication date |
---|---|
JP6026599B2 (ja) | 2016-11-16 |
CN105075281A (zh) | 2015-11-18 |
JP5788622B2 (ja) | 2015-10-07 |
JPWO2014196189A1 (ja) | 2017-02-23 |
EP3697100A1 (en) | 2020-08-19 |
EP3007454A4 (en) | 2016-06-01 |
CN109889888A (zh) | 2019-06-14 |
JP6431217B2 (ja) | 2018-11-28 |
US20160080755A1 (en) | 2016-03-17 |
EP3007454A1 (en) | 2016-04-13 |
US11070828B2 (en) | 2021-07-20 |
JP2017069961A (ja) | 2017-04-06 |
JP6280182B2 (ja) | 2018-02-14 |
CN109889888B (zh) | 2022-04-01 |
CN105075281B (zh) | 2019-01-01 |
JP2018093511A (ja) | 2018-06-14 |
JP2016027705A (ja) | 2016-02-18 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6431217B2 (ja) | Reproduction method, reproduction device, generation method, and generation device | |
JP7032588B2 (ja) | Transmission method and transmission device | |
KR102072832B1 (ko) | Image encoding method, image decoding method, image encoding device, image decoding device, and image encoding/decoding device | |
CA2845548C (en) | Methods and apparatuses for encoding and decoding video using periodic buffer description | |
CN107801029B (zh) | Encoding method and encoding device | |
JP6562369B2 (ja) | Encoding/decoding method and encoding/decoding device | |
WO2012117722A1 (ja) | Encoding method, decoding method, encoding device, and decoding device | |
JP6414712B2 (ja) | Moving picture encoding method using multiple reference pictures, moving picture decoding method, moving picture encoding device, and moving picture decoding method | |
CN107426575B (zh) | Video encoding method and device, and video decoding method and device | |
JP7265051B2 (ja) | Transmission method and transmission device | |
WO2012124347A1 (en) | Methods and apparatuses for encoding and decoding video using reserved nal unit type values of avc standard | |
WO2013057938A1 (ja) | System layer processing device, encoding device, system layer processing method, and encoding method | |
WO2013076991A1 (ja) | Image encoding method, image encoding device, image decoding method, and image decoding device | |
WO2012096157A1 (ja) | Image encoding method, image decoding method, image encoding device, and image decoding device | |
WO2011135829A1 (ja) | Encoding method and decoding method | |
WO2013046616A1 (ja) | Image encoding device, image decoding device, image encoding method, and image decoding method | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 201480019590.0 Country of ref document: CN |
|
ENP | Entry into the national phase |
Ref document number: 2015501964 Country of ref document: JP Kind code of ref document: A |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 14808058 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2014808058 Country of ref document: EP |
|
NENP | Non-entry into the national phase |
Ref country code: DE |