WO2013021656A1 - Playback device, playback method, integrated circuit, broadcast system, and broadcast method - Google Patents
- Publication number: WO2013021656A1 (PCT/JP2012/005088)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords: video, image, broadcast, stream, frame image
Classifications
- H04N5/445 — Receiver circuitry for displaying additional information (analogue transmission standards)
- H04N21/4622 — Retrieving content or additional data from different sources, e.g. from a broadcast channel and the Internet
- H04N21/234318 — Reformatting of video elementary streams by decomposing into objects, e.g. MPEG-4 objects
- H04N21/266 — Channel or content management, e.g. merging a VOD unicast channel into a multicast channel
- H04N21/44012 — Processing of video elementary streams involving rendering scenes according to scene graphs
- H04N21/631 — Multimode transmission, e.g. transmitting basic layers and enhancement layers of the content over different transmission paths
- H04N7/063 — Simultaneous transmission of separate parts of one picture
Definitions
- the present invention relates to a technique for combining a video received by broadcast waves and a video received using a network and displaying them as one video.
- Non-Patent Document 1 discloses a technique ("Hybridcast (registered trademark)") in which a receiver receives a broadcast program transmitted by broadcast waves and content transmitted via a network, synchronizes them, and synthesizes and presents them.
- For example, a receiver receives a broadcast video transmitted by broadcasting and an additional video transmitted via a network, and combines them with the broadcast video on the right and the additional video on the left.
- By displaying the combined result as one video, a video wider than the video transmitted by broadcast waves can be displayed.
- Furthermore, since a receiver that does not support the above-described technology (hereinafter referred to as a "legacy receiver") can still receive and display the video transmitted by broadcast waves, compatibility with legacy receivers is ensured.
- The broadcast video, which is also reproduced by legacy receivers, is often a high-importance video showing a high-priority subject.
- For example, in a broadcast of a soccer game, the broadcast video is a video roughly covering the area around the soccer ball.
- In contrast, the content of the additional video is determined by a fixed relative position to the broadcast video (for example, a video arranged on the left side of the broadcast video), and may therefore be a video with low importance.
- For example, when the broadcast video shows the area around the goal on the viewer's left, the additional video shows only the audience seats to the left of that goal.
- The present invention has been made in view of this problem, and an object thereof is to provide a playback apparatus that can combine the broadcast video and the additional video into one video and display it even when the additional video is an arbitrary video not determined by a fixed relative position to the broadcast video.
- In order to solve the above problem, a playback device according to the present invention is a playback device that plays back an original video, comprising:
- first receiving means for acquiring, from broadcast waves, a broadcast video made up of a group of broadcast images, each broadcast image being obtained by extracting an arbitrary portion of the corresponding frame image in the group of frame images constituting the original video;
- second receiving means for receiving an additional video made up of the group of remaining images other than the extracted portion of each frame image in the frame image group;
- arrangement information acquisition means for acquiring arrangement information indicating the arrangement of the broadcast image and the remaining images in each frame image; and
- reproducing means for generating and displaying the group of frame images by arranging the broadcast image group and the remaining image group based on the arrangement information.
- With this configuration, the playback apparatus can play back the original video using the broadcast video and an additional video that is an arbitrary video not determined by a fixed relative position to the broadcast video.
- Diagram showing an example of video arrangement
- Flowchart showing the procedure of video playback processing by the playback device
- Diagram showing the division of video arrangement setting data
- Diagram showing the division of video arrangement setting data
- Diagram showing a configuration in which playback of a broadcast image and an additional image is shared among several apparatuses
- Diagram showing the structure of a digital stream in the MPEG-2 transport stream format; diagram showing how a video stream is stored in a PES packet sequence
- Diagram showing the reference relationships among pictures of a video stream; diagram showing the hierarchical structure of the video stream; diagram showing the configuration of the video access unit; diagrams explaining the cropping area; diagram showing the data structure of a TS packet; diagram showing the data structure of the PMT; diagram explaining the case of simply combining broadcast video and additional video
- Block diagram showing the structure of a system according to a modification of the present invention; block diagram showing the structure of the playback device
- the content broadcast reception system includes a broadcast system and a playback device.
- The broadcast system transmits content by broadcasting (hereinafter, content transmitted by broadcasting is referred to as "broadcast content"), and transmits content related to the broadcast content (hereinafter referred to as "additional content") by network communication.
- video is used as an example of content. That is, the broadcast content is a broadcast video, and the additional content is an additional video.
- the playback device receives broadcast video and additional video, combines them, and plays them.
- The content broadcast receiving system is designed for compatibility with playback devices (legacy devices) that do not have the function of receiving and playing back the additional video. That is, the broadcast system transmits the broadcast video in a format that legacy devices can receive and play back.
- the subject in FIG. 32 is a soccer field where a soccer game is held.
- Here, the "soccer ball" has the highest importance as a subject, and the "advertisement" has the second highest importance due to sponsor relationships and the like.
- this is only an example, and the present invention is not limited to this.
- FIG. 32A schematically shows a case where the video 3211 is generated by simply combining the additional video 3202 on the left side of the broadcast video 3201.
- Broadcast video 3201 reflects the scenery around the most important soccer ball.
- the video 3211 shows the scenery around the soccer ball with the highest importance and the entire advertisement 3206 as the subject with the second highest importance.
- In contrast, suppose the scenery around the soccer ball, which has the highest importance, is shown as the broadcast video 3203,
- and the landscape on the left side of the broadcast video 3203 is displayed as the additional video 3204. The playback device then combines the broadcast video 3203 and the additional video 3204 to play back a video such as the video 3212.
- The present invention eliminates the inconvenience (for example, that the second most important subject cannot be displayed) caused by the fact that only a video determined by a fixed relative position to the broadcast video can be used as the additional video.
- an ultra-wide video 310 is captured as an original video, and an arbitrary range of video is selected as a broadcast video 311 from the ultra-wide video 310 and transmitted as a broadcast stream.
- Here, the ultra-wide video is a video that is wider and has a higher resolution than a video broadcast using broadcast waves.
- The additional video 320, generated by combining the remaining video 312 and video 313 that are left after the broadcast video 311 is extracted, is transmitted as an additional stream via the network.
- This makes it possible to use, as the additional video, a video other than one determined by a fixed relative position to the broadcast video.
- the playback apparatus side can generate and display an ultra-wide video 310 from the broadcast video 311 and the additional video 320.
- FIG. 1 is a block diagram showing a configuration of the broadcast system 100.
- the broadcast system 100 includes a wide video photographing unit 101, a video separation unit 102, a video arrangement unit 103, a broadcast stream creation unit 104, and an additional stream creation unit 105.
- the broadcast system 100 includes a processor and a memory, and the functions of the wide video photographing unit 101, the video separation unit 102, the video arrangement unit 103, the broadcast stream creation unit 104, and the additional stream creation unit 105 are as follows. This is realized by the processor executing the program stored in the memory.
- The wide video photographing unit 101 is composed of a video camera that shoots an ultra-wide video, wider and with a higher resolution than a video broadcast using broadcast waves, and outputs the captured video data.
- In the present embodiment, it is assumed that the broadcast-wave video is a full HD video of 1920 horizontal × 1080 vertical pixels,
- and that the ultra-wide video has 3820 horizontal × 1080 vertical pixels.
- The video separation unit 102 receives the video data of the ultra-wide video from the wide video photographing unit 101, and has a function (hereinafter referred to as the "video separation function") of separating the ultra-wide video into a video to be transmitted by broadcast waves (hereinafter referred to as the "broadcast video") and the other video.
- the video separation function is realized by separating the frame image into a plurality of images for each frame image constituting the video.
- the video separation function consists of a selection range acquisition function and an image separation function.
- The ultra-wide video 310 shown in FIG. 3 is an image having a width of c pixels and a height of d pixels.
- the video separation unit 102 acquires designation of a range to be transmitted as broadcast video (hereinafter referred to as “selection range”) in the ultra-wide video 310.
- selection range is specified by the user operating an input device (not shown) such as a remote controller.
- the selection range is shown in a rectangular portion surrounded by a wavy line in the ultra-wide video 310.
- The selection range is the range surrounded by a rectangle whose upper left corner has coordinates (a, 0), with width b − a and height d.
- In the following description, coordinates such as (x, y) may be indicated for videos and images; these coordinates indicate the position of a pixel with the upper left corner of the video or image as the origin (0, 0).
- the video separation unit 102 extracts the video 311 within the selected range from the ultra-wide video 310.
- The video 311 is a rectangular video having width b − a and height d.
- the video separation unit 102 obtains the remaining video 312 and the video 313 as a result of extracting the video 311 from the ultra-wide video 310.
- the video separation unit 102 outputs selection range information indicating the selection range to the video placement unit 103.
- the selection range information includes the coordinates of the upper left corner of the selection range in the wide image, the width of the selection range, and the height of the selection range.
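To make the separation concrete, the following sketch (illustrative only; the function name, array shapes, and parameter names are assumptions, not part of the patent) models the video separation function of the video separation unit 102 as array slicing, with the selection range given by its left edge a and right edge b:

```python
import numpy as np

def separate_frame(frame, a, b):
    """Split one ultra-wide frame (height d, width c) into the
    broadcast part (video 311) and the remaining parts (videos 312
    and 313), given the selection range [a, b) along the x axis."""
    broadcast = frame[:, a:b]   # video 311: width b - a
    left = frame[:, :a]         # video 312: width a
    right = frame[:, b:]        # video 313: width c - b
    # selection range information, as output to the video arrangement unit
    selection_info = {"x": a, "y": 0,
                      "width": b - a, "height": frame.shape[0]}
    return broadcast, left, right, selection_info
```

For a 3820-pixel-wide frame with b − a = 1920, the broadcast part has the full-HD width assumed in this embodiment, while the two remaining parts together carry the other 1900 columns.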
- The video arrangement unit 103 has a video arrangement function.
- The video arrangement function consists of an additional video generation function and a video arrangement setting data generation function.
- The additional video generation function is a function of generating, from the video remaining after the video separation unit 102 separates the broadcast video from the ultra-wide video, the video to be transmitted by communication as the additional video.
- the remaining video may be one or more.
- The video arrangement unit 103 generates the video 320 by combining the video 312 and the video 313, which are not transmitted by broadcast waves.
- the image 312 is a rectangular image in which the coordinates of the upper left corner point are (0, 0), the width is a, and the height is d.
- The video 313 is a rectangular video in which the coordinates of the upper left corner point are (b, 0), the width is c − b, and the height is d.
- The additional video 320 is a combination of the video 312 and the video 313, and is a rectangular video having a width of a + (c − b) and a height of d.
- Each frame image constituting the additional video is made rectangular because a video is normally composed of rectangular frame images; the additional video can then be handled in the same manner as ordinary frame images when transmission, compression coding, decoding, and the like are performed on it.
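As a sketch of the additional video generation function (the function name is an assumption; the patent does not prescribe an implementation), the two remaining videos can be joined side by side into one rectangular frame of width a + (c − b):

```python
import numpy as np

def build_additional_frame(left, right):
    """Join the remaining videos 312 (width a) and 313 (width c - b)
    into one rectangular additional frame of width a + (c - b), so
    that it can be transmitted and compression-coded like an
    ordinary rectangular frame image."""
    assert left.shape[0] == right.shape[0], "both parts share height d"
    return np.hstack([left, right])
```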
- The video arrangement setting data generation function is a function that generates video arrangement setting data.
- The video arrangement setting data is information indicating how the broadcast video and the remaining videos should be arranged in order to generate the ultra-wide video.
- The video arrangement unit 103 refers to the selection range information and generates the video arrangement setting data.
- The video arrangement setting data indicates how the broadcast video 311, the remaining video 312, and the video 313 should be arranged in order to generate the ultra-wide video 310.
- FIG. 4 is a diagram showing an example of the video arrangement setting data.
- The video arrangement setting data 400 includes a rectangle ID, stream area information, and composite position information for the broadcast video and for each remaining video generated when the video separation unit 102 separates the video.
- the rectangle ID is an ID for identifying each video generated by the video separation unit 102 separating the video.
- The rectangle ID is given the value "0" for the broadcast video 311, the value "1" for the video 312, and the value "2" for the video 313.
- Stream area information includes stream identification information, x position, y position, width, and height.
- the stream specifying information indicates whether the video specified by the rectangular ID is transmitted by broadcast wave or communication.
- the value of the stream specifying information is “0” when transmitted by broadcast waves, and is “1” when transmitted by communication.
- The x position and the y position indicate the x and y coordinates, within the broadcast video or the additional video, of the video specified by the rectangle ID. Whether the video specified by the rectangle ID is included in the broadcast video or in the additional video can be distinguished by the stream specifying information: when the stream specifying information indicates broadcast waves, it is included in the broadcast video, and when it indicates communication, it is included in the additional video.
- Width and height indicate the width and height of the image specified by the rectangle ID.
- the composite position information consists of an X position and a Y position.
- the X position and the Y position indicate the x coordinate and the y coordinate in the wide image of the image specified by the rectangle ID.
- For the broadcast video 311, the video arrangement unit 103 sets the rectangle ID to "0", sets the stream specifying information to "0" indicating broadcast waves, sets the x position to "0" and the y position to "0", sets the width to "b − a" and the height to "d", sets the X position to "a", which is the x coordinate in the ultra-wide video, and sets the Y position to "0", which is the y coordinate in the ultra-wide video.
- For the video 313, the video arrangement unit 103 sets the rectangle ID to "2", sets the stream specifying information to "1" indicating communication, sets the x position to "a", which is the x coordinate in the additional video, and the y position to "0", which is the y coordinate in the additional video, sets the width to "c − b" and the height to "d", and sets the X position to "b", which is the x coordinate in the ultra-wide video, and the Y position to "0", which is the y coordinate in the ultra-wide video.
- In the above description, the video arrangement setting data is generated for each ultra-wide video; in practice, however, it is generated for each one or more of the frame images constituting the ultra-wide video, which makes it possible to control separation and combination on a per-frame-image basis.
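The fields of FIG. 4 can be modeled as follows (a sketch; the dictionary keys and helper names are assumptions). The entries reproduce the values described above for the broadcast video 311 and the videos 312 and 313:

```python
def arrangement_entry(rect_id, stream, x, y, width, height, X, Y):
    """One row of the video arrangement setting data:
    stream 0 = broadcast waves, 1 = communication;
    (x, y) = position within the source (broadcast or additional) video;
    (X, Y) = composite position within the ultra-wide video."""
    return {"rect_id": rect_id, "stream": stream, "x": x, "y": y,
            "width": width, "height": height, "X": X, "Y": Y}

def example_arrangement(a, b, c, d):
    """Video arrangement setting data for the ultra-wide video 310."""
    return [
        arrangement_entry(0, 0, 0, 0, b - a, d, a, 0),  # broadcast video 311
        arrangement_entry(1, 1, 0, 0, a, d, 0, 0),      # video 312
        arrangement_entry(2, 1, a, 0, c - b, d, b, 0),  # video 313
    ]
```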
- The broadcast stream creation unit 104 has a function (hereinafter referred to as the "broadcast stream creation function") of converting the broadcast video into a format that can be broadcast by broadcast waves and outputting it.
- As the broadcast stream creation function, the broadcast stream creation unit 104 generates a video stream by compression-encoding the broadcast video with a predetermined video codec such as MPEG-2 (Moving Picture Experts Group-2) or MPEG-4 AVC.
- Further, the broadcast stream creation unit 104 creates an audio stream by compression-encoding the audio with a predetermined audio codec such as AC3 (Audio Code number 3) or AAC (Advanced Audio Coding).
- Then, the broadcast stream creation unit 104 generates a single system stream, such as an MPEG-2 TS, by multiplexing the video stream, the audio stream, a caption stream, data broadcasting data, and the like.
- This system stream is referred to as the broadcast stream (121).
- The additional stream creation unit 105 has a function (hereinafter referred to as the "additional stream creation function") of converting the additional video into a format for transmission by communication and outputting it.
- As the additional stream creation function, the additional stream creation unit 105 generates a video stream by compression-encoding the additional video generated by the video arrangement unit 103 with a predetermined video codec such as MPEG-2 or MPEG-4 AVC. It then generates a single system stream, such as an MPEG-2 TS, by multiplexing the video stream, an audio stream, a caption stream, data broadcasting data, and the like.
- the additional stream creation unit 105 stores the video arrangement setting data 123 generated by the video arrangement unit 103 in the additional stream 122.
- For example, the additional stream creation unit 105 stores the video arrangement setting data 123 in the video stream as part of its supplementary data, or as part of a descriptor such as the PMT (Program Map Table).
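One way to carry the video arrangement setting data inside a stream is to serialize it into a compact binary blob, as in this sketch (the count-byte-plus-uint16-fields layout is invented for illustration and is not a format defined by the patent or by MPEG-2):

```python
import struct

FIELDS = ("rect_id", "stream", "x", "y", "width", "height", "X", "Y")

def pack_arrangement(entries):
    """Serialize the entries as a count byte followed by
    eight big-endian uint16 values per entry."""
    blob = struct.pack(">B", len(entries))
    for e in entries:
        blob += struct.pack(">8H", *(e[k] for k in FIELDS))
    return blob

def unpack_arrangement(blob):
    """Inverse of pack_arrangement, e.g. for the playback side."""
    count = blob[0]
    return [dict(zip(FIELDS, struct.unpack_from(">8H", blob, 1 + i * 16)))
            for i in range(count)]
```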
- The broadcast system 100 also includes a transmitter for transmitting the broadcast stream using broadcast waves, similar to that provided in a general digital broadcast transmission system.
- the transmitter includes a modulator, a frequency converter, and an RF transmitter.
- The transmitter transmits the broadcast stream generated by the broadcast stream creation unit 104 to the playback device 200 and the like by broadcast waves.
- the broadcasting system 100 includes a network interface for transmitting and receiving data using a network.
- the network interface transmits the additional stream generated by the additional stream creation unit 105 to the playback device 200 via the network.
- FIG. 6 is a flowchart showing a procedure of video transmission processing by the broadcasting system 100.
- First, the wide video photographing unit 101 photographs an ultra-wide video (for example, the ultra-wide video 310) as the original video (S601).
- the video separation unit 102 acquires the selection range in the captured ultra-wide video by the selection range acquisition function (S602).
- the video separation unit 102 extracts a selection range from each frame image constituting the original video by the video separation function, and generates a broadcast video including the extracted image group (S603).
- the broadcast stream creation unit 104 creates a broadcast stream in which broadcast video is multiplexed by the broadcast stream creation function (S604).
- Next, using the additional video generation function, the video arrangement unit 103 creates an additional image from the remaining images left after the selection range is extracted from each frame image constituting the ultra-wide video, and generates an additional video made up of the generated additional image group (S605).
- The video arrangement unit 103 then generates the video arrangement setting data using the video arrangement setting data generation function (S606).
- the additional stream creation unit 105 creates an additional stream obtained by multiplexing the additional video and the video arrangement setting data using the additional stream creation function (S607).
- the broadcast stream creation unit 104 broadcasts a broadcast stream using a broadcast wave (S608).
- the additional stream creation unit 105 transmits the additional stream to the playback device 200 through communication (S609).
- FIG. 2 is a block diagram showing the configuration of the playback apparatus 200.
- The playback apparatus 200 includes a tuner 201, a broadcast stream decoding unit 202, a first video plane 203, a NIC (network interface card) 211, an additional stream decoding unit 212, a second video plane 213, a video composition method setting unit 214, a video composition unit 221, and a composite video plane 222.
- the playback device 200 includes a processor and a memory, and functions of the tuner 201, the broadcast stream decoding unit 202, the NIC 211, the additional stream decoding unit 212, the video composition method setting unit 214, and the video composition unit 221 are as follows. This is realized by the processor executing a program stored in the memory.
- The tuner 201 is realized by a tuner LSI including a digital tuner, an OFDM (Orthogonal Frequency Division Multiplexing) demodulation unit, an error correction unit, and a demultiplexing unit.
- The tuner 201 has a function of receiving broadcast waves, extracting the signal of a predetermined channel (service ID) designated by the user with a remote controller (not shown), and extracting and outputting TS packets related to video, audio, subtitles, and the like from the extracted signal.
- The broadcast stream decoding unit 202 is realized by an AV signal processing LSI, and has a function of receiving and decoding video TS packets to generate video frames and outputting each frame at the timing indicated by its PTS (Presentation Time Stamp).
- the broadcast stream decoding unit 202 receives a TS packet from the tuner 201 and writes the generated video frame (hereinafter referred to as “broadcast video frame”) to the first video plane 203.
- the first video plane 203 is composed of a frame memory.
- the first video plane 203 stores the broadcast video frame generated by the broadcast stream decoding unit 202.
- plane ID 1 is assigned to the first video plane 203.
- the NIC 211 is composed of a communication LSI and has a function of transmitting and receiving data via a network.
- When the NIC 211 receives an additional stream via the network, it outputs the received additional stream to the additional stream decoding unit 212.
- the additional stream decoding unit 212 is realized by an AV signal processing LSI, and has a function of decoding a video stream included in the additional stream.
- the additional stream decoding unit 212 receives the additional stream from the NIC 211 and decodes the video stream included in the additional stream, thereby generating video frames (hereinafter referred to as "additional video frames"), which it writes to the second video plane 213. Further, the additional stream decoding unit 212 extracts the video arrangement setting data 241 stored in the additional stream.
- the second video plane 213 is composed of a frame memory.
- the second video plane 213 stores the additional video frame generated by the additional stream decoding unit 212.
- the video composition method setting unit 214 has a composition instruction function: referring to the video layout setting data 241, it outputs composition instructions whose parameters are the composition information for synthesizing the broadcast video frame stored in the first video plane 203 and the additional video frame stored in the second video plane 213 into a frame image constituting the ultra-wide video.
- the composition instruction function of the video composition method setting unit 214 will be described using a specific example.
- the video composition method setting unit 214 acquires video arrangement setting data 241 corresponding to the PTS of the broadcast video frame stored in the first video plane 203 and the additional video frame stored in the second video plane 213.
- the video layout setting data 241 has the contents shown in the video layout setting data 400 of FIG.
- the video composition method setting unit 214 counts the number of rectangular IDs (the number of images) in the video layout setting data 400 and outputs it to the video composition unit 221.
- a compositing instruction (referred to as "compositing instruction 1" for convenience) is output.
- a compositing instruction (referred to as "compositing instruction 3" for convenience) is output.
- the video composition method setting unit 214 also reads these.
- to the video compositing unit 221, a compositing instruction with the plane ID, x position, y position, width, height, X position, and Y position as parameters (referred to as "compositing instruction N" for convenience) is output.
- the video composition unit 221 is realized by an AV signal processing LSI.
- the video composition unit 221 has a video composition function that composes the videos according to the compositing instructions 1 to N notified from the video composition method setting unit 214 and generates the ultra-wide video.
- the generation of the ultra-wide video is performed by generating each of the frame images constituting the ultra-wide video.
- the video composition unit 221 outputs the frame images constituting the ultra wide video stored in the composite video plane 222 to a display (not shown) at the PTS timing. Thereby, an ultra-wide video is reproduced.
- the video composition function will be described using a specific example.
- the video composition unit 221 acquires the number of images from the video composition method setting unit 214.
- the number of images matches the number of images that are combined to produce an ultra-wide image.
- the video composition unit 221 initializes the composite video plane 222.
- the video composition unit 221 acquires a composition instruction from the video composition method setting unit 214. Then, from the video plane (the first video plane 203 or the second video plane 213) identified by the plane ID included as a parameter in the compositing instruction, it reads the image of width × height whose upper-left corner coordinates are (x position, y position), and writes this image at the position on the composite video plane 222 whose upper-left corner coordinates are (X position, Y position).
- when the video compositing unit 221 receives the compositing instruction 1 described above, it reads an image having the width b−a and the height d from the position (0, 0) of the first video plane 203, and writes the read image at the position (a, 0) of the composite video plane 222.
- when the video compositing unit 221 receives the compositing instruction 2 described above, it reads an image having the width a and the height d from the position (0, 0) of the second video plane 213, and writes the read image at the position (0, 0) of the composite video plane 222.
- when the video compositing unit 221 receives the compositing instruction 3 described above, it reads an image having the width c−b and the height d from the position (a, 0) of the second video plane 213, and writes the read image at the position (b, 0) of the composite video plane 222.
- an ultra-wide image (for example, the ultra-wide image 310 in FIG. 3) is restored on the composite video plane 222.
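- The sequence of compositing instructions above can be sketched in code. The following is a minimal illustrative model (the function names, toy plane sizes, and pixel labels are assumptions for illustration, not part of the embodiment), in which compositing instructions 1 to 3 rebuild one ultra-wide frame from the two video planes:

```python
# Illustrative model: planes are 2D lists of pixels; instructions copy
# rectangles from a source plane into the composite video plane.

def blit(src, sx, sy, w, h, dst, dx, dy):
    """Copy a w x h region of src at (sx, sy) into dst at (dx, dy)."""
    for row in range(h):
        for col in range(w):
            dst[dy + row][dx + col] = src[sy + row][sx + col]

def compose(instructions, planes, out_w, out_h):
    """Apply each compositing instruction to rebuild one ultra-wide frame."""
    out = [[None] * out_w for _ in range(out_h)]
    for ins in instructions:
        blit(planes[ins["plane_id"]], ins["x"], ins["y"], ins["w"], ins["h"],
             out, ins["X"], ins["Y"])
    return out

# Toy coordinates a < b < c and height d, mirroring the example above.
a, b, c, d = 2, 4, 6, 1
plane1 = [["B0", "B1"]]              # first video plane: broadcast frame, width b - a
plane2 = [["L0", "L1", "R0", "R1"]]  # second video plane: additional frame
instructions = [
    {"plane_id": 1, "x": 0, "y": 0, "w": b - a, "h": d, "X": a, "Y": 0},  # instruction 1
    {"plane_id": 2, "x": 0, "y": 0, "w": a,     "h": d, "X": 0, "Y": 0},  # instruction 2
    {"plane_id": 2, "x": a, "y": 0, "w": c - b, "h": d, "X": b, "Y": 0},  # instruction 3
]
frame = compose(instructions, {1: plane1, 2: plane2}, c, d)
```

Here plane 1 holds the broadcast frame and plane 2 the additional frame, and the composed row places the left strip, the broadcast portion, and the right strip side by side.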
- the composite video plane 222 is composed of a frame memory.
- the composite video plane 222 stores the wide video frame output from the video composition unit 221.
- FIG. 8 is a flowchart showing a procedure of video playback processing by the playback device 200.
- the playback apparatus 200 first receives a broadcast stream by the tuner 201 (S801). Then, the NIC 211 receives the additional stream (S802).
- the broadcast stream decoding unit 202 generates a broadcast image constituting the broadcast video by decoding the broadcast stream (S803). Further, the additional stream decoding unit 212 generates an additional image constituting the additional video by decoding the additional stream (S804). Further, the additional stream decoding unit 212 extracts video arrangement setting data from the additional stream (S805).
- Next, the video composition method setting unit 214 outputs composition instructions to the video composition unit 221 based on the video arrangement setting data, using the composition instruction function.
- the video composition unit 221 extracts a plurality of small images (hereinafter referred to as “partial images”) from the additional images based on the composition instruction (S806).
- the video composition unit 221 generates a frame image constituting the ultra-wide video, which is the original video, by synthesizing the broadcast image and the plurality of partial images based on the composition instructions, and stores the frame image in the composite video plane 222 (S807).
- the video composition unit 221 reproduces the original video by displaying each frame image stored in the composite video plane 222 according to the PTS (S808).
- the additional stream creation unit 105 stores the video layout setting data (123) in a descriptor such as a PMT, but it is sufficient if it can be transmitted to the playback device 200.
- the video layout setting data may be stored in a SIT (Service Information Table).
- the video arrangement setting data may be stored in supplemental data of each frame in the video stream.
- the video arrangement setting data may be stored only in the first access unit of the GOP (Group of Pictures), and may be valid in the GOP.
- alternatively, a section in which the video layout setting data is valid may be defined in the video stream, and the data may be stored together with time information such as the PTS of the start time and the PTS of the end time of the valid section.
- a PID may be assigned to the video arrangement setting data and multiplexed into a stream.
- the video layout setting data may be stored in the broadcast stream, not in the additional stream.
- one video arrangement setting data may be generated for each frame group instead of for each frame.
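- The "valid section" variation above can be sketched as a simple lookup: layout data stored once with a start PTS and an end PTS is applied to every frame whose PTS falls inside that interval (the function name and timestamp values are illustrative assumptions):

```python
# Illustrative lookup of video layout setting data by a PTS validity interval.

def lookup_layout(sections, pts):
    """Return the layout data whose [start_pts, end_pts] interval covers pts."""
    for start_pts, end_pts, layout in sections:
        if start_pts <= pts <= end_pts:
            return layout
    return None  # no layout data is valid at this PTS

sections = [
    (0,      89999,  "layout-A"),   # valid for the first second (90 kHz clock)
    (90000,  179999, "layout-B"),   # valid for the next second
]
assert lookup_layout(sections, 45000) == "layout-A"
```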
- in the above-described embodiment, the selection range for the ultra-wide video 310 is specified by the user operating an input device such as a remote controller. However, any method suffices as long as the selection range can be specified.
- the user touches the touch panel on which the ultra-wide video 310 is displayed so as to divide the area (for example, draw a rectangle), and the video separation unit 102 acquires information indicating the touched range. Then, the area may be determined as the selection range.
- an important subject or the like in the ultra-wide video 310 may be detected by image recognition processing or the like, and an area within a predetermined range centered on the subject may be set as the selection range.
- for example, image processing may be performed on an ultra-wide video 310 showing a soccer field on which a game is being played, the position of the soccer ball may be recognized, and an area centered on the soccer ball may automatically be set as the selection range.
- the video to be transmitted using the broadcast stream and the additional stream is an ultra-wide video.
- however, the video is not limited to this; any other video of higher quality than the broadcast video may be transmitted.
- as an example of video of higher quality than the broadcast video, a high-resolution video may be transmitted.
- FIG. 9 is a diagram for explaining an example in which an original video with high resolution is transmitted using a broadcast stream and an additional stream.
- An image 901 in FIG. 9 represents one frame image in a group of frame images constituting the original video.
- the video separation unit 102 of the broadcast system 100 divides the image 901 into three images 902 to 904.
- An image 902 is a broadcast image.
- the broadcast stream creation unit 104 multiplexes the broadcast image (image 902, etc.) as a part of the broadcast stream, and transmits this broadcast stream by a broadcast wave.
- the video layout unit 103 combines the images 903 and 904 vertically to generate an image 911 as an additional image constituting the additional video.
- the video arrangement unit 103 generates video arrangement setting data indicating the arrangement of the image 902 to the image 904 in the image 901.
- the additional stream creation unit 105 multiplexes the additional image (image 911 or the like) as a part of the additional stream. Further, the additional stream creation unit 105 multiplexes the video arrangement setting data as a part of the additional stream.
- the additional stream creation unit 105 transmits the additional stream by network communication.
- FIG. 9B is a diagram showing video layout setting data 900 related to the image 901.
- the details of the video layout setting data 900 are the same as those of the video layout setting data 400 described above, and only the values are different.
- the playback device 200 on the receiving side receives the broadcast stream and the additional stream, and restores the high-resolution image 901 by combining the image 902, the image 903, and the image 904 based on the video arrangement setting data included in the additional stream.
- the playback device 200 restores the other frame images in the same manner as the image 901, thereby restoring the frame image group and playing back the original video.
- a video obtained by widening the broadcast video may also be transmitted as the original video using a broadcast stream and an additional stream.
- FIG. 10 is a diagram for explaining an example in which an original video is transmitted using a broadcast stream and an additional stream.
- An image 1001 in FIG. 10 represents one frame image in a group of frame images constituting the original video.
- the video separation unit 102 of the broadcast system 100 divides an image 1001 that is a frame image constituting the original video into three images 1002 to 1004.
- An image 1002 is a broadcast image.
- the other frame images constituting the original video are also divided as in the case of the image 1001.
- the broadcast stream creation unit 104 multiplexes the broadcast image (the image 1002, etc.) as a part of the broadcast stream, and transmits this broadcast stream by a broadcast wave.
- however, since the image 1004 is vertically long, simply combining the images 1003 and 1004 does not yield a rectangular image (such as the image 911).
- therefore, an image 1015 is generated by combining the image 1014, obtained by rotating the image 1004 by 270 degrees clockwise, with the lower end of the image 1003.
- the remaining area is filled with an empty image 1006 (for example, a solid black image).
- the video layout unit 103 generates video layout setting data indicating the layout of the image 1002 to the image 1004 in the image 1001.
- the additional stream creation unit 105 multiplexes the image 1015 as a part of the additional stream. Further, the additional stream creation unit 105 multiplexes the video arrangement setting data as a part of the additional stream.
- the additional stream creation unit 105 transmits the additional stream by network communication.
- FIG. 10B is a diagram showing video layout setting data 1000 related to the image 1001.
- the details of the video layout setting data 1000 are the same as those of the video layout setting data 400 described above, but an item “rotation angle” for designating the rotation angle for each image is added.
- the playback device 200 on the receiving side receives the broadcast stream and the additional stream, and combines the images 1002, 1003, and 1014 based on the video arrangement setting data included in the additional stream, and restores the image 1001.
- at this time, the image 1014 is used after its orientation is restored by rotating it clockwise by the 90 degrees designated as the "rotation angle".
- the image 1014 after the orientation change is the image 1004.
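- The rotation handling above can be sketched with a toy pixel-grid model (the function name and pixel values are illustrative assumptions): the sending side rotates the vertically long image 270 degrees clockwise before packing, and the playback side restores it with the 90-degree clockwise rotation designated by the "rotation angle":

```python
# Illustrative rotation of a 2D pixel grid in 90-degree clockwise steps.

def rotate_cw(img, degrees):
    """Rotate a 2D pixel grid clockwise by a multiple of 90 degrees."""
    for _ in range((degrees // 90) % 4):
        # one 90-degree clockwise step: reverse the rows, then transpose
        img = [list(row) for row in zip(*img[::-1])]
    return img

tall = [[1], [2], [3]]            # a vertically long image (3 x 1)
packed = rotate_cw(tall, 270)     # what the broadcast side stores in the additional image
restored = rotate_cw(packed, 90)  # what the playback side does on restoration
```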
- the video layout setting data may also be extended so that a horizontal magnification and a vertical magnification for enlarging or reducing each image can be designated.
- FIG. 11 is a diagram for explaining a case where enlargement / reduction is performed when an additional image is generated by combining a plurality of images.
- An image 1101 in FIG. 11 represents one frame image in a group of frame images constituting the original video.
- the video separation unit 102 of the broadcasting system 100 divides the image 1101 into three images 1102 to 1104.
- An image 1102 is a broadcast image.
- the other frame images constituting the original video are also divided in the same manner as the image 1101.
- the broadcast stream creation unit 104 multiplexes broadcast images (images 1102, etc.) as a part of the broadcast stream, and transmits this broadcast stream by broadcast waves.
- the image 1103 is halved in the horizontal direction (image 1113) and the image 1104 is halved in the vertical direction (image 1114), and these are then combined to generate an additional image 1115.
- the video layout unit 103 generates video layout setting data indicating the layout of the image 1102 to the image 1104 in the image 1101.
- the additional stream creation unit 105 multiplexes the image 1115 as a part of the additional stream. Further, the additional stream creation unit 105 multiplexes the video arrangement setting data as a part of the additional stream.
- the additional stream creation unit 105 transmits the additional stream by network communication.
- FIG. 11B is a diagram showing video layout setting data 1100 related to the image 1101.
- the details of the video layout setting data 1100 are the same as those of the video layout setting data 400 described above, but items for specifying the horizontal and vertical magnifications for each image are added.
- the playback device 200 on the receiving side receives the broadcast stream and the additional stream, and combines the image 1102, the image 1113, and the image 1114 based on the video arrangement setting data included in the additional stream to restore the image 1101.
- at this time, the image 1113 is used after being enlarged by a factor of two in the horizontal direction (restoring the image 1103), and the image 1114 after being enlarged by a factor of two in the vertical direction (restoring the image 1104).
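- The magnification handling above can be sketched with nearest-neighbour scaling (an illustrative assumption; an actual implementation would use filtered scaling): the sending side halves the image in one direction and the playback side doubles it back:

```python
# Illustrative nearest-neighbour scaling of a 2D pixel grid.

def scale(img, h_mag, v_mag):
    """Scale a 2D grid by the given horizontal/vertical magnifications."""
    src_h, src_w = len(img), len(img[0])
    dst_h, dst_w = int(src_h * v_mag), int(src_w * h_mag)
    return [[img[int(r / v_mag)][int(c / h_mag)] for c in range(dst_w)]
            for r in range(dst_h)]

original = [[1, 1, 2, 2]]
halved = scale(original, 0.5, 1)  # sender: horizontal magnification 1/2
restored = scale(halved, 2, 1)    # receiver: horizontal magnification 2
```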
- when a broadcast video smaller than the wide video is displayed on a display capable of displaying the wide video, a monochrome image such as black may be displayed in the area of the display other than the area where the broadcast video is displayed.
- a predetermined background image may be displayed.
- the broadcast video may be enlarged and displayed on the display.
- margin images 1232 and/or 1233 may be provided at the left end portion of the image 1222 and/or the right end portion of the image 1223, and adjusted so that no macroblock straddles the boundary (1231).
- the broadcast stream and the additional stream are generated as independent streams and transmitted through separate paths. However, it is sufficient if the broadcast video and the additional video can be transmitted.
- the broadcast stream and the additional stream may be generated as subordinate streams and transmitted via one route.
- compression encoding using inter-view reference, such as MPEG-4 MVC (Multiview Video Coding), an extension of MPEG-4 AVC/H.264, may also be used.
- FIG. 13 is a diagram for explaining an encoding system based on MPEG-4 MVC.
- a stream conforming to the MPEG-4 MVC standard consists of a "basic view video stream", which is compatible with playback on conventional equipment, and an "extended view video stream", which enables playback of video from a different viewpoint when processed together with the basic view.
- a basic view video stream is generated from a broadcast stream, and an extended view video stream is generated from an additional stream.
- the basic view video stream is a frame (picture) group constituting a video compressed using inter-picture predictive coding using only temporal redundancy.
- an extended view video stream compresses a frame group constituting a video by using inter-view predictive coding using inter-view redundancy in addition to inter-picture predictive coding using temporal redundancy. It is a thing.
- the pictures of the extended view video stream are compressed with reference to the pictures at the same display time (PTS) in the basic view video stream.
- the first P picture (P0) of the extended view video stream refers to the I picture (I0) of the basic view video stream.
- the B picture (B1) of the extended view video stream refers to the Br picture (Br1) of the basic view video stream.
- the second P picture (P3) of the extended view video stream refers to the P picture (P3) of the basic view video stream.
- since the basic view video stream does not refer to the extended view video stream, it can be played back alone.
- in stereoscopic video, the correlation between the left-eye video and the right-eye video is large. Therefore, when the left-eye video is multiplexed into the basic view video stream and the right-eye video into the extended view video stream, inter-view predictive encoding between the viewpoints can greatly reduce the data amount of the extended view video stream relative to that of the basic view video stream.
- the video (image) pairs to be compressed using the inter-view reference usually have the same resolution, but in this modification, as shown in FIG. 14, the broadcast video 1402 is encoded as the basic view video stream and the ultra-wide video 1401 as the extended view video stream, generating an MPEG-4 MVC stream (hereinafter referred to as "MPEG-4 MVC broadcast stream").
- for the portion of the ultra-wide video 1401 equivalent to the broadcast video 1402 (the portion 1403), the basic view video stream is referred to in its entirety. Therefore, the amount of data obtained by compressing and encoding this portion 1403 is very small.
- the broadcast system 100 transmits an MPEG-4 MVC broadcast stream to the playback device 200 by broadcasting.
- the playback device 200 receives an MPEG-4 MVC broadcast stream by broadcasting. Then, the playback device 200 generates and displays the ultra-wide video 1401 by decoding the extended view video stream while referring to the basic view video stream in the MPEG-4 MVC broadcast stream.
- when a broadcast video including the video 1501 is compressed and encoded as the basic view video stream, the video 1501 may be enlarged to, for example, twice its area to generate an enlarged video 1502.
- the extended view video stream may then be generated by compressing and encoding the ultra-wide video 1503 with inter-view reference to the enlarged video 1502.
- with this arrangement, a resolution other than that of the broadcast image can be selected as the resolution of the video transmitted by the extended view video stream.
- a multi-angle service, which allows the user to select and play a desired video from a plurality of videos of different viewpoints, may also be provided.
- FIG. 16 is a block diagram showing a configuration of a system according to this modification.
- the system according to this modification includes a broadcast system 1601 and a playback device 1602.
- the broadcast stream is transmitted from the broadcast system 1601 to the playback device 1602 by broadcasting, and the additional stream is transmitted from the broadcast system 1601 to the playback device 1602 via network communication.
- the broadcast system 1601 includes a broadcast video photographing unit 1611, a broadcast stream creation unit 104, a different viewpoint video photographing unit 1612, an additional stream creation unit 105, a different viewpoint video photographing unit 1613, an additional stream creation unit 1614, a transmission additional stream selection unit 1615, and a switch 1616.
- the broadcast video shooting unit 1611 includes a video camera, and outputs the shot video (broadcast wave video) to the broadcast stream creation unit 104.
- the broadcast stream creation unit 104 is the same as that already described with reference to FIG.
- the different viewpoint video photographing unit 1612 is composed of a video camera, and outputs the photographed video (different viewpoint video 1) to the additional stream creation unit 105.
- the additional stream creation unit 105 has the same configuration as that already described with reference to FIG. 1 and generates the additional stream 122 by compressing and encoding the different viewpoint video 1.
- the additional stream creation unit 105 also stores the different viewpoint video information 1631 in the additional stream instead of the video layout setting data 123.
- the different viewpoint video photographing unit 1613 is composed of a video camera, and outputs the photographed video (different viewpoint video 2) to the additional stream creation unit 1614.
- the additional stream creation unit 1614 has the same configuration as the additional stream creation unit 105 described above, and generates the additional stream 124 by compressing and encoding the different viewpoint video 2.
- the additional stream creation unit 1614 also stores the different viewpoint video information 1631 in the additional stream.
- the transmission additional stream selection unit 1615 transmits, of the additional stream 122 and the additional stream 124, the one requested by the playback device 1602 to the playback device 1602.
- the request from the playback device 1602 is made using, for example, a URL format.
- a separate URL is assigned in advance to each of the additional stream 122 and the additional stream 124, and the playback device 1602 requests access to the URL corresponding to the additional stream whose transmission is desired.
- the transmission additional stream selection unit 1615 receives this access request, and switches the switch 1616 to output an additional stream corresponding to the URL requested to be accessed by the playback device 1602. Then, the transmission additional stream selection unit 1615 transmits an additional stream corresponding to the URL requested to be accessed by the playback device 1602 to the playback device 1602.
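- The URL-based selection above can be sketched as a simple table lookup (the URLs here are hypothetical examples; the embodiment only specifies that each additional stream is assigned a separate URL in advance):

```python
# Illustrative URL -> additional-stream table on the broadcast-system side.
STREAM_URLS = {
    "http://example.com/angle/122": "additional stream 122",
    "http://example.com/angle/124": "additional stream 124",
}

def select_additional_stream(requested_url):
    """Return the stream matching the requested URL, like switching the
    switch 1616 to the corresponding input; None if no such stream."""
    return STREAM_URLS.get(requested_url)
```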
- the transmission additional stream selection unit 1615 creates different viewpoint video information.
- FIG. 17B is a diagram showing an example of different viewpoint video information (different viewpoint video information 1631 and 1651).
- the different viewpoint video information is table information that pairs an "angle ID", which is a unique ID for identifying the viewpoint (angle), with a "stream type", which describes a method for identifying and accessing the stream, and a "content", which describes the content of the angle video.
- the switch 1616 is a switch that switches between the additional stream 122 and the additional stream 124 to be transmitted based on the control by the transmission additional stream selection unit 1615.
- the playback device 1602 includes a tuner 201, a broadcast stream decoding unit 202, a first video plane 203, a NIC 211, an additional stream decoding unit 212, a second video plane 213, a video synthesis unit 221, a synthesized video plane 222, and a reception additional stream selection unit 1652.
- the tuner 201, the broadcast stream decoding unit 202, the first video plane 203, the NIC 211, the second video plane 213, the video synthesis unit 221, and the synthesized video plane 222 are the same as those already described with reference to FIG.
- the additional stream decoding unit 212 is basically the same as that already described with reference to FIG. 2, but differs in that it extracts the different viewpoint video information 1651 included in the additional stream and passes it to the reception additional stream selection unit 1652.
- the different viewpoint video information 1651 is the same as the different viewpoint video information 1631.
- the reception additional stream selection unit 1652 receives the different viewpoint video information from the additional stream decoding unit 212 and prompts the user to select one of a plurality of different viewpoint videos by displaying a menu or the like. It then requests the broadcast system 1601, via the network, to transmit the additional stream storing the different viewpoint video selected by the user through remote control input or the like. In response to this request, the NIC 211 receives from the broadcast system 1601 the additional stream in which the selected different viewpoint video is stored.
- specifically, the reception additional stream selection unit 1652 displays the menu 1701 of FIG. 17, in which the "content" items in the different viewpoint video information 1651 are displayed in abbreviated form.
- the user selects one of the “contents” displayed in the menu using the remote controller or the like.
- the received additional stream selection unit 1652 refers to the different viewpoint video information 1651 and reads the “angle ID” corresponding to the selected “content” and the “stream type” corresponding to the angle ID. Then, the received additional stream selection unit 1652 requests the broadcast system 1601 to transmit the stream related to the read stream type using the URL described in the stream type.
- in this way, the user can select and play a favorite video from among the plurality of additional streams of different viewpoint videos.
- the different viewpoint video information 1631 (1651) may be stored in the broadcast stream instead of the additional stream, or may be configured as a metadata file independently of the stream.
- the different viewpoint video information 1631 (1651) may be configured to individually set information (PID as an example) used to acquire video and audio as shown in FIG.
- the stream is identified using the PID, but it is sufficient if the stream can be identified.
- it may be specified using a tag such as a reference destination PID of the hierarchical transmission descriptor.
- alternatively, reduced videos of a plurality of different viewpoint videos (hereinafter referred to as "different viewpoint reduced videos") may be displayed on one screen, and the user may select a desired different viewpoint video from among them.
- each viewpoint video is identified by an angle ID.
- an additional stream storing “index video” as shown as video 1901 in FIG. 19A and “correspondence information between display area and angle ID” is used.
- the "correspondence relationship information between display area and angle ID" is information indicating the correspondence between the display area in which each different viewpoint reduced video is displayed in the index video (the video 1901 as an example) and the angle ID of the displayed different viewpoint reduced video.
- the image 1902 in FIG. 19 visually represents the content of the correspondence information, that is, the correspondence between the four different viewpoint reduced videos displayed in the video 1901 and their angle IDs.
- in the area divided into four vertically and horizontally, the angle ID of the different viewpoint reduced video in the upper left area is 0 and the angle ID of the different viewpoint reduced video in the upper right area is 1.
- the angle ID of the different viewpoint reduced video in the lower left area is 2, and the angle ID of the different viewpoint reduced video in the lower right area is 3.
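- The correspondence between display area and angle ID described above can be sketched as a function mapping a position in the index video to the angle ID of its quadrant (the function name and screen dimensions are illustrative assumptions):

```python
# Illustrative mapping from a position in the index video to an angle ID:
# upper-left -> 0, upper-right -> 1, lower-left -> 2, lower-right -> 3.

def angle_id_at(x, y, width, height):
    """Return the angle ID of the quadrant containing (x, y)."""
    col = 0 if x < width // 2 else 1
    row = 0 if y < height // 2 else 1
    return row * 2 + col

W, H = 1920, 1080  # illustrative index-video dimensions
assert angle_id_at(10, 10, W, H) == 0  # a touch near the upper-left corner
```

Such a mapping would let a touch or cursor position on the index video be converted directly into the angle ID used to look up the stream type.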
- the playback device 1602 first receives the additional stream including the index video, decodes it, and displays it on the display. Then, the user is made to select one of the different viewpoint reduced videos (videos 1911 to 1914 in FIG. 19 as an example) included in the index video.
- the playback device 1602 displays the different viewpoint video associated with the different viewpoint reduced video selected by the user.
- for example, a menu image in which the portion other than the character display portion of the image 1902 is made translucent may be superimposed on the video 1901 so that the angle IDs 0 to 3 can be selected with a remote controller or the like.
- when the playback device 1602 is a device having a touch panel, such as a tablet terminal, an angle selection index image may be displayed on the touch panel, and the user may select one of the plurality of displayed different viewpoint reduced videos by touching it.
- the tablet terminal refers to the stream type associated with the angle ID for identifying the different viewpoint reduced video in the different viewpoint video information, and receives the additional stream represented by the stream type by communication from the broadcasting system. And play it.
- alternatively, a separate playback device may receive and decode the additional stream, convert it into a format that the tablet terminal can display, and transfer it to the tablet terminal using a wireless LAN or the like for display.
- (10) When switching to another viewpoint video, it is preferable that the video can be switched seamlessly, without interruption.
- three configuration examples for seamless switching will be described with reference to FIG.
- FIG. 20 shows an example of a detailed configuration of the additional stream decoding unit.
- FIG. 20A shows a configuration similar to a decoder model of a video stream in the MPEG-2 system.
- the PID filter 2003 refers to the PID of each TS packet, filters the TS packets, and outputs them to the TB (transport buffer) 2004.
- the TS header is removed from each TS packet input to TB 2004, and the packet is transferred to the MB (main buffer) 2005 at a predetermined rate.
- the predetermined rate indicates the amount of data that must be retrieved from the TB 2004 per unit time.
- packetized video ES data (video PES data) is accumulated in MB 2005.
- in the transfer from MB 2005 to EB 2006, system stream header information such as the TS packet header and the PES header is removed, so that only the video elementary stream is stored in EB 2006.
- Dec (decoder) 2007 has a function of decoding a video elementary stream, and data is transmitted from EB 2006 to Dec 2007 at a timing of DTS (Decoding Time Stamp).
- the picture decoded by Dec 2007 is stored in the DPB (decoded picture buffer) 2008 and output to the Plane (display plane) 2009 at the PTS timing.
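- The buffer chain above (TB → MB → EB) can be sketched as successive header removals (the header sizes below are illustrative placeholders, not the exact values for every packet configuration):

```python
# Illustrative header stripping along the TB -> MB -> EB chain.

TS_HEADER = 4    # bytes of TS header removed on the TB -> MB transfer
PES_HEADER = 9   # bytes of PES header removed on the MB -> EB transfer (illustrative)

def tb_to_mb(ts_packet):
    """Strip the TS header; what remains is packetized video ES (PES) data."""
    return ts_packet[TS_HEADER:]

def mb_to_eb(pes_data):
    """Strip the PES header; what remains is video elementary-stream data."""
    return pes_data[PES_HEADER:]

ts_packet = bytes(TS_HEADER) + bytes(PES_HEADER) + b"ES-bytes"
es = mb_to_eb(tb_to_mb(ts_packet))  # only elementary-stream bytes reach EB
```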
- in the configuration of FIG. 20A, the input switching between the additional stream #1 (hereinafter TS1) and the additional stream #2 (hereinafter TS2) must be performed by switching the input to the PID filter 2003.
- FIG. 20B is a diagram showing a configuration having a plurality of video stream decoder models.
- FIG. 20C is a diagram illustrating a configuration in which a plurality of buffers and a single core portion (configuration after Dec 2042) are provided in the video stream decoder model.
- Since the DTS and PTS timings of TS1 and TS2 do not coincide, the configuration keeps displaying the DPB picture of TS1 after switching at the DTS of TS2 until the PTS timing of TS2 arrives. That is, SW1 (2041) is switched at the DTS timing, and SW2 (2043) is switched at the PTS timing. Thereby, seamless switching can be realized.
- Until then, the picture stored in the DPB 2008 of TS1 at the previous PTS timing may be displayed.
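The two-switch timing model above (SW1 flips at the DTS, SW2 flips at the PTS, with TS1's DPB picture shown in between) can be sketched as a simple decision function. This is an illustrative Python sketch only; the stream names and timestamp values are chosen for the example.

```python
def seamless_switch(events, switch_dts, switch_pts):
    """For each timestamp, decide which stream feeds the decoder (SW1)
    and which feeds the display plane (SW2): SW1 flips to TS2 at its DTS,
    SW2 flips only when TS2's PTS arrives."""
    out = []
    for t in events:
        decode_src = "TS2" if t >= switch_dts else "TS1"
        display_src = "TS2" if t >= switch_pts else "TS1"  # TS1's DPB picture keeps showing
        out.append((t, decode_src, display_src))
    return out

timeline = seamless_switch([0, 10, 20, 30], switch_dts=10, switch_pts=30)
# Between the DTS and the PTS, TS2 is being decoded while TS1 is still displayed.
assert timeline[1] == (10, "TS2", "TS1")
assert timeline[3] == (30, "TS2", "TS2")
```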
- the wide video may be divided into two, the right half video may be displayed on one television, and the left half video may be displayed on another television.
- the playback apparatus 200 includes a video output unit 2101 at the subsequent stage of the composite video plane 222 as shown in FIG.
- the video output unit 2101 divides the wide video so that it can be displayed using the above-described two televisions, outputs the right half video from output 1, and outputs the left half video from output 2.
- It is assumed that the video output unit 2101 has acquired in advance information about the televisions used when displaying the wide video separately on two televisions.
- Alternatively, the video arrangement unit 103 of the broadcast system 100 may divide the wide video, include in the video arrangement setting data 123 display instruction information for displaying it on a plurality of displays on the playback device 200 side, and transmit the data to the playback device 200.
- The display instruction information indicates, for example, that the left half of the wide video restored by the playback device 200 is to be displayed on the left display as seen from the user, and the right half on the right display as seen from the user.
- When the wide video is restored, the playback device 200 refers to the video arrangement setting data and outputs the right half of the wide video from output 1 so that it is displayed on the right display as seen from the user, and the left half from output 2 so that it is displayed on the left display as seen from the user. (12) In the above-described embodiment, both the broadcast stream and the additional stream are received and played back by the playback device 200; however, the reception and playback processing may be shared by a plurality of devices.
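Splitting the restored wide video into two outputs, as the video output unit 2101 does, amounts to halving each frame horizontally. Below is a minimal Python sketch, with frames modeled as lists of pixel rows (an assumption made purely for illustration):

```python
def split_wide_frame(frame):
    """Split one wide frame (rows of pixels) into a left half and a right
    half, to be sent to the two outputs respectively."""
    width = len(frame[0])
    half = width // 2
    left = [row[:half] for row in frame]
    right = [row[half:] for row in frame]
    return left, right

# A 2x8 "frame" of pixel values
frame = [[0, 1, 2, 3, 4, 5, 6, 7],
         [8, 9, 10, 11, 12, 13, 14, 15]]
left, right = split_wide_frame(frame)
assert left == [[0, 1, 2, 3], [8, 9, 10, 11]]
assert right == [[4, 5, 6, 7], [12, 13, 14, 15]]
```

Which half goes to which physical output is then decided by the display instruction information, not by the split itself.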
- The television 1 (corresponding to the playback device 200) receives a broadcast wave and plays the broadcast video, while a mobile terminal receives the additional stream via the network and plays the additional video. The mobile terminal then transmits the additional video to the television 2 via a wireless LAN (Local Area Network), and the television 2 displays the additional video.
- Broadcast video and additional video can be displayed on the television 1 and the television 2.
- To synchronize the images when displaying the video of the broadcast stream and the additional stream on a plurality of televisions, the following methods can be used: (a) the television 1 and the mobile terminal access a time server to align their clocks; (b) information for synchronization is added to the signal of a remote controller (not shown), and the television 1 and the mobile terminal receive it to synchronize their time; or (c) the television 1 and the mobile terminal decode broadcast waves of the same channel and synchronize using the PCR.
- the additional stream is transmitted from the broadcasting system 100 to the playback device 200 via the network.
- the additional stream may be transmitted from the broadcasting system 100 to the playback device 200 by a broadcast wave.
- FIG. 33 is a block diagram showing a configuration of the broadcasting system 100 according to the present modification.
- The additional stream creation unit 105 creates the additional video as the additional stream 122 by converting it into a format that can be broadcast by broadcast waves, rather than transmitted by communication.
- a sending unit (not shown) transmits a stream in which the broadcast stream 121 generated by the broadcast stream generation unit 104 and the additional stream 122 generated by the additional stream generation unit 105 are multiplexed by broadcast waves.
- FIG. 34 is a block diagram showing the configuration of the playback apparatus 200 according to this modification.
- the playback apparatus 200 of FIG. 34 does not have the NIC 211, and the tuner 201 extracts the TS packet related to the broadcast stream 121 and the TS packet related to the additional stream 122 from the received broadcast wave. Then, the tuner 201 outputs TS packets related to the broadcast stream 121 to the broadcast stream decoding unit 202, and outputs TS packets related to the additional stream 122 to the additional stream decoding unit 212.
- the additional stream can also be transmitted from the broadcast system 100 to the playback device 200 by broadcast waves.
- FIG. 35 is a block diagram showing a configuration of a system according to this modification.
- The additional stream creation unit 105 and the additional stream creation unit 1614 convert the different-viewpoint videos into a format that can be broadcast by broadcast waves, rather than transmitted by communication, creating them as the additional streams (122, 124).
- a sending unit (not shown) transmits a stream obtained by multiplexing the broadcast stream 121, the additional stream 122, and the additional stream 124 using broadcast waves.
- the playback device 1602 in FIG. 35 does not have the NIC 211, and the tuner 201 outputs TS packets related to the broadcast stream 121 to the broadcast stream decoding unit 202 from the received broadcast wave.
- The reception additional stream selection unit 1652 requests the tuner 201 to extract the additional stream storing the different-viewpoint video selected by the user through remote control input or the like.
- the tuner 201 extracts a TS packet related to the requested additional stream (122 or 124). Then, the tuner 201 outputs TS packets related to the additional stream to the additional stream decoding unit 212.
- the additional stream can also be transmitted from the broadcast system 100 to the playback device 200 by broadcast waves.
- In the system described above as well, the additional stream may be transmitted from the broadcast system 100 to the playback apparatus 200 by broadcast waves instead of via the network.
- FIG. 36 is a block diagram showing the configuration of the playback apparatus 200 according to this modification.
- the playback apparatus 200 in FIG. 36 does not have the NIC 211 as in the case of the modification (13) described above, and the tuner 201 adds the TS packet related to the broadcast stream 121 from the received broadcast wave. TS packets related to the stream 122 are extracted. Then, the tuner 201 outputs TS packets related to the broadcast stream 121 to the broadcast stream decoding unit 202, and outputs TS packets related to the additional stream 122 to the additional stream decoding unit 212.
- the additional stream can also be transmitted from the broadcast system 100 to the playback device 200 by broadcast waves.
- the broadcast stream and the additional stream are transmitted from the broadcast system to the playback device by one broadcast wave.
- However, the broadcast stream and the additional stream may also be transmitted by a plurality of broadcast waves.
- the broadcast stream and the additional stream may be transmitted using different broadcast waves.
- FIG. 37 is a diagram showing a configuration of a broadcasting system 100 according to this modification.
- FIG. 38 is a diagram showing the configuration of the playback apparatus 200 according to this modification.
- the playback device 200 of FIG. 38 includes a tuner 3801, and receives an additional stream using broadcast waves using the tuner 3801.
- the additional stream can also be transmitted from the broadcast system to the playback device using the broadcast wave.
- The system described in the above modification (14) may also be configured to transmit the broadcast stream and the additional stream using a plurality of broadcast waves, as in the modification (16).
- FIG. 39 is a block diagram showing a configuration of a system according to this modification.
- a stream in which the additional stream 122 and the additional stream 124 are multiplexed is transmitted by broadcast waves.
- the playback device 1602 in FIG. 39 has a tuner 3901 and receives a broadcast wave in which an additional stream is multiplexed.
- the reception additional stream selection unit 1652 requests the tuner 3901 to extract an additional stream in which another viewpoint video selected by a user performing remote control input or the like is stored.
- the tuner 3901 extracts TS packets related to the requested additional stream (122 or 124). Then, the tuner 3901 outputs TS packets related to the additional stream to the additional stream decoding unit 212.
- the broadcast stream and the additional stream can be transmitted by a plurality of broadcast waves.
- The playback apparatus 200 described in the above modification (15) may also receive the broadcast stream and the additional stream using a plurality of broadcast waves.
- FIG. 40 is a block diagram showing the configuration of the playback apparatus 200 according to this modification.
- the playback device 200 of FIG. 40 includes a tuner 3801 as in the case of the above-described modification (16), and receives an additional stream using broadcast waves using the tuner 3801.
- the broadcast stream and the additional stream can be received by a plurality of broadcast waves.
- A control program consisting of machine-language or high-level-language program code to be executed by the processor and the various circuits connected to it can be recorded on recording media or distributed via various communication paths.
- a recording medium includes an IC card, a hard disk, an optical disk, a flexible disk, a ROM, a flash memory, and the like.
- The distributed control program is used by being stored in a memory or the like readable by the processor, and the functions shown in each embodiment are realized by the processor executing the control program.
- The control program may be compiled before execution or executed by an interpreter.
- Each functional component shown in the above embodiments (the wide video photographing unit 101, video separation unit 102, video arrangement unit 103, broadcast stream creation unit 104, and additional stream creation unit 105 of the broadcast system 100; the tuner 201, broadcast stream decoding unit 202, first video plane 203, NIC 211, additional stream decoding unit 212, second video plane 213, video synthesis method setting unit 214, video synthesis unit 221, synthesized video plane 222, and so on of the playback device 200) may be realized as a circuit that executes that function, or realized by one or more processors executing a program.
- Each functional component described above is typically realized as an LSI, which is an integrated circuit. Each may be implemented as an individual chip, or a single chip may include some or all of them.
- the name used here is LSI, but it may also be called IC, system LSI, super LSI, or ultra LSI depending on the degree of integration.
- the method of circuit integration is not limited to LSI's, and implementation using dedicated circuitry or general purpose processors is also possible.
- An FPGA (Field Programmable Gate Array) that can be programmed after LSI manufacture, or a reconfigurable processor in which the connections and settings of circuit cells inside the LSI can be reconfigured, may also be used.
- Furthermore, if integrated circuit technology that replaces LSI emerges from advances in semiconductor technology or other derivative technologies, the functional blocks may naturally be integrated using that technology. Application of biotechnology or the like is also conceivable.
- The MPEG-2 transport stream is a stream conforming to the standard specified in ISO/IEC 13818-1 and ITU-T Recommendation H.222.0.
- FIG. 23 is a diagram showing the structure of a digital stream in the MPEG-2 transport stream format.
- the transport stream is obtained by multiplexing a video stream composed of a video frame sequence, an audio stream composed of an audio frame sequence, a subtitle stream, and the like.
- The video stream is the main video of the program, compression-coded using a method such as MPEG-2 or MPEG-4 AVC.
- the audio stream is obtained by compressing and encoding the main audio and sub audio of a program using a method such as Dolby AC-3, MPEG-2 AAC, MPEG-4 AAC, HE-AAC.
- the subtitle stream is a stream that stores the subtitle information of the program.
- a video stream 2301 composed of a plurality of video frames is converted into a PES packet sequence 2302 and further converted into TS packets 2303.
- an audio stream 2304 including a plurality of audio frames is converted into a PES packet sequence 2305 and further converted into a TS packet 2306.
- the data of the subtitle stream 2307 is converted into a PES packet sequence 2308, and further converted into TS packets 2309.
- the MPEG-2 transport stream 2313 is obtained by multiplexing the TS packet 2303, the TS packet 2306, and the TS packet 2309 as one stream.
- Each stream included in the transport stream can be identified by a stream identification ID called PID.
- By filtering TS packets by PID, the playback apparatus extracts the target stream.
- the correspondence between PID and stream is stored in a descriptor of a PMT packet described later.
- FIG. 24 is a diagram showing how a video stream is stored in a PES packet sequence.
- Stream 2401 indicates a video frame sequence in the video stream.
- the PES packet sequence 2402 indicates a PES packet sequence.
- A plurality of Video Presentation Units (pictures) in the video stream are divided picture by picture and stored in the payloads of PES packets.
- Each PES packet has a PES header, and a PTS (Presentation Time-Stamp) that is a picture display time and a DTS (Decoding Time-Stamp) that is a picture decoding time are stored in the PES header.
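The 33-bit PTS and DTS values are packed into 5-byte fields in the PES header, interleaved with marker bits as defined by MPEG-2 systems. The following Python sketch is illustrative (the encoder is included only to enable a round-trip test) and shows the bit layout:

```python
def parse_pts(b: bytes) -> int:
    """Decode the 33-bit PTS from the 5-byte field of a PES header.
    The value is spread over 3 + 15 + 15 bits, separated by marker bits."""
    return (((b[0] >> 1) & 0x07) << 30) | \
           (b[1] << 22) | \
           (((b[2] >> 1) & 0x7F) << 15) | \
           (b[3] << 7) | \
           ((b[4] >> 1) & 0x7F)

def encode_pts(pts: int) -> bytes:
    """Inverse of parse_pts, for round-trip testing ('0010' prefix, marker bits set)."""
    return bytes([
        0x21 | (((pts >> 30) & 0x07) << 1),
        (pts >> 22) & 0xFF,
        0x01 | (((pts >> 15) & 0x7F) << 1),
        (pts >> 7) & 0xFF,
        0x01 | ((pts & 0x7F) << 1),
    ])

pts = 123456789  # a time value in 90 kHz ticks
assert parse_pts(encode_pts(pts)) == pts
```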
- Video compression coding methods such as MPEG-2, MPEG-4 AVC, and SMPTE VC-1 (Society of Motion Picture and Television Engineers VC-1) exploit redundancy in the spatial and temporal directions of moving images.
- Inter-picture predictive coding is used as a technique that uses temporal redundancy.
- Here, a picture is a unit of encoding and encompasses both frames and fields.
- In inter-picture predictive coding, a picture earlier or later in display-time order than the picture to be coded is used as a reference picture, thereby reducing temporal redundancy. The amount of motion relative to the reference picture is detected, and the difference between the motion-compensated picture and the picture to be coded is compression-coded, thereby reducing spatial redundancy.
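Inter-picture prediction as described above can be illustrated with a toy Python sketch: a block of the picture being coded is predicted by a motion-shifted block of the reference picture, and only the residual would then be compression-coded. Frames are modeled as lists of pixel rows, an assumption made purely for illustration.

```python
def motion_compensate(ref, dx, dy, x, y, size):
    """Fetch a size x size prediction block from the reference picture,
    shifted by the detected motion vector (dx, dy)."""
    return [[ref[y + dy + j][x + dx + i] for i in range(size)]
            for j in range(size)]

def residual(block, pred):
    """Difference between the block to be coded and its motion-compensated
    prediction; this difference is what gets compression-coded."""
    return [[b - p for b, p in zip(br, pr)] for br, pr in zip(block, pred)]

ref = [[4 * r + c for c in range(4)] for r in range(4)]  # 4x4 reference picture
cur_block = [[5, 6], [9, 10]]                            # block of the current picture
pred = motion_compensate(ref, dx=1, dy=0, x=1, y=1, size=2)
assert pred == [[6, 7], [10, 11]]
assert residual(cur_block, pred) == [[-1, -1], [-1, -1]]
```

A good motion vector makes the residual small and cheap to encode, which is the whole point of exploiting temporal redundancy.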
- FIG. 25 is a diagram showing a reference relationship between pictures in a general video stream.
- the arrow in FIG. 25 indicates a reference picture.
- Pictures are classified into I pictures, P pictures, and B pictures according to the encoding method.
- An I picture (such as I0 in the figure) is a picture generated by intra-picture predictive coding using only the current picture, without using a reference picture.
- A P picture (P3, P6, etc. in the figure) is a picture subjected to inter-picture predictive coding with reference to one already processed picture.
- A B picture (B1, B2, B4, B5, etc. in the figure) is a picture subjected to inter-picture predictive coding referring simultaneously to two already processed pictures.
- a B picture that is referred to by another picture is called a Br picture.
- FIG. 26 is a diagram for explaining a hierarchical structure of a video stream.
- the video stream is composed of a plurality of GOPs that are basic units of the encoding process. Editing of moving images and random access are performed in units of this GOP.
- the GOP includes one or more video access units described above.
- the video access unit includes an AU identification code, a sequence header, a picture header, supplementary data, compressed picture data, padding data, a sequence end code, a stream end code, and the like. These data are stored in units of NAL units in the case of MPEG-4 AVC.
- AU identification code is a start code indicating the head of the access unit.
- the sequence header is a header for storing common information in a playback sequence composed of a plurality of video access units. Specifically, information such as resolution, frame rate, aspect ratio, and bit rate is stored in the sequence header.
- the picture header is a header that stores information such as the encoding method of the entire picture.
- Supplementary data is additional information that is not essential for decoding compressed data.
- character information of closed captions to be displayed on the TV in synchronization with the video, GOP structure information, and the like are stored.
- Compressed picture data stores compressed and encoded picture data.
- the padding data stores meaningless data for formatting. For example, it is used as stuffing data for maintaining a predetermined bit rate.
- the sequence end code is data indicating the end of the reproduction sequence.
- the stream end code is data indicating the end of the bit stream.
- the contents of the AU identification code, sequence header, picture header, supplemental data, compressed picture data, padding data, sequence end code, and stream end code differ depending on the video encoding method.
- In the case of MPEG-4 AVC, the AU identification code corresponds to the AU delimiter (Access Unit Delimiter), the sequence header to the SPS (Sequence Parameter Set), the picture header to the PPS (Picture Parameter Set), the supplementary data to SEI (Supplemental Enhancement Information), the compressed picture data to a plurality of slices, the padding data to Filler Data, the sequence end code to End of Sequence, and the stream end code to End of Stream.
- In the case of MPEG-2 video, the sequence header corresponds to sequence_header, sequence_extension, and group_of_pictures_header; the picture header to picture_header and picture_coding_extension; the supplementary data to user_data; and the sequence end code to sequence_end_code. Although no AU identification code exists, the boundary between access units can be detected using the start code of each header.
- The sequence header may be described only in the video access unit at the head of the GOP and omitted from the other video access units.
- Depending on the encoding method, a video access unit may omit its own picture header and refer to the picture header of a preceding video access unit in code order.
- In the video access unit at the head of a GOP, I picture data is stored as the compressed picture data, and the AU identification code, sequence header, picture header, and compressed picture data are always described. Supplementary data, padding data, the sequence end code, and the stream end code are described as necessary.
- Cropping is extracting a part of the area from the encoded frame area. Scaling is to enlarge or reduce the extracted area.
- The area actually to be displayed is specified as a “cropping area” (for example, cropping area 2802) within the encoded frame area 2801. The cropping area 2802 is then up-converted according to scaling information, which indicates the scaling method used when actually displaying on a television or the like, to generate the area (image) 2803 actually used for display.
- In MPEG-4 AVC, the cropping area can be specified using the frame_cropping information stored in the SPS.
- The frame_cropping information specifies the differences between the top, bottom, left, and right lines of the cropping area and those of the encoded frame area as crop amounts for the top, bottom, left, and right.
- Specifically, frame_cropping_flag is set to 1, and the crop amounts are set in frame_crop_top_offset, frame_crop_bottom_offset, frame_crop_left_offset, and frame_crop_right_offset.
- In the case of MPEG-2, the vertical and horizontal sizes of the cropping area (display_horizontal_size and display_vertical_size in sequence_display_extension) and the difference information between the center of the encoded frame area and the center of the cropping area (frame_centre_horizontal_offset and frame_centre_vertical_offset in picture_display_extension) can be specified.
- the scaling information is set as an aspect ratio, for example.
- the playback device uses the aspect ratio information to up-convert and display the cropping area.
- In the case of MPEG-4 AVC, aspect ratio information (aspect_ratio_idc) is stored in the SPS as the scaling information.
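The cropping-then-scaling pipeline above can be sketched in Python. Frames are modeled as lists of pixel rows, and the up-conversion step is reduced to a trivial nearest-neighbour horizontal doubling purely for illustration; real players use proper scalers driven by the aspect ratio information.

```python
def crop_frame(frame, top, bottom, left, right):
    """Apply frame_cropping-style offsets: drop `top`/`bottom` rows and
    `left`/`right` columns from the encoded frame area."""
    height = len(frame)
    return [row[left:len(row) - right] for row in frame[top:height - bottom]]

def upscale2x_horizontal(frame):
    """Trivial nearest-neighbour horizontal up-conversion, standing in for
    the scaling step driven by the aspect ratio information."""
    return [[px for px in row for _ in range(2)] for row in frame]

# 4x4 encoded frame; crop one row from the top and one column from each side
frame = [[c + 4 * r for c in range(4)] for r in range(4)]
cropped = crop_frame(frame, top=1, bottom=0, left=1, right=1)
assert cropped == [[5, 6], [9, 10], [13, 14]]
assert upscale2x_horizontal([[1, 2]]) == [[1, 1, 2, 2]]
```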
- FIG. 30 is a diagram illustrating a data structure of a TS packet.
- the TS packet is a 188-byte fixed-length packet composed of a 4-byte TS header, an adaptation field, and a TS payload.
- the TS header is composed of transport_priority, PID, adaptation_field_control, and the like.
- PID is an ID for identifying a stream multiplexed in a transport stream as described above.
- the transport_priority is information for identifying the type of packet in TS packets with the same PID.
- Adaptation_field_control is information for controlling the configuration of the adaptation field and the TS payload. There are cases where only one of the adaptation field and the TS payload exists or both, and adaptation_field_control indicates the presence / absence thereof.
- When adaptation_field_control is 1, only the TS payload exists; when it is 2, only the adaptation field exists; and when it is 3, both the TS payload and the adaptation field exist.
- the adaptation field is a storage area for storing information such as PCR and stuffing data for making the TS packet a fixed length of 188 bytes.
- a PES packet is divided and stored in the TS payload.
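The 4-byte TS header layout described above can be illustrated with a small Python parser. This is a sketch following the MPEG-2 systems bit layout; the field names mirror the specification, and only a few of the header fields are extracted.

```python
def parse_ts_header(packet: bytes) -> dict:
    """Parse the 4-byte TS header: transport_priority, the 13-bit PID, and
    adaptation_field_control (1 = payload only, 2 = adaptation field only,
    3 = both), plus the continuity counter."""
    assert len(packet) == 188 and packet[0] == 0x47, "not a valid TS packet"
    return {
        "transport_priority": (packet[1] >> 5) & 0x1,
        "pid": ((packet[1] & 0x1F) << 8) | packet[2],
        "adaptation_field_control": (packet[3] >> 4) & 0x3,
        "continuity_counter": packet[3] & 0xF,
    }

pkt = bytes([0x47, 0x41, 0x00, 0x1A]) + bytes(184)
h = parse_ts_header(pkt)
assert h["pid"] == 0x100
assert h["adaptation_field_control"] == 1  # TS payload only
```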
- TS packets included in a transport stream include PAT (Program Association Table), PMT (Program Map Table), PCR (Program Clock Reference), and the like in addition to video, audio, and subtitle streams. These packets are called PSI (Program Specific Information).
- The PAT indicates the PIDs of the PMTs used in the transport stream; the PID of the PAT itself is 0.
- The PMT holds the PIDs of the video, audio, subtitle, and other streams included in the transport stream, attribute information for the stream corresponding to each PID, and various descriptors related to the transport stream.
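The PAT-to-PMT-to-stream lookup just described can be sketched with a simplified in-memory model. This is a Python illustration only; a real receiver parses the PSI sections out of TS packets, which is omitted here, and the PID and stream_type values are example data.

```python
def find_stream_pids(pat, pmts, want_type):
    """Walk PAT -> PMT -> stream entries and collect the PIDs of all
    elementary streams with the given stream_type."""
    pids = []
    for pmt_pid in pat["programs"].values():
        for stream in pmts[pmt_pid]["streams"]:
            if stream["stream_type"] == want_type:
                pids.append(stream["pid"])
    return pids

pat = {"pid": 0, "programs": {1: 0x1000}}          # program 1 -> PMT at PID 0x1000
pmts = {0x1000: {"streams": [
    {"stream_type": 0x1B, "pid": 0x100},           # e.g. H.264 video
    {"stream_type": 0x0F, "pid": 0x110},           # e.g. AAC audio
]}}
assert find_stream_pids(pat, pmts, 0x1B) == [0x100]
```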
- the descriptor includes copy control information for instructing permission / non-permission of copying of the AV stream.
- The PCR holds the STC time corresponding to the timing at which the PCR packet is transferred to the decoder, in order to synchronize the decoder's STC, which is the time axis of the PTS and DTS, with the arrival time of TS packets.
- FIG. 31 is a diagram showing the data structure of the PMT.
- a PMT header describing the length of data included in the PMT is arranged at the head of the PMT. After that, a plurality of descriptors related to the transport stream are arranged.
- the copy control information described above is described as a descriptor.
- a plurality of pieces of stream information regarding each stream included in the transport stream are arranged after the descriptor.
- The stream information includes a stream descriptor describing the stream type for identifying the stream's compression codec, the stream's PID, and stream attribute information (frame rate, aspect ratio, etc.).
<6. Supplement 2>
- Hereinafter, the configuration of the content broadcast receiving system as one embodiment of the present invention, along with its modifications and effects, will be described.
- A playback apparatus according to one aspect is a playback apparatus that plays back an original video, comprising: first receiving means for acquiring, from a broadcast wave, a broadcast video composed of a broadcast image group in which an arbitrary portion is extracted from each frame image of the frame image group constituting the original video; second receiving means for receiving an additional video composed of the remaining image group, i.e., the portions of each frame image other than the extracted portions; arrangement information acquisition means for acquiring arrangement information indicating the arrangement of the broadcast image and the remaining image within each frame image of the frame image group; and reproducing means for generating and displaying the frame image group by arranging the broadcast image group and the remaining image group based on the arrangement information.
- According to this configuration, the original video can be reproduced using the broadcast video and an additional video that is an arbitrary video, rather than a video determined by a fixed relative position with respect to the broadcast video.
- Each of the frame images is divided into the broadcast image and a plurality of partial images, and the remaining image is formed by combining the plurality of partial images. The arrangement information includes information indicating the arrangement position of each of the plurality of partial images in the remaining image, and the reproduction means may generate the frame image by dividing the remaining image into the plurality of partial images with reference to the arrangement information and combining them with the broadcast image.
- Each of the frame images may be composed, in the horizontal direction, of a left partial image (one of the plurality of partial images), the broadcast image, and a right partial image (another of the plurality of partial images). The reproduction means may generate the frame image by referring to the arrangement information, dividing the remaining image into the left partial image and the right partial image, and combining the left partial image with the left end of the broadcast image and the right partial image with the right end of the broadcast image.
- Each of the frame images may be composed, in the vertical direction, of an upper partial image (one of the plurality of partial images), the broadcast image, and a lower partial image (another of the plurality of partial images). The reproduction means may generate the frame image by referring to the arrangement information, dividing the remaining image into the upper partial image and the lower partial image, and combining the upper partial image with the upper end of the broadcast image and the lower partial image with the lower end of the broadcast image.
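The horizontal recombination described above (left partial image + broadcast image + right partial image) can be sketched in Python. Images are modeled as lists of pixel rows, and a simple column split point stands in for the arrangement information; both are assumptions made for illustration.

```python
def split_remaining(remaining, left_width):
    """Split the remaining image back into its left and right partial images,
    using the split point given by the arrangement information."""
    return ([row[:left_width] for row in remaining],
            [row[left_width:] for row in remaining])

def combine_horizontal(left, broadcast, right):
    """Rebuild one frame image by joining the left partial image, the
    broadcast image, and the right partial image side by side."""
    return [l + b + r for l, b, r in zip(left, broadcast, right)]

remaining = [[0, 1, 6, 7], [8, 9, 14, 15]]     # left-part columns then right-part columns
broadcast = [[2, 3, 4, 5], [10, 11, 12, 13]]
left, right = split_remaining(remaining, left_width=2)
frame = combine_horizontal(left, broadcast, right)
assert frame == [[0, 1, 2, 3, 4, 5, 6, 7], [8, 9, 10, 11, 12, 13, 14, 15]]
```

The vertical variant is the same idea with row concatenation instead of column concatenation.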
- a video having a higher resolution than the broadcast image can be used as the original video.
- At least one of the plurality of partial images may be enlarged or reduced when combined with another partial image, and the arrangement information describes the magnification for returning the enlarged or reduced partial image to its original size. When the reproduction means combines the broadcast image and the plurality of partial images, it may combine the enlarged or reduced partial image after restoring it to its original size at the magnification described in the arrangement information.
- At least one of the plurality of partial images may be rotated when combined with another partial image, and the arrangement information describes the rotation angle for returning the rotated partial image to its orientation before rotation. When the reproduction means combines the broadcast image and the plurality of partial images, it may combine the rotated partial image after rotating it by the rotation angle described in the arrangement information.
- the broadcast system may transmit the additional video via a network, and the second receiving unit may receive the additional video via the network.
- According to this configuration, the original video can be received partly via broadcast waves and partly via communication.
- The arrangement information may be generated once for each predetermined group of frame images having the same extracted portion, and when generating the frame images constituting a group, the reproduction means may refer to the arrangement information generated for that group.
- A playback method according to one aspect is a playback method executed by a playback apparatus that plays back an original video, comprising: a first receiving step of acquiring, from a broadcast wave, a broadcast video composed of a broadcast image group in which an arbitrary portion is extracted from each frame image of the frame image group constituting the original video; a second receiving step of receiving an additional video composed of the remaining image group other than the extracted portions; an arrangement information obtaining step of acquiring arrangement information indicating the arrangement of the broadcast image and the remaining image in each frame image of the frame image group; and a reproducing step of generating and displaying the frame image group by arranging the broadcast image group and the remaining image group based on the arrangement information.
- An integrated circuit according to one aspect is an integrated circuit used in a playback apparatus that plays back an original video, comprising: first receiving means for acquiring, from a broadcast wave, a broadcast video composed of a broadcast image group in which an arbitrary portion is extracted from each frame image of the frame image group constituting the original video; second receiving means for receiving an additional video composed of the remaining image group other than the extracted portions; arrangement information acquisition means for acquiring arrangement information indicating the arrangement of the broadcast image and the remaining image in each frame image; and reproducing means for generating and displaying the frame image group by arranging the broadcast image group and the remaining image group based on the arrangement information.
- A broadcasting system according to one aspect is a broadcasting system that transmits an original video, comprising: first transmission means for extracting an arbitrary portion of each frame image in the frame image group constituting the original video and transmitting a broadcast video composed of the extracted image group using a broadcast wave; second transmission means for generating and transmitting an additional video composed of the remaining image group other than the extracted portions of each frame image; and arrangement information transmitting means for generating and transmitting arrangement information indicating the arrangement of the broadcast image and the remaining image in each frame image of the frame image group.
- A broadcasting method according to one aspect is a broadcasting method executed by a broadcasting system that transmits an original video, comprising: a first transmission step of extracting an arbitrary portion of each frame image in the frame image group constituting the original video and transmitting a broadcast video composed of the extracted image group using a broadcast wave; a second transmission step of generating and transmitting an additional video composed of the remaining image group other than the extracted portions; and an arrangement information transmitting step of generating and transmitting arrangement information indicating the arrangement of the broadcast image and the remaining image in each frame image of the frame image group.
- An integrated circuit according to one aspect is an integrated circuit used in a broadcasting system that transmits an original video, comprising: first transmission means for extracting an arbitrary portion of each frame image in the frame image group constituting the original video and transmitting a broadcast video composed of the extracted image group using a broadcast wave; second transmission means for generating and transmitting an additional video composed of the remaining image group other than the extracted portions; and arrangement information transmitting means for generating and transmitting arrangement information indicating the arrangement of the broadcast image and the remaining image in each frame image of the frame image group.
- With this configuration, the original video can be transmitted to the playback device using the broadcast video and an additional video that may be an arbitrary video, rather than a video determined by a fixed relative position with respect to the broadcast video.
- The playback device generates a combined video by combining the broadcast video and the additional video and displays it. Because the additional video need not be determined by a fixed relative position with respect to the broadcast video, an arbitrary video can be used. This is useful for devices that receive, combine, and play back video from both broadcast and communication channels.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Databases & Information Systems (AREA)
- Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
- Television Systems (AREA)
Abstract
Description
Hereinafter, a content broadcast receiving system according to one embodiment of the present invention will be described.
<2. Broadcast System 100>
<2-1. Configuration>
FIG. 1 is a block diagram showing the configuration of the broadcast system 100.
The wide video capturing unit 101 is composed of a video camera that captures ultra-wide video that is wider and of higher resolution than the video broadcast over broadcast waves, and outputs the video data of the captured ultra-wide video to the video separation unit 102.
The video separation unit 102 receives the video data of the ultra-wide video from the wide video capturing unit 101, and has a function (hereinafter, the "video separation function") of separating the ultra-wide video into the video to be transmitted over broadcast waves (hereinafter, the "broadcast video") and other video.
The video separation function is realized by separating each frame image constituting the video into a plurality of images.
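As a concrete illustration of this per-frame separation, the sketch below splits each frame horizontally into a broadcast portion and a joined remainder. The function and key names (`separate_frame`, `left_width`) and the horizontal-only split are illustrative assumptions; the specification allows an arbitrary portion of each frame image to be extracted.

```python
def separate_frame(frame, x, width):
    """Split one ultra-wide frame (a list of pixel rows) into the broadcast
    image and the joined remainder, plus the placement data needed to undo
    the split. Names and the horizontal-only split are illustrative."""
    broadcast = [row[x:x + width] for row in frame]
    remainder = [row[:x] + row[x + width:] for row in frame]  # left + right joined
    placement = {"x": x, "width": width, "left_width": x}
    return broadcast, remainder, placement

# 2-row, 8-pixel-wide toy frame; the broadcast portion is columns 2..5
frame = [list(range(8)), list(range(8, 16))]
b, r, p = separate_frame(frame, 2, 4)
# b[0] == [2, 3, 4, 5], r[0] == [0, 1, 6, 7]
```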
The selection range acquisition function is described below with reference to FIG. 3.
The image separation function is described below with reference to FIG. 3.
The video arrangement unit 103 has a video arrangement function.
The additional video generation function generates, from the video remaining after the video separation unit 102 has separated the broadcast video from the ultra-wide video, the video to be transmitted by communication as the additional video.
The video arrangement setting data generation function generates video arrangement setting data.
The broadcast stream creation unit 104 has a function of converting the broadcast video into a format that can be broadcast over broadcast waves and outputting it (hereinafter, the "broadcast stream creation function").
The additional stream creation unit 105 has a function of converting the additional video into a format for transmission by communication and outputting it (hereinafter, the "additional stream creation function").
Although not shown in FIG. 1, the broadcast system 100 includes a transmission unit, consisting of an OFDM modulation unit, a frequency conversion unit, an RF transmission unit, and the like, for transmitting the broadcast stream using broadcast waves, similar to that of a general digital broadcast transmission system.
FIG. 6 is a flowchart showing the procedure of video transmission processing by the broadcast system 100.
<3-1. Configuration>
FIG. 2 is a block diagram showing the configuration of the playback device 200.
The tuner 201 is implemented as a tuner LSI including a digital tuner, an OFDM (Orthogonal Frequency Division Multiplexing) demodulation unit, an error correction unit, a demultiplexing unit, and the like.
The broadcast stream decoding unit 202 is implemented as an AV signal processing LSI, and has a function of receiving TS packets relating to video, decoding them to generate video frames, and outputting the frames at the timing indicated by the PTS (Presentation Time Stamp).
The first video plane 203 is composed of frame memory. The first video plane 203 stores the broadcast video frames generated by the broadcast stream decoding unit 202.
The NIC 211 is composed of a communication LSI and has a function of transmitting and receiving data via a network.
The additional stream decoding unit 212 is implemented as an AV signal processing LSI and has a function of decoding the video stream contained in the additional stream.
The second video plane 213 is composed of frame memory. The second video plane 213 stores the additional video frames generated by the additional stream decoding unit 212.
The video composition method setting unit 214 has a composition instruction function: referring to the video arrangement setting data 241, it outputs a composition instruction whose parameters are the composition information for combining the broadcast video frame stored in the first video plane 203 with the additional video frame stored in the second video plane 213 to generate a frame image constituting the ultra-wide video.
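The composition performed under this instruction is essentially the inverse of the separation on the transmission side. A minimal sketch, assuming illustrative placement data with a `left_width` key (not a name from the specification):

```python
def compose_frame(broadcast, remainder, placement):
    """Rebuild the ultra-wide frame: split the remainder back into its left
    and right parts using the placement data (key name assumed), then join
    left + broadcast + right row by row."""
    lw = placement["left_width"]
    return [rem[:lw] + bc + rem[lw:]
            for bc, rem in zip(broadcast, remainder)]

b = [[2, 3, 4, 5]]        # broadcast portion of one row
r = [[0, 1, 6, 7]]        # joined remainder of the same row
full = compose_frame(b, r, {"left_width": 2})
# full == [[0, 1, 2, 3, 4, 5, 6, 7]]
```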
The composition instruction function of the video composition method setting unit 214 will be described using a concrete example.
The video composition unit 221 is implemented as an AV signal processing LSI.
The video composition function will be described using a concrete example.
The composite video plane 222 is composed of frame memory. The composite video plane 222 stores the wide video frames output from the video composition unit 221.
FIG. 8 is a flowchart showing the procedure of video playback processing by the playback device 200.
<4. Modifications>
Embodiments of the content broadcast receiving system according to the present invention have been described above, but the illustrated content broadcast receiving system may also be modified as follows, and the present invention is of course not limited to the content broadcast receiving system as described in the above embodiments.
The video arrangement setting data may also be extended so that a horizontal magnification and a vertical magnification for enlarging or reducing each image can be specified.
(8) In the content broadcast receiving system, a service may be realized in which, using the broadcast stream and additional streams, the user can select and play back a desired video from a plurality of different-viewpoint videos (hereinafter, "multi-angle").
As shown in FIG. 16, the broadcast system 1601 includes a broadcast video capturing unit 1611, the broadcast stream creation unit 104, a different-viewpoint video capturing unit 1612, the additional stream creation unit 105, a different-viewpoint video capturing unit 1613, an additional stream creation unit 1614, an outgoing additional stream selection unit 1615, and a switch 1616.
As shown in FIG. 16, the playback device 1602 includes the tuner 201, the broadcast stream decoding unit 202, the first video plane 203, the NIC 211, the additional stream decoding unit 212, the second video plane 213, the video composition unit 221, the composite video plane 222, and a receiving additional stream selection unit 1652.
(10) In the above modification, it is preferable that switching to a different-viewpoint video be seamless, with no interruption of the video. Three configuration examples for seamless switching are described below with reference to FIG. 20.
FIG. 20(a) shows a configuration similar to the video stream decoder model in the MPEG-2 system.
FIG. 20(b) is a diagram showing a configuration having a plurality of video stream decoder models.
FIG. 20(c) is a diagram showing a configuration in which the video stream decoder model has a plurality of buffers but a single core portion (the components from Dec 2042 onward).
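One way to read the FIG. 20(c) arrangement (one buffer per candidate stream, a single decoder core) is sketched below. The class, its method names, and the rule of switching only at a random-access point are assumptions made for illustration; they are not taken from the figure itself.

```python
from collections import deque

class SeamlessSwitcher:
    """Sketch of a FIG. 20(c)-style model: every candidate stream fills its
    own buffer, while one decoder core drains only the active buffer.
    Structure and switching rule are assumed for illustration."""
    def __init__(self, n_streams):
        self.buffers = [deque() for _ in range(n_streams)]
        self.active = 0
        self.pending = None
    def push(self, stream_id, access_unit, is_rap):
        self.buffers[stream_id].append((access_unit, is_rap))
    def select(self, stream_id):
        self.pending = stream_id  # takes effect at the next random-access point
    def decode_next(self):
        if self.pending is not None:
            buf = self.buffers[self.pending]
            while buf and not buf[0][1]:
                buf.popleft()          # discard data before the next RAP
            if buf:                    # switch exactly at a RAP, so no gap
                self.active, self.pending = self.pending, None
        buf = self.buffers[self.active]
        return buf.popleft()[0] if buf else None

s = SeamlessSwitcher(2)
s.push(0, "a0", True); s.push(0, "a1", False)
s.push(1, "b0", False); s.push(1, "b1", True)
out = [s.decode_next()]      # still on stream 0
s.select(1)
out.append(s.decode_next())  # switches at stream 1's RAP
```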
(11) In the above embodiment, a configuration in which the wide video is displayed on a wide-screen television has been described, but any display capable of showing the wide video suffices.
(12) In the above embodiment, both the broadcast stream and the additional stream are received and played back by the playback device 200, but the reception and playback processing may be shared among a plurality of devices.
<5. Supplement 1>
A supplementary description of the above embodiment is given below.
<Stream Multiplexing>
Next, a digital stream in the MPEG-2 transport stream format, which is the kind of stream transmitted over digital television broadcast waves and the like, will be described.
<Compression Coding of Video Streams>
Compression coding of video streams is described here.
<Hierarchical Structure of a Video Stream>
FIG. 26 is a diagram for explaining the hierarchical structure of a video stream.
<Cropping and Scaling>
Cropping is the extraction of a partial region from the coded frame region. Scaling is the enlargement or reduction of the extracted region.
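A minimal sketch of how the two operations combine: the crop offsets first shrink the coded frame region, and the scale factors then map the cropped region to the display size. AVC-style (left, top, right, bottom) offsets are assumed here purely for illustration.

```python
def crop_and_scale(frame_w, frame_h, crop, out_w, out_h):
    """Return the horizontal and vertical scale factors applied after
    cropping. crop = (left, top, right, bottom) offsets; this parameter
    layout is an assumption for the sketch."""
    left, top, right, bottom = crop
    cropped_w = frame_w - left - right
    cropped_h = frame_h - top - bottom
    return out_w / cropped_w, out_h / cropped_h

# e.g. a 1920x1088 coded frame cropped to 1920x1080, shown at 1280x720
sx, sy = crop_and_scale(1920, 1088, (0, 0, 0, 8), 1280, 720)
# sx == sy == 2/3, i.e. uniform downscaling
```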
<TS Packets>
FIG. 30 is a diagram showing the data structure of a TS packet.
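For reference, a TS packet is a fixed 188 bytes beginning with a 4-byte header (sync byte 0x47, PID, adaptation field control, continuity counter), which can be parsed as follows:

```python
def parse_ts_header(pkt: bytes):
    """Parse the 4-byte header of a 188-byte MPEG-2 TS packet."""
    assert len(pkt) == 188 and pkt[0] == 0x47, "sync byte must be 0x47"
    pid = ((pkt[1] & 0x1F) << 8) | pkt[2]   # 13-bit packet identifier
    return {
        "payload_unit_start": bool(pkt[1] & 0x40),
        "pid": pid,
        "adaptation_field_control": (pkt[3] >> 4) & 0x3,
        "continuity_counter": pkt[3] & 0x0F,
    }

# PUSI set, PID 0x100, payload only, continuity counter 0
pkt = bytes([0x47, 0x41, 0x00, 0x10]) + bytes(184)
h = parse_ts_header(pkt)
```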
<6. Supplement 2>
The configuration of the content broadcast receiving system as one embodiment of the present invention, along with its modifications and effects, is further described below.
(1) A playback device according to one embodiment of the present invention is a playback device that plays back an original video, comprising: first receiving means for acquiring, from a broadcast wave, a broadcast video composed of a group of broadcast images obtained by extracting an arbitrary portion of each frame image in the group of frame images constituting the original video; second receiving means for receiving an additional video composed of a group of remaining images, the portions of each frame image in the frame image group other than the extracted portion; arrangement information acquiring means for acquiring arrangement information indicating the arrangement of the broadcast image and the remaining image within each frame image of the frame image group; and playback means for generating and displaying the frame image group by arranging the broadcast image group and the remaining image group on the basis of the arrangement information.
(2) Each of the frame images may be divided into the broadcast image and a plurality of partial images; the remaining image may be formed by joining the plurality of partial images; the arrangement information may include information indicating the placement position of each of the plurality of partial images within the remaining image; and the playback means may generate the frame image by referring to the arrangement information, dividing the remaining image into the plurality of partial images, and joining them with the broadcast image.
(3) Each of the frame images may be divided horizontally into three parts: a left partial image as one of the plurality of partial images, the broadcast image, and a right partial image as one of the plurality of partial images; and the playback means may generate the frame image by referring to the arrangement information, dividing the remaining image into the left partial image and the right partial image, joining the left partial image to the left edge of the broadcast image, and joining the right partial image to the right edge of the broadcast image.
(4) Each of the frame images may be divided vertically into three parts: a top partial image as one of the plurality of partial images, the broadcast image, and a bottom partial image as one of the plurality of partial images; and the playback means may generate the frame image by referring to the arrangement information, dividing the remaining image into the top partial image and the bottom partial image, joining the top partial image to the top edge of the broadcast image, and joining the bottom partial image to the bottom edge of the broadcast image.
(5) At least one of the plurality of partial images may be enlarged or reduced when joined with the other partial images; the arrangement information may describe, for the enlarged or reduced partial image, a magnification for restoring it to its original size; and when joining the broadcast image with the plurality of partial images, the playback means may restore the enlarged or reduced partial image to its original size on the basis of the arrangement information before joining it.
(6) The broadcast system may transmit the additional video via a network, and the second receiving means may receive the additional video via the network.
(7) The arrangement information may be generated one per predetermined group of frame images whose extracted portions are the same, and when generating the frame image group, the playback means may, for each of the groups, refer to the arrangement information generated for that group and generate the frame images constituting that group.
(8) A playback method according to one embodiment of the present invention is a playback method executed by a playback device that plays back an original video, comprising: a first receiving step of acquiring, from a broadcast wave, a broadcast video composed of a group of broadcast images obtained by extracting an arbitrary portion of each frame image in the group of frame images constituting the original video; a second receiving step of receiving an additional video composed of a group of remaining images, the portions of each frame image in the frame image group other than the extracted portion; an arrangement information acquiring step of acquiring arrangement information indicating the arrangement of the broadcast image and the remaining image within each frame image of the frame image group; and a playback step of generating and displaying the frame image group by arranging the broadcast image group and the remaining image group on the basis of the arrangement information.
(9) A broadcast system according to one embodiment of the present invention is a broadcast system that transmits an original video, comprising: first transmission means for extracting an arbitrary portion of each frame image in the group of frame images constituting the original video and transmitting, using a broadcast wave, a broadcast video composed of the extracted image group; second transmission means for generating and transmitting an additional video composed of a group of remaining images, the portions of each frame image in the frame image group other than the extracted portion; and arrangement information transmitting means for generating and transmitting arrangement information indicating the arrangement of the broadcast image and the remaining image within each frame image of the frame image group.
101 Wide video capturing unit
102 Video separation unit
103 Video arrangement unit
104 Broadcast stream creation unit
105 Additional stream creation unit
106 Additional stream creation unit
200 Playback device
201 Tuner
202 Broadcast stream decoding unit
203 Video plane
211 NIC
212 Additional stream decoding unit
213 Video plane
214 Video composition method setting unit
215 Video composition method setting unit
221 Video composition unit
222 Composite video plane
Claims (13)
- A playback device that plays back an original video, comprising:
first receiving means for acquiring, from a broadcast wave, a broadcast video composed of a group of broadcast images obtained by extracting an arbitrary portion of each frame image in the group of frame images constituting the original video;
second receiving means for receiving an additional video composed of a group of remaining images, the portions of each frame image in the frame image group other than the extracted portion;
arrangement information acquiring means for acquiring arrangement information indicating the arrangement of the broadcast image and the remaining image within each frame image of the frame image group; and
playback means for generating and displaying the frame image group by arranging the broadcast image group and the remaining image group on the basis of the arrangement information. - The playback device according to claim 1, wherein each of the frame images is divided into the broadcast image and a plurality of partial images,
the remaining image is formed by joining the plurality of partial images,
the arrangement information includes information indicating the placement position of each of the plurality of partial images within the remaining image, and
the playback means generates the frame image by referring to the arrangement information, dividing the remaining image into the plurality of partial images, and joining them with the broadcast image. - The playback device according to claim 2, wherein each of the frame images is divided horizontally into three parts: a left partial image as one of the plurality of partial images, the broadcast image, and a right partial image as one of the plurality of partial images, and
the playback means generates the frame image by referring to the arrangement information, dividing the remaining image into the left partial image and the right partial image, joining the left partial image to the left edge of the broadcast image, and joining the right partial image to the right edge of the broadcast image. - The playback device according to claim 2, wherein each of the frame images is divided vertically into three parts: a top partial image as one of the plurality of partial images, the broadcast image, and a bottom partial image as one of the plurality of partial images, and
the playback means generates the frame image by referring to the arrangement information, dividing the remaining image into the top partial image and the bottom partial image, joining the top partial image to the top edge of the broadcast image, and joining the bottom partial image to the bottom edge of the broadcast image. - The playback device according to claim 2, wherein at least one of the plurality of partial images is enlarged or reduced when joined with the other partial images,
the arrangement information describes, for the enlarged or reduced partial image, a magnification for restoring it to its original size, and
the playback means, when joining the broadcast image with the plurality of partial images, restores the enlarged or reduced partial image to its original size on the basis of the arrangement information before joining it. - The playback device according to claim 2, wherein at least one of the plurality of partial images is rotated when joined with the other partial images,
the arrangement information describes, for the rotated partial image, a rotation angle for restoring it to its orientation before rotation, and
the playback means, when joining the broadcast image with the plurality of partial images, rotates the rotated partial image by the rotation angle described in the arrangement information before joining it. - The playback device according to claim 1, wherein the broadcast system transmits the additional video via a network, and
the second receiving means receives the additional video via the network. - The playback device according to claim 1, wherein the arrangement information is generated one per predetermined group of frame images whose extracted portions are the same, and
the playback means, when generating the frame image group, refers, for each of the groups, to the arrangement information generated for that group and generates the frame images constituting that group. - A playback method executed by a playback device that plays back an original video, the method comprising:
a first receiving step of acquiring, from a broadcast wave, a broadcast video composed of a group of broadcast images obtained by extracting an arbitrary portion of each frame image in the group of frame images constituting the original video;
a second receiving step of receiving an additional video composed of a group of remaining images, the portions of each frame image in the frame image group other than the extracted portion;
an arrangement information acquiring step of acquiring arrangement information indicating the arrangement of the broadcast image and the remaining image within each frame image of the frame image group; and
a playback step of generating and displaying the frame image group by arranging the broadcast image group and the remaining image group on the basis of the arrangement information. - An integrated circuit for use in a playback device that plays back an original video, comprising:
first receiving means for acquiring, from a broadcast wave, a broadcast video composed of a group of broadcast images obtained by extracting an arbitrary portion of each frame image in the group of frame images constituting the original video;
second receiving means for receiving an additional video composed of a group of remaining images, the portions of each frame image in the frame image group other than the extracted portion;
arrangement information acquiring means for acquiring arrangement information indicating the arrangement of the broadcast image and the remaining image within each frame image of the frame image group; and
playback means for generating and displaying the frame image group by arranging the broadcast image group and the remaining image group on the basis of the arrangement information. - A broadcast system that transmits an original video, comprising:
first transmission means for extracting an arbitrary portion of each frame image in the group of frame images constituting the original video and transmitting, using a broadcast wave, a broadcast video composed of the extracted image group;
second transmission means for generating and transmitting an additional video composed of a group of remaining images, the portions of each frame image in the frame image group other than the extracted portion; and
arrangement information transmitting means for generating and transmitting arrangement information indicating the arrangement of the broadcast image and the remaining image within each frame image of the frame image group. - A broadcast method executed by a broadcast system that transmits an original video, the method comprising:
a first transmission step of extracting an arbitrary portion of each frame image in the group of frame images constituting the original video and transmitting, using a broadcast wave, a broadcast video composed of the extracted image group;
a second transmission step of generating and transmitting an additional video composed of a group of remaining images, the portions of each frame image in the frame image group other than the extracted portion; and
an arrangement information transmitting step of generating and transmitting arrangement information indicating the arrangement of the broadcast image and the remaining image within each frame image of the frame image group. - An integrated circuit for use in a broadcast system that transmits an original video, comprising:
first transmission means for extracting an arbitrary portion of each frame image in the group of frame images constituting the original video and transmitting, using a broadcast wave, a broadcast video composed of the extracted image group;
second transmission means for generating and transmitting an additional video composed of a group of remaining images, the portions of each frame image in the frame image group other than the extracted portion; and
arrangement information transmitting means for generating and transmitting arrangement information indicating the arrangement of the broadcast image and the remaining image within each frame image of the frame image group.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201280003152.6A CN103125123B (zh) | 2011-08-11 | 2012-08-09 | 再现装置、再现方法、集成电路、广播系统及广播方法 |
US13/824,581 US8773584B2 (en) | 2011-08-11 | 2012-08-09 | Playback apparatus, playback method, integrated circuit, broadcast system, and broadcast method using a broadcast video and additional video |
JP2013505237A JP5979499B2 (ja) | 2011-08-11 | 2012-08-09 | 再生装置、再生方法、集積回路、放送システム、及び放送方法 |
EP12821943.3A EP2744197A4 (en) | 2011-08-11 | 2012-08-09 | PLAYING DEVICE, PLAYING PROCEDURE, INTEGRATED CIRCUIT, BROADCASTING SYSTEM AND BROADCASTING METHOD |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201161522393P | 2011-08-11 | 2011-08-11 | |
US61/522,393 | 2011-08-11 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2013021656A1 true WO2013021656A1 (ja) | 2013-02-14 |
Family
ID=47668190
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2012/005088 WO2013021656A1 (ja) | 2011-08-11 | 2012-08-09 | 再生装置、再生方法、集積回路、放送システム、及び放送方法 |
Country Status (5)
Country | Link |
---|---|
US (1) | US8773584B2 (ja) |
EP (1) | EP2744197A4 (ja) |
JP (1) | JP5979499B2 (ja) |
CN (1) | CN103125123B (ja) |
WO (1) | WO2013021656A1 (ja) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2014142203A1 (ja) * | 2013-03-14 | 2014-09-18 | ソニー株式会社 | 送信装置、送信方法、受信装置および受信方法 |
WO2015001947A1 (ja) * | 2013-07-05 | 2015-01-08 | ソニー株式会社 | 送信装置、送信方法、受信装置および受信方法 |
WO2022118460A1 (ja) * | 2020-12-04 | 2022-06-09 | 日本電信電話株式会社 | 制御装置、制御方法、およびプログラム |
Families Citing this family (36)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9190110B2 (en) | 2009-05-12 | 2015-11-17 | JBF Interlude 2009 LTD | System and method for assembling a recorded composition |
US11232458B2 (en) | 2010-02-17 | 2022-01-25 | JBF Interlude 2009 LTD | System and method for data mining within interactive multimedia |
EP2866457B1 (en) * | 2012-06-22 | 2019-03-20 | Saturn Licensing LLC | Reception device, and synchronous processing method therefor |
CN106233745B (zh) * | 2013-07-29 | 2021-01-15 | 皇家Kpn公司 | 向客户端提供瓦片视频流 |
TWI520610B (zh) * | 2013-08-01 | 2016-02-01 | 晨星半導體股份有限公司 | 電視控制裝置與相關方法 |
US20160255412A1 (en) * | 2013-10-22 | 2016-09-01 | Sharp Kabushiki Kaisha | Display control device, distribution device, display control method, and display control system |
US20150253974A1 (en) | 2014-03-07 | 2015-09-10 | Sony Corporation | Control of large screen display using wireless portable computer interfacing with display controller |
US9653115B2 (en) | 2014-04-10 | 2017-05-16 | JBF Interlude 2009 LTD | Systems and methods for creating linear video from branched video |
US10694192B2 (en) | 2014-06-27 | 2020-06-23 | Koninklijke Kpn N.V. | HEVC-tiled video streaming |
EP3162074A1 (en) | 2014-06-27 | 2017-05-03 | Koninklijke KPN N.V. | Determining a region of interest on the basis of a hevc-tiled video stream |
JP6239472B2 (ja) * | 2014-09-19 | 2017-11-29 | 株式会社東芝 | エンコード装置、デコード装置、ストリーミングシステム、および、ストリーミング方法 |
US9792957B2 (en) | 2014-10-08 | 2017-10-17 | JBF Interlude 2009 LTD | Systems and methods for dynamic video bookmarking |
US11412276B2 (en) | 2014-10-10 | 2022-08-09 | JBF Interlude 2009 LTD | Systems and methods for parallel track transitions |
WO2016105322A1 (en) * | 2014-12-25 | 2016-06-30 | Echostar Ukraine, L.L.C. | Simultaneously viewing multiple camera angles |
US10582265B2 (en) | 2015-04-30 | 2020-03-03 | JBF Interlude 2009 LTD | Systems and methods for nonlinear video playback using linear real-time video players |
WO2017029400A1 (en) | 2015-08-20 | 2017-02-23 | Koninklijke Kpn N.V. | Forming one or more tile streams on the basis of one or more video streams |
US10460765B2 (en) | 2015-08-26 | 2019-10-29 | JBF Interlude 2009 LTD | Systems and methods for adaptive and responsive video |
CN107924690B (zh) * | 2015-09-02 | 2021-06-25 | 交互数字Ce专利控股公司 | 用于促进扩展场景中的导航的方法、装置和系统 |
EP3360330B1 (en) | 2015-10-08 | 2021-03-24 | Koninklijke KPN N.V. | Enhancing a region of interest in video frames of a video stream |
US11164548B2 (en) | 2015-12-22 | 2021-11-02 | JBF Interlude 2009 LTD | Intelligent buffering of large-scale video |
US11856271B2 (en) | 2016-04-12 | 2023-12-26 | JBF Interlude 2009 LTD | Symbiotic interactive video |
US20180070048A1 (en) * | 2016-09-05 | 2018-03-08 | International Business Machines Corporation | Management of media content on a storage medium |
US11050809B2 (en) | 2016-12-30 | 2021-06-29 | JBF Interlude 2009 LTD | Systems and methods for dynamic weighting of branched video paths |
US10257578B1 (en) | 2018-01-05 | 2019-04-09 | JBF Interlude 2009 LTD | Dynamic library display for interactive videos |
US11601721B2 (en) | 2018-06-04 | 2023-03-07 | JBF Interlude 2009 LTD | Interactive video dynamic adaptation and user profiling |
US20200296462A1 (en) | 2019-03-11 | 2020-09-17 | Wci One, Llc | Media content presentation |
US20200296316A1 (en) | 2019-03-11 | 2020-09-17 | Quibi Holdings, LLC | Media content presentation |
US11523185B2 (en) | 2019-06-19 | 2022-12-06 | Koninklijke Kpn N.V. | Rendering video stream in sub-area of visible display area |
US11490047B2 (en) | 2019-10-02 | 2022-11-01 | JBF Interlude 2009 LTD | Systems and methods for dynamically adjusting video aspect ratios |
US12096081B2 (en) | 2020-02-18 | 2024-09-17 | JBF Interlude 2009 LTD | Dynamic adaptation of interactive video players using behavioral analytics |
US11245961B2 (en) | 2020-02-18 | 2022-02-08 | JBF Interlude 2009 LTD | System and methods for detecting anomalous activities for interactive videos |
CN111479162B (zh) * | 2020-04-07 | 2022-05-13 | 成都酷狗创业孵化器管理有限公司 | 直播数据传输方法、装置以及计算机可读存储介质 |
US12047637B2 (en) | 2020-07-07 | 2024-07-23 | JBF Interlude 2009 LTD | Systems and methods for seamless audio and video endpoint transitions |
US11882337B2 (en) | 2021-05-28 | 2024-01-23 | JBF Interlude 2009 LTD | Automated platform for generating interactive videos |
US12155897B2 (en) | 2021-08-31 | 2024-11-26 | JBF Interlude 2009 LTD | Shader-based dynamic video manipulation |
US11934477B2 (en) | 2021-09-24 | 2024-03-19 | JBF Interlude 2009 LTD | Video player integration within websites |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2007104540A (ja) * | 2005-10-07 | 2007-04-19 | Open Interface Inc | 撮影画像配信装置、撮影画像配信プログラム及び撮影画像配信方法 |
JP2010258707A (ja) * | 2009-04-23 | 2010-11-11 | Sharp Corp | 映像表示装置、映像表示方法、プログラムおよび記録媒体 |
JP2011091578A (ja) * | 2009-10-21 | 2011-05-06 | Canon Inc | 映像判定装置、映像表示装置及びそれらの制御方法、プログラム |
Family Cites Families (28)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
AU2304788A (en) * | 1987-07-27 | 1989-03-01 | David M. Geshwind | A method for transmitting high-definition television over low-bandwidth channels |
GB8721565D0 (en) * | 1987-09-14 | 1987-10-21 | Rca Corp | Video signal processing system |
US5068728A (en) * | 1990-06-22 | 1991-11-26 | Albert Macovski | Compatible increased aspect ratio television system |
JP3241368B2 (ja) * | 1990-09-27 | 2001-12-25 | 日本電気株式会社 | テレビジョン信号伝送方式 |
JPH05191803A (ja) * | 1992-01-13 | 1993-07-30 | Hitachi Ltd | 動画像の合成装置および多地点間会議装置 |
US5430486A (en) * | 1993-08-17 | 1995-07-04 | Rgb Technology | High resolution video image transmission and storage |
EP0964576B1 (en) * | 1994-04-15 | 2004-03-03 | Koninklijke Philips Electronics N.V. | Method and arrangement for receiving digital video signals |
JPH07298258A (ja) * | 1994-04-28 | 1995-11-10 | Nippon Telegr & Teleph Corp <Ntt> | 画像符号化復号化方法 |
US5768539A (en) * | 1994-05-27 | 1998-06-16 | Bell Atlantic Network Services, Inc. | Downloading applications software through a broadcast channel |
GB9526304D0 (en) * | 1995-05-22 | 1996-02-21 | British Sky Broadcasting Ltd | Interactive services interface |
US6020931A (en) * | 1996-04-25 | 2000-02-01 | George S. Sheng | Video composition and position system and media signal communication system |
JPH11122610A (ja) * | 1997-10-17 | 1999-04-30 | Toshiba Corp | 画像符号化方法及び画像復号化方法並びにこれらの装置 |
EP1297695A2 (en) * | 2000-06-26 | 2003-04-02 | NDS Limited | Time shifted interactive television |
EP1233614B1 (fr) * | 2001-02-16 | 2012-08-08 | C.H.I. Development Mgmt. Ltd. XXIX, LLC | Système de transmission et de traitement vidéo pour générer une mosaique utilisateur |
US7206352B2 (en) * | 2001-04-02 | 2007-04-17 | Koninklijke Philips Electronics N.V. | ATSC digital television system |
JP4646446B2 (ja) * | 2001-06-28 | 2011-03-09 | パナソニック株式会社 | 映像信号処理装置 |
US7050113B2 (en) * | 2002-03-26 | 2006-05-23 | International Business Machines Corporation | Digital video data scaler and method |
FR2848372B1 (fr) * | 2002-12-09 | 2005-04-01 | Medialive | Synchronisation de flux audiovisuels securises |
CN1509081A (zh) * | 2002-12-20 | 2004-06-30 | �ʼҷ����ֵ��ӹɷ�����˾ | 通过广播和网络流传送双层hdtv信号的方法和系统 |
US7911536B2 (en) * | 2004-09-23 | 2011-03-22 | Intel Corporation | Screen filled display of digital video content |
US7489833B2 (en) * | 2004-10-06 | 2009-02-10 | Panasonic Corporation | Transmitting device, reconstruction device, transmitting method and reconstruction method for broadcasts with hidden subtitles |
JP2006115001A (ja) * | 2004-10-12 | 2006-04-27 | Fujitsu Ltd | 映像送信装置および映像受信装置 |
JP4760288B2 (ja) * | 2005-10-13 | 2011-08-31 | ソニー株式会社 | 画像表示システム、表示装置、画像再合成装置、画像再合成方法及びプログラム |
US8407741B2 (en) * | 2006-11-20 | 2013-03-26 | Sk Planet Co., Ltd. | System, server and method for providing supplementary information service related to broadcast content |
JP4645638B2 (ja) * | 2007-11-22 | 2011-03-09 | ソニー株式会社 | 信号送信装置、信号送信方法、信号受信装置及び信号受信方法 |
CN101783873A (zh) * | 2009-01-19 | 2010-07-21 | 北京视典无限传媒技术有限公司 | 数字化多媒体信息传输平台 |
JP2011199389A (ja) * | 2010-03-17 | 2011-10-06 | Sony Corp | 画像処理装置、画像変換方法、およびプログラム |
JP5132808B1 (ja) * | 2011-10-11 | 2013-01-30 | 株式会社東芝 | デジタル放送録画再生装置およびデジタル放送録画再生方法 |
2012
- 2012-08-09 US US13/824,581 patent/US8773584B2/en active Active
- 2012-08-09 JP JP2013505237A patent/JP5979499B2/ja active Active
- 2012-08-09 CN CN201280003152.6A patent/CN103125123B/zh active Active
- 2012-08-09 WO PCT/JP2012/005088 patent/WO2013021656A1/ja active Application Filing
- 2012-08-09 EP EP12821943.3A patent/EP2744197A4/en not_active Withdrawn
Non-Patent Citations (2)
Title |
---|
KINJI MATSUMURA; YASUAKI KANATSUGU: "Hybridcast™ no Gaiyou to Gijutsu [Overview and Technology of Hybridcast]", NHK STRL R&D, No. 124, 2010 |
See also references of EP2744197A4 |
Cited By (21)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR102145154B1 (ko) | 2013-03-14 | 2020-08-14 | 소니 주식회사 | 송신 장치, 송신 방법, 수신 장치 및 수신 방법 |
US10862638B2 (en) | 2013-03-14 | 2020-12-08 | Sony Corporation | Transmission apparatus, transmission method, reception apparatus, and reception method |
US11075728B2 (en) | 2013-03-14 | 2021-07-27 | Sony Corporation | Transmission apparatus, transmission method, reception apparatus, and reception method |
KR20190136131A (ko) * | 2013-03-14 | 2019-12-09 | 소니 주식회사 | 송신 장치, 송신 방법, 수신 장치 및 수신 방법 |
KR102052684B1 (ko) | 2013-03-14 | 2019-12-06 | 소니 주식회사 | 송신 장치, 송신 방법, 수신 장치 및 수신 방법 |
KR101753503B1 (ko) | 2013-03-14 | 2017-07-03 | 소니 주식회사 | 송신 장치, 송신 방법, 수신 장치 및 수신 방법 |
KR20170077285A (ko) * | 2013-03-14 | 2017-07-05 | 소니 주식회사 | 송신 장치, 송신 방법, 수신 장치 및 수신 방법 |
US9876616B2 (en) | 2013-03-14 | 2018-01-23 | Sony Corporation | Transmission apparatus, transmission method, reception apparatus, and reception method |
RU2642834C2 (ru) * | 2013-03-14 | 2018-01-29 | Сони Корпорейшн | Передающее устройство, способ передачи, приемное устройство и способ приема |
WO2014142203A1 (ja) * | 2013-03-14 | 2014-09-18 | ソニー株式会社 | 送信装置、送信方法、受信装置および受信方法 |
KR102102349B1 (ko) | 2013-03-14 | 2020-04-21 | 소니 주식회사 | 송신 장치, 송신 방법, 수신 장치 및 수신 방법 |
KR20200040941A (ko) * | 2013-03-14 | 2020-04-20 | 소니 주식회사 | 송신 장치, 송신 방법, 수신 장치 및 수신 방법 |
KR20160030104A (ko) * | 2013-07-05 | 2016-03-16 | 소니 주식회사 | 송신 장치, 송신 방법, 수신 장치 및 수신 방법 |
JPWO2015001947A1 (ja) * | 2013-07-05 | 2017-02-23 | ソニー株式会社 | 送信装置、送信方法、受信装置および受信方法 |
CN105340281A (zh) * | 2013-07-05 | 2016-02-17 | 索尼公司 | 发送装置、发送方法、接收装置和接收方法 |
RU2652099C2 (ru) * | 2013-07-05 | 2018-04-25 | Сони Корпорейшн | Устройство передачи, способ передачи, устройство приема и способ приема |
WO2015001947A1 (ja) * | 2013-07-05 | 2015-01-08 | ソニー株式会社 | 送信装置、送信方法、受信装置および受信方法 |
KR102269311B1 (ko) * | 2013-07-05 | 2021-06-28 | 소니그룹주식회사 | 송신 장치, 송신 방법, 수신 장치 및 수신 방법 |
CN105340281B (zh) * | 2013-07-05 | 2019-06-28 | 索尼公司 | 发送装置、发送方法、接收装置和接收方法 |
US11330311B2 (en) | 2013-07-05 | 2022-05-10 | Saturn Licensing Llc | Transmission device, transmission method, receiving device, and receiving method for rendering a multi-image-arrangement distribution service |
WO2022118460A1 (ja) * | 2020-12-04 | 2022-06-09 | 日本電信電話株式会社 | 制御装置、制御方法、およびプログラム |
Also Published As
Publication number | Publication date |
---|---|
JP5979499B2 (ja) | 2016-08-24 |
CN103125123B (zh) | 2017-04-26 |
US8773584B2 (en) | 2014-07-08 |
CN103125123A (zh) | 2013-05-29 |
EP2744197A4 (en) | 2015-02-18 |
US20130235270A1 (en) | 2013-09-12 |
EP2744197A1 (en) | 2014-06-18 |
JPWO2013021656A1 (ja) | 2015-03-05 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5979499B2 (ja) | 再生装置、再生方法、集積回路、放送システム、及び放送方法 | |
JP6229962B2 (ja) | 符号化装置及び符号化方法 | |
WO2013021643A1 (ja) | 放送通信連携システム、データ生成装置及び受信装置 | |
US20120033039A1 (en) | Encoding method, display device, and decoding method | |
US20110181693A1 (en) | Method and apparatus for generating data stream for providing 3-dimensional multimedia service, and method and apparatus for receiving the data stream | |
WO2013061526A1 (ja) | 放送受信装置、放送受信方法およびプログラム | |
US20120293619A1 (en) | Generating a 3d video signal | |
TW201234833A (en) | Encoding method, display apparatus, and decoding method | |
US20120050476A1 (en) | Video processing device | |
WO2012111320A1 (ja) | 映像符号化装置、映像符号化方法、映像符号化プログラム、映像再生装置、映像再生方法及び映像再生プログラム | |
WO2013099290A1 (ja) | 映像再生装置、映像再生方法、映像再生プログラム、映像送信装置、映像送信方法及び映像送信プログラム | |
WO2013031549A1 (ja) | 送信装置、送信方法および受信装置 | |
US9357200B2 (en) | Video processing device and video processing method | |
WO2012169204A1 (ja) | 送信装置、受信装置、送信方法及び受信方法 | |
JP6008292B2 (ja) | ビデオストリームの映像のデータ作成装置及び再生装置 | |
JPWO2013099289A1 (ja) | 再生装置、送信装置、再生方法及び送信方法 | |
WO2013018490A1 (ja) | 送信装置、送信方法および受信装置 | |
WO2012026342A1 (ja) | 立体画像データ送信装置、立体画像データ送信方法、立体画像データ受信装置および立体画像データ受信方法 | |
JP2013026696A (ja) | 送信装置、送信方法および受信装置 | |
WO2013018489A1 (ja) | 送信装置、送信方法および受信装置 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 201280003152.6 Country of ref document: CN |
|
ENP | Entry into the national phase |
Ref document number: 2013505237 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 13824581 Country of ref document: US |
|
REEP | Request for entry into the european phase |
Ref document number: 2012821943 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2012821943 Country of ref document: EP |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 12821943 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |