WO2012043352A1 - Stereoscopic image data transmission device, stereoscopic image data transmission method, stereoscopic image data reception device, and stereoscopic image data reception method - Google Patents
- Publication number
- WO2012043352A1 (PCT/JP2011/071564)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- data
- information
- image data
- display
- subtitle
- Prior art date
Classifications
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/025—Systems for the transmission of digital non-picture data, e.g. of text during the active part of a television frame
- H04N7/03—Subscription systems therefor
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/10—Processing, recording or transmission of stereoscopic or multi-view image signals
- H04N13/106—Processing image signals
- H04N13/128—Adjusting depth or disparity
- H04N13/139—Format conversion, e.g. of frame-rate or size
- H04N13/161—Encoding, multiplexing or demultiplexing different image signal components
- H04N13/172—Processing image signals comprising non-image signal components, e.g. headers or format information
- H04N13/183—On-screen display [OSD] information, e.g. subtitles or menus
- H04N13/194—Transmission of image signals
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/20—Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
- H04N21/23—Processing of content or additional data; Elementary server operations; Server middleware
- H04N21/236—Assembling of a multiplex stream, e.g. transport stream, by combining a video stream with other content or additional data, e.g. inserting a URL [Uniform Resource Locator] into a video stream, multiplexing software data into a video stream; Remultiplexing of multiplex streams; Insertion of stuffing bits into the multiplex stream, e.g. to obtain a constant bit-rate; Assembling of a packetised elementary stream
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/434—Disassembling of a multiplex stream, e.g. demultiplexing audio and video streams, extraction of additional data from a video stream; Remultiplexing of multiplex streams; Extraction or processing of SI; Disassembling of packetised elementary stream
- H04N2213/00—Details of stereoscopic systems
- H04N2213/003—Aspects relating to the "2D+depth" image format
- H04N2213/005—Aspects relating to the "3D+depth" image format
Definitions
- The present invention relates to a stereoscopic image data transmission device, a stereoscopic image data transmission method, a stereoscopic image data reception device, and a stereoscopic image data reception method, and in particular to a stereoscopic image data transmission device and the like that transmit superimposition information data such as captions together with stereoscopic image data.
- Patent Document 1 proposes a method of transmitting stereoscopic image data using television broadcast radio waves.
- In this method, stereoscopic image data having left-eye image data and right-eye image data is transmitted, and stereoscopic images are displayed using binocular parallax.
- FIG. 41 shows, in stereoscopic image display using binocular parallax, the relationship between the display positions of the left and right images of an object on the screen and the reproduction position of the stereoscopic image.
- For an object A whose left image La is displayed shifted to the right and whose right image Ra is displayed shifted to the left, the left and right lines of sight intersect in front of the screen surface.
- The reproduction position of its stereoscopic image is therefore in front of the screen surface.
- DPa represents a horizontal disparity vector related to the object A.
- For an object B whose left image Lb and right image Rb are displayed at the same position, the left and right lines of sight intersect on the screen surface, so the reproduction position is on the screen surface.
- For an object C whose left image Lc is displayed shifted to the left and whose right image Rc is displayed shifted to the right, the left and right lines of sight intersect behind the screen surface.
- The reproduction position is therefore behind the screen surface.
- DPc represents a horizontal disparity vector related to the object C.
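The three cases above (objects A, B, and C) can be summarized by the sign of the horizontal disparity. The following is a minimal illustrative sketch, not part of the patent; the sign convention (disparity taken as the right-image position minus the left-image position) is an assumption chosen for this example.

```python
def perceived_depth(left_x: float, right_x: float) -> str:
    """Classify where an object appears relative to the screen from the
    horizontal positions of its left-eye and right-eye images.

    Disparity is taken as right_x - left_x (a sign convention assumed
    for this sketch): when the left image is shifted right and the
    right image left, the lines of sight cross in front of the screen.
    """
    disparity = right_x - left_x
    if disparity < 0:
        return "in front of screen"   # object A: crossed lines of sight
    if disparity == 0:
        return "on screen"            # object B: same display position
    return "behind screen"            # object C: uncrossed lines of sight
```

For example, `perceived_depth(110.0, 100.0)` corresponds to object A and reports a reproduction position in front of the screen.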
- Non-Patent Document 1 describes details of the HDMI standard.
- In two-dimensional image broadcasting, the transmission side transmits superimposition information data such as captions together with the two-dimensional image data.
- The reception side processes the superimposition information data to generate display data for displaying the superimposition information, and superimposes that display data on the two-dimensional image data to obtain a two-dimensional image on which the superimposition information is displayed.
- It is also conceivable to transmit superimposition information data such as captions when transmitting stereoscopic image data.
- However, existing superimposition information data is intended for two-dimensional images.
- The receiving side must therefore process this two-dimensional superimposition information data in accordance with the transmission format of the stereoscopic image data to generate display data to be superimposed on the stereoscopic image data.
- A set-top box that receives stereoscopic image data thus needs such an advanced processing function, which makes it expensive.
- An object of the present invention is to facilitate processing on the receiving side when transmitting superimposition information data such as captions together with stereoscopic image data.
- The concept of this invention is a stereoscopic image data transmission apparatus comprising: an image data output unit that outputs stereoscopic image data of a predetermined transmission format having left-eye image data and right-eye image data; a superimposition information data output unit that outputs data of superimposition information to be superimposed on images based on the left-eye image data and the right-eye image data; a superimposition information data processing unit that converts the superimposition information data output from the superimposition information data output unit into transmission superimposition information data having left-eye superimposition information data corresponding to the left-eye image data included in the stereoscopic image data of the predetermined transmission format and right-eye superimposition information data corresponding to the right-eye image data included in that stereoscopic image data; a display control information generation unit that sets, within the display area of the transmission superimposition information data output from the superimposition information data processing unit, a first display area corresponding to the display position of the left-eye superimposition information and a second display area corresponding to the display position of the right-eye superimposition information, and generates display control information including area information of each of the first display area and the second display area, information on the target frames in which the superimposition information included in the first display area and the second display area is to be displayed, and disparity information for shift-adjusting the display positions of the superimposition information included in the first display area and the second display area; and a data transmission unit that transmits a multiplexed data stream having a first data stream including the stereoscopic image data output from the image data output unit and a second data stream including the transmission superimposition information data output from the superimposition information data processing unit and the display control information generated by the display control information generation unit.
- the image data output unit outputs stereoscopic image data of a predetermined transmission format having left-eye image data and right-eye image data.
- Transmission formats for stereoscopic image data include the side-by-side method, the top-and-bottom method, and the like.
- the superimposition information data output unit outputs the superimposition information data to be superimposed on the left eye image data and the right eye image data.
- the superimposition information is information such as subtitles, graphics, and text superimposed on the image.
- the superimposition information data processing unit converts the superimposition information data into transmission superimposition information data having left eye superimposition information data and right eye superimposition information data.
- The left-eye superimposition information data corresponds to the left-eye image data included in the stereoscopic image data of the predetermined transmission format described above, and is data for generating, on the receiving side, display data of the left-eye superimposition information to be superimposed on that left-eye image data.
- the right eye superimposition information data is data corresponding to the right eye image data included in the stereoscopic image data of the above-described predetermined transmission format, and is superimposed on the right eye image data included in the stereoscopic image data on the receiving side. This is data for generating display data of right eye superimposition information.
- the superimposition information data is, for example, subtitle data (DVB subtitle data).
- In the superimposition information data processing unit, the left-eye superimposition information data and the right-eye superimposition information data are generated as follows. For example, when the transmission format of the stereoscopic image data is the side-by-side method, the superimposition information data processing unit generates the left-eye superimposition information data and the right-eye superimposition information data as data of different objects in the same region. When the transmission format is the top-and-bottom method, it generates them as data of objects in different regions.
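The two generation rules just described can be sketched as a layout calculation. This is illustrative only: the function name, the region identifiers, the assumed 1920x1080 frame size, and the omission of pixel-data scaling are all assumptions of this sketch, not the patent's method.

```python
def make_3d_subtitle_layout(fmt: str, x: int, y: int):
    """Return (region_id, object_x, object_y) placements for the
    left-eye and right-eye copies of one subtitle object.

    Side-by-side: two objects in the same region, the right-eye copy
    offset by half the frame width. Top-and-bottom: objects in two
    different regions, the right-eye region offset by half the frame
    height. The 1920x1080 frame size is an assumption of this sketch.
    """
    if fmt == "side-by-side":
        return [("region0", x, y),
                ("region0", x + 1920 // 2, y)]   # same region, second object
    if fmt == "top-and-bottom":
        return [("region0", x, y),
                ("region1", x, y + 1080 // 2)]   # two different regions
    raise ValueError(f"unknown transmission format: {fmt}")
```

For example, a subtitle object placed at (100, 900) in the left-eye view gets its right-eye copy at (1060, 900) in the side-by-side case, but a copy in a second region at (100, 940) in the top-and-bottom case when starting from (100, 400).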
- In the present invention, for example, a disparity information output unit that outputs disparity information between the left-eye image based on the left-eye image data and the right-eye image based on the right-eye image data may further be provided, and the superimposition information data processing unit may give parallax between the left-eye superimposition information and the right-eye superimposition information based on the disparity information output from the disparity information output unit.
- In this case, the receiving side does not need to perform processing to give parallax between the left-eye superimposition information and the right-eye superimposition information, and the consistency of perspective with each object in the image can be maintained in an optimum state when displaying superimposition information such as subtitles.
- The display control information generation unit sets, within the display area of the transmission superimposition information data, a first display area corresponding to the display position of the left-eye superimposition information and a second display area corresponding to the display position of the right-eye superimposition information, and generates display control information related to the first and second display areas. These first and second display areas are set, for example, in response to a user operation or automatically.
- the display control information includes area information on the first display area and area information on the second display area.
- the display control information includes information on a target frame for displaying the superimposition information included in the first display area and information on a target frame for displaying the superimposition information included in the second display area.
- The display control information also includes disparity information for shift-adjusting the display position of the superimposition information included in the first display area and disparity information for shift-adjusting the display position of the superimposition information included in the second display area. These pieces of disparity information give parallax between the superimposition information included in the first display area and the superimposition information included in the second display area.
- In the present invention, for example, a disparity information output unit that outputs disparity information between the left-eye image based on the left-eye image data and the right-eye image based on the right-eye image data may further be provided, and the display control information generation unit may acquire, based on the disparity information output from the disparity information output unit, the disparity information for shift-adjusting the display positions of the superimposition information included in the first display area and the second display area.
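As a rough sketch of what this shift adjustment amounts to, the disparity value carried in the display control information could be applied to the two display areas as below. Splitting the disparity symmetrically between the two areas, and the sign convention, are assumptions of this example, not something the patent specifies.

```python
def apply_parallax(left_x: int, right_x: int, disparity: int):
    """Shift-adjust the horizontal positions of the first (left-eye)
    and second (right-eye) display areas by a shared disparity value.

    With this sign convention, a positive disparity moves the left-eye
    subtitle right and the right-eye subtitle left, so their lines of
    sight cross in front of the screen and the subtitle appears to
    pop out toward the viewer.
    """
    half = disparity // 2
    return left_x + half, right_x - half
```

For example, `apply_parallax(100, 100, 20)` moves the two areas to horizontal positions 110 and 90 respectively.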
- the data transmission unit transmits a multiplexed data stream including the first data stream and the second data stream.
- the first data stream includes stereoscopic image data in a predetermined transmission format output from the image data output unit.
- the second data stream includes transmission superimposition information data output from the superimposition information data processing unit and display control information generated by the display control information generation unit.
- In this way, transmission superimposition information data having left-eye superimposition information data and right-eye superimposition information data corresponding to the transmission format is transmitted together with the stereoscopic image data. On the receiving side, the display data of the left-eye superimposition information to be superimposed on the left-eye image data of the stereoscopic image data and the display data of the right-eye superimposition information to be superimposed on the right-eye image data can therefore be generated easily from the transmission superimposition information data, which simplifies the processing.
- In addition, display control information (area information, target frame information, disparity information) relating to the first display area corresponding to the display position of the left-eye superimposition information and the second display area corresponding to the display position of the right-eye superimposition information is transmitted. On the receiving side, only the superimposition information in the first and second display areas can therefore be superimposed and displayed on the target frames.
- Furthermore, parallax can be given to the display positions of the superimposition information in the first and second display areas, so that in displaying superimposition information such as captions, the consistency of perspective with each object in the image can be maintained in an optimum state.
- the parallax information included in the display control information generated by the display control information generation unit may have sub-pixel accuracy.
- In this case, the shift operation is performed smoothly, which can contribute to improved image quality.
- In the present invention, for example, the display control information generated by the display control information generation unit may further include command information that controls turning the display of the superimposition information included in the first display area and the second display area on and off.
- In this case, on the receiving side, the display of the superimposition information in the first and second display areas can be turned on or off based on the command information together with the area information and disparity information included in the display control information.
- In the present invention, for example, the data transmission unit may insert into the multiplexed data stream identification information for identifying that the second data stream includes transmission superimposition information data corresponding to the transmission format of the stereoscopic image data. In this case, the receiving side can use this identification information to determine whether the second data stream includes transmission superimposition information data (stereoscopic image superimposition information data) corresponding to the transmission format of the stereoscopic image data.
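On the receiving side, such identification information could be checked before deciding how to handle the subtitle stream. The sketch below models stream descriptors as plain dictionaries and uses the DVB subtitling descriptor tag (0x59); the 3D `subtitling_type` value 0x15 and the dictionary layout are assumptions of this example, not values taken from the patent or a specific specification revision.

```python
# Hypothetical subtitling_type value signalling stereoscopic (3D)
# subtitles; the actual value would come from the applicable DVB spec.
SUBTITLING_TYPE_3D = 0x15

def has_3d_subtitles(descriptors: list) -> bool:
    """Check identification information carried in the multiplexed
    stream to decide whether the second data stream holds transmission
    superimposition information data matching the 3D transmission
    format. Each descriptor is modelled as a dict such as
    {"tag": 0x59, "subtitling_type": 0x15}; this structure is an
    assumption of the sketch, not the real PSI byte layout.
    """
    return any(d.get("tag") == 0x59 and
               d.get("subtitling_type") == SUBTITLING_TYPE_3D
               for d in descriptors)
```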
- In the present invention, for example, the superimposition information data is subtitle data, the display area of the superimposition information data is a region, and the first display area and the second display area are subregions set so as to be included in that region.
- Here, a subregion is a newly defined area.
- Another concept of this invention is a stereoscopic image data reception apparatus comprising: a data reception unit that receives a multiplexed data stream having a first data stream and a second data stream, the first data stream including stereoscopic image data of a predetermined transmission format having left-eye image data and right-eye image data, the second data stream including transmission superimposition information data and display control information, the transmission superimposition information data having left-eye superimposition information data corresponding to the left-eye image data and right-eye superimposition information data corresponding to the right-eye image data included in the stereoscopic image data of the predetermined transmission format, and the display control information including area information of each of a first display area corresponding to the display position of the left-eye superimposition information and a second display area corresponding to the display position of the right-eye superimposition information set in the display area of the transmission superimposition information data, together with target frame information and disparity information; an image data acquisition unit that acquires the stereoscopic image data from the first data stream included in the multiplexed data stream received by the data reception unit; a superimposition information data acquisition unit that acquires the transmission superimposition information data from the second data stream; a display control information acquisition unit that acquires the display control information from the second data stream; a display data generation unit that generates, based on the acquired transmission superimposition information data, display data for displaying the left-eye superimposition information and the right-eye superimposition information superimposed on the left-eye image and the right-eye image, respectively; a display data extraction unit that extracts the display data of the first display area and the second display area from the generated display data, based on the area information included in the acquired display control information; a shift adjustment unit that shift-adjusts the positions of the extracted display data of the first display area and the second display area, based on the disparity information included in the acquired display control information; and a data synthesis unit that obtains output stereoscopic image data by superimposing the shift-adjusted display data of the first display area and the second display area on the target frames, indicated by the target frame information included in the acquired display control information, of the stereoscopic image data acquired by the image data acquisition unit.
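The extract, shift, superimpose chain performed by the shift adjustment unit and the data synthesis unit can be illustrated in miniature. In this sketch pixels are plain integers and frames are nested lists, and the symmetric split of the disparity between the two areas is an assumption; this is not the patent's implementation.

```python
def superimpose(frame, area, x, y):
    """Copy an extracted display area's pixels onto the frame at (x, y)."""
    for row, line in enumerate(area):
        for col, px in enumerate(line):
            frame[y + row][x + col] = px

def compose(frame, left_area, right_area, left_pos, right_pos, disparity):
    """Shift-adjust the first (left-eye) and second (right-eye) display
    areas by the disparity information, then superimpose both on the
    target frame, yielding the output stereoscopic image data."""
    (lx, ly), (rx, ry) = left_pos, right_pos
    superimpose(frame, left_area, lx + disparity // 2, ly)
    superimpose(frame, right_area, rx - disparity // 2, ry)
    return frame
```

With a disparity of 2, a left-eye area anchored at x = 2 lands at x = 3 and a right-eye area anchored at x = 5 lands at x = 4, giving the subtitle a forward offset relative to the screen plane under this sign convention.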
- the data receiving unit receives a multiplexed data stream having the first data stream and the second data stream.
- the first data stream includes stereoscopic image data in a predetermined transmission format having left eye image data and right eye image data.
- the second data stream includes transmission superimposition information data (stereoscopic image superimposition information data) having left-eye superimposition information data and right-eye superimposition information data.
- The left-eye superimposition information data corresponds to the left-eye image data included in the stereoscopic image data of the predetermined transmission format described above, and is data for generating display data of the left-eye superimposition information to be superimposed on that left-eye image data.
- The right-eye superimposition information data corresponds to the right-eye image data included in the stereoscopic image data of the predetermined transmission format, and is data for generating display data of the right-eye superimposition information to be superimposed on that right-eye image data.
- the second data stream includes display control information.
- The display control information includes area information of each of the first display area corresponding to the display position of the left-eye superimposition information and the second display area corresponding to the display position of the right-eye superimposition information, both set in the display area of the transmission superimposition information data.
- the display control information includes information on the target frame for displaying the superimposition information included in the first display area and the second display area, respectively. Further, the display control information includes disparity information for shifting and adjusting the display positions of the superimposition information included in the first display area and the second display area.
- The image data acquisition unit acquires the stereoscopic image data of the predetermined transmission format from the first data stream included in the multiplexed data stream received by the data reception unit. Similarly, the superimposition information data acquisition unit acquires the transmission superimposition information data from the second data stream, and the display control information acquisition unit acquires the display control information from the second data stream.
- The display data generation unit generates, based on the transmission superimposition information data acquired by the superimposition information data acquisition unit, display data for displaying the superimposition information superimposed on the left-eye image and the right-eye image. The display data extraction unit then extracts the display data of the first display area and the second display area from the generated display data, based on the area information of the first and second display areas included in the display control information.
- Only the display data extracted in this way becomes the target of display.
- The shift adjustment unit shift-adjusts the positions of the display data of the first display area and the second display area extracted by the display data extraction unit, based on the disparity information included in the display control information acquired by the display control information acquisition unit.
- The data synthesis unit then superimposes the shift-adjusted display data of the first display area and the second display area on the target frames, indicated by the target frame information included in the display control information, of the stereoscopic image data acquired by the image data acquisition unit, thereby obtaining output stereoscopic image data.
- the output stereoscopic image data is transmitted to an external device by a digital interface unit such as HDMI, for example.
- the output stereoscopic image data causes the display panel to display a left eye image and a right eye image for allowing the user to perceive a stereoscopic image.
- In this way, transmission superimposition information data having left-eye superimposition information data and right-eye superimposition information data corresponding to the transmission format is received together with the stereoscopic image data. The display data of the left-eye superimposition information to be superimposed on the left-eye image data of the stereoscopic image data and the display data of the right-eye superimposition information to be superimposed on the right-eye image data can therefore be generated easily from the transmission superimposition information data, which simplifies the processing.
- In addition, display control information (area information, target frame information, disparity information) relating to the first display area corresponding to the display position of the left-eye superimposition information and the second display area corresponding to the display position of the right-eye superimposition information is received. Only the superimposition information in the first and second display areas can therefore be superimposed and displayed on the target frames.
- Furthermore, parallax can be given to the display positions of the superimposition information in the first and second display areas, so that in displaying superimposition information such as captions, the consistency of perspective with each object in the image can be maintained in an optimum state.
- In the present invention, for example, a superimposition information data identification unit may further be provided that identifies, based on identification information inserted into the multiplexed data stream received by the data reception unit, that the second data stream includes transmission superimposition information data corresponding to the transmission format of the stereoscopic image data. In this case, this identification information makes it possible to determine whether the second data stream includes transmission superimposition information data (stereoscopic image superimposition information data) corresponding to the transmission format of the stereoscopic image data.
- According to the present invention, transmission superimposition information data having left-eye superimposition information data and right-eye superimposition information data corresponding to the transmission format is transmitted together with the stereoscopic image data from the transmission side to the reception side. On the receiving side, the display data of the left-eye superimposition information to be superimposed on the left-eye image data of the stereoscopic image data and the display data of the right-eye superimposition information to be superimposed on the right-eye image data can therefore be generated easily from the transmission superimposition information data, which simplifies the processing.
- In addition, display control information (area information, target frame information, disparity information) relating to the first display area corresponding to the display position of the left-eye superimposition information and the second display area corresponding to the display position of the right-eye superimposition information is transmitted.
- On the receiving side, the superimposition information in the first and second display areas can be superimposed and displayed on the target frames.
- Furthermore, parallax can be given to the display positions of the superimposition information in the first and second display areas, so that in displaying superimposition information such as captions, the consistency of perspective with each object in the image can be maintained in an optimum state.
- FIG. 6 is a diagram for explaining a configuration example (cases A to E) of data (including a parallax information group). It is a figure which shows notionally the production method of the subtitle data for stereoscopic images in case the transmission format of stereoscopic image data is a side-by-side system. It is a figure which shows an example of the region (region) and object (object) by the subtitle data for stereo images, and also a subregion (Subregion). It is a figure which shows the creation example (example 1) of each segment of the subtitle data for stereo images in case the transmission format of stereo image data is a side by side system.
- FIG. 1 It is a figure which shows the creation example (example 2) of each segment of the subtitle data for stereoscopic images in case the transmission format of stereoscopic image data is a side by side system. It is a figure which shows notionally the production method of the subtitle data for stereoscopic images in case the transmission format of stereoscopic image data is a top and bottom system. It is a figure which shows an example of the region (region) and object (object) by the subtitle data for stereo images, and also a subregion (Subregion). It is a figure which shows the creation example (example 1) of each segment of the subtitle data for stereo images in case the transmission format of stereo image data is a top and bottom system.
- For stereoscopic image display using binocular parallax, a figure illustrates the relationship between the display positions of the left and right images of an object on the screen and the reproduction position of the stereoscopic image.
- FIG. 1 shows a configuration example of an image transmission / reception system 10 as an embodiment.
- the image transmission / reception system 10 includes a broadcasting station 100, a set top box (STB) 200, and a television receiver (TV) 300.
- the set top box 200 and the television receiver 300 are connected by a digital interface of HDMI (High-Definition Multimedia Interface).
- the set top box 200 and the television receiver 300 are connected using an HDMI cable 400.
- the set top box 200 is provided with an HDMI terminal 202.
- the television receiver 300 is provided with an HDMI terminal 302.
- One end of the HDMI cable 400 is connected to the HDMI terminal 202 of the set top box 200, and the other end of the HDMI cable 400 is connected to the HDMI terminal 302 of the television receiver 300.
- the broadcasting station 100 transmits the bit stream data BSD on a broadcast wave.
- the broadcast station 100 includes a transmission data generation unit 110 that generates bit stream data BSD.
- the bit stream data BSD includes stereoscopic image data, audio data, superimposition information data, and the like.
- the stereoscopic image data has a predetermined transmission format, and has left-eye image data and right-eye image data for displaying a stereoscopic image.
- the superimposition information is generally subtitles, graphics information, text information, etc., but in this embodiment, it is a subtitle (caption).
- FIG. 2 shows a configuration example of the transmission data generation unit 110 in the broadcast station 100.
- the transmission data generation unit 110 includes cameras 111L and 111R, a video framing unit 112, a parallax vector detection unit 113, a microphone 114, a data extraction unit 115, and changeover switches 116 to 118.
- the transmission data generation unit 110 includes a video encoder 119, an audio encoder 120, a subtitle generation unit 121, a disparity information creation unit 122, a subtitle processing unit 123, a subtitle encoder 125, and a multiplexer 126.
- the camera 111L captures a left eye image and obtains left eye image data for stereoscopic image display.
- the camera 111R captures the right eye image and obtains right eye image data for stereoscopic image display.
- the video framing unit 112 processes the left eye image data obtained by the camera 111L and the right eye image data obtained by the camera 111R into stereoscopic image data (3D image data) corresponding to the transmission format.
- the video framing unit 112 constitutes an image data output unit.
- the first transmission method is the top-and-bottom method. As shown in FIG. 4A, the data of each line of the left eye image data is transmitted in the first half of the vertical direction, and the data of each line of the right eye image data is transmitted in the latter half. In this case, since the lines of the left eye image data and the right eye image data are thinned out to 1/2, the vertical resolution is halved with respect to the original signal.
- the second transmission method is the side-by-side (Side By Side) method, in which the pixel data of the left eye image data is transmitted in the first half in the horizontal direction and the pixel data of the right eye image data is transmitted in the latter half. In this case, the pixel data of each of the left eye image data and the right eye image data is thinned out to 1/2 in the horizontal direction, so the horizontal resolution is halved with respect to the original signal.
- the third transmission method is a frame sequential method, in which left-eye image data and right-eye image data are sequentially switched for each frame and transmitted as shown in FIG. 4 (c).
- This frame sequential method may be referred to as a full frame method or a backward compatible method.
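As a concrete illustration of the three transmission methods above, the packing can be sketched with NumPy; the function and parameter names are assumptions for illustration, not taken from this document:

```python
import numpy as np

def pack_stereo(left, right, fmt):
    """Sketch of packing left/right eye image data into one transmitted frame."""
    if fmt == "top_and_bottom":
        # Lines of each view are thinned out to 1/2: vertical resolution halves.
        return np.concatenate([left[0::2], right[0::2]], axis=0)
    if fmt == "side_by_side":
        # Pixels of each view are thinned out to 1/2: horizontal resolution halves.
        return np.concatenate([left[:, 0::2], right[:, 0::2]], axis=1)
    if fmt == "frame_sequential":
        # Full-resolution views are sent on alternate frames.
        return [left, right]
    raise ValueError("unknown format: " + fmt)
```

For an H x W input, the first two formats return an H x W packed frame in which each view has lost half of its resolution in one direction, matching the halving described above.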
- the parallax vector detection unit 113 detects, for example, a parallax vector for each pixel constituting the image based on the left eye image data and the right eye image data.
- a detection example of a disparity vector will be described.
- the parallax vector of the right eye image with respect to the left eye image will be described.
- the left eye image is a detected image
- the right eye image is a reference image.
- the disparity vectors at the positions (xi, yi) and (xj, yj) are detected.
- a case where a disparity vector at the position of (xi, yi) is detected will be described as an example.
- a 4 × 4, 8 × 8, or 16 × 16 pixel block (parallax detection block) Bi is set in the left eye image with the pixel at the position (xi, yi) at the upper left. Then, a pixel block matching the pixel block Bi is searched for in the right eye image.
- a search range centered on the position (xi, yi) is set in the right eye image, and with each pixel in the search range sequentially set as the pixel of interest, comparison blocks of the same size as the above-described pixel block Bi (for example 4 × 4, 8 × 8, or 16 × 16) are sequentially set.
- between the pixel block Bi and each sequentially set comparison block, the sum of the absolute differences for each corresponding pixel is obtained. When the pixel value of the pixel block Bi is L(x, y) and the pixel value of the comparison block is R(x, y), the sum of absolute differences between the pixel block Bi and a certain comparison block is represented by S = Σ |L(x, y) − R(x, y)|, summed over the pixels of the block.
- when n pixels are included in the search range set in the right eye image, n sums S1 to Sn are finally obtained, and the minimum sum Smin is selected among them. Then the position (xi′, yi′) of the upper-left pixel of the comparison block from which Smin was obtained is taken, and the disparity vector at the position (xi, yi) is detected as (xi′ − xi, yi′ − yi).
- similarly, for the position (xj, yj), a pixel block Bj of, for example, 4 × 4, 8 × 8, or 16 × 16 pixels is set in the left eye image with the pixel at the position (xj, yj) at the upper left, and the disparity vector is detected by the same process.
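The block-matching detection just described (set a pixel block Bi, scan comparison blocks in a search range, keep the one minimising the sum of absolute differences) can be sketched as follows; `detect_disparity` and its parameters are illustrative names, not from this document:

```python
import numpy as np

def detect_disparity(left, right, xi, yi, block=4, search=8):
    """Detect the disparity vector at (xi, yi) by SAD block matching.

    Sets a block Bi with its upper-left pixel at (xi, yi) in the left
    (detection) image and searches the right (reference) image for the
    comparison block minimising S = sum(|L(x, y) - R(x, y)|).
    """
    h, w = left.shape
    bi = left[yi:yi + block, xi:xi + block].astype(np.int64)
    best = None
    for yj in range(max(0, yi - search), min(h - block, yi + search) + 1):
        for xj in range(max(0, xi - search), min(w - block, xi + search) + 1):
            cmp_blk = right[yj:yj + block, xj:xj + block].astype(np.int64)
            s = np.abs(bi - cmp_blk).sum()   # sum of absolute differences
            if best is None or s < best[0]:
                best = (s, xj, yj)
    _, xj, yj = best
    # Disparity vector at (xi, yi) is (xi' - xi, yi' - yi).
    return (xj - xi, yj - yi)
```

A brute-force scan like this is only a sketch of the principle; real detectors restrict or accelerate the search.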
- the microphone 114 detects sound corresponding to the images photographed by the cameras 111L and 111R, and obtains sound data.
- the data extraction unit 115 is used in a state where the data recording medium 115a is detachably mounted.
- the data recording medium 115a is a disk-shaped recording medium, a semiconductor memory, or the like.
- audio data, superimposition information data, and parallax vectors are recorded in association with stereoscopic image data including left-eye image data and right-eye image data.
- the data extraction unit 115 extracts and outputs stereoscopic image data, audio data, and disparity vectors from the data recording medium 115a.
- the data extraction unit 115 constitutes an image data output unit.
- the stereoscopic image data recorded on the data recording medium 115a corresponds to the stereoscopic image data obtained by the video framing unit 112.
- the audio data recorded on the data recording medium 115a corresponds to the audio data obtained by the microphone 114.
- the disparity vector recorded on the data recording medium 115a corresponds to the disparity vector detected by the disparity vector detection unit 113.
- the changeover switch 116 selectively extracts the stereoscopic image data obtained by the video framing unit 112 or the stereoscopic image data output from the data extraction unit 115.
- the changeover switch 116 is connected to the a side in the live mode to take out the stereoscopic image data obtained by the video framing unit 112, and is connected to the b side in the playback mode to take out the stereoscopic image data output from the data extraction unit 115.
- the changeover switch 117 selectively extracts the disparity vector detected by the disparity vector detection unit 113 or the disparity vector output from the data extraction unit 115.
- the changeover switch 117 is connected to the a side in the live mode to take out the disparity vector detected by the disparity vector detection unit 113, and is connected to the b side in the playback mode to take out the disparity vector output from the data extraction unit 115.
- the changeover switch 118 selectively takes out the audio data obtained by the microphone 114 or the audio data output from the data extraction unit 115.
- the changeover switch 118 is connected to the a side in the live mode to take out the audio data obtained by the microphone 114, and is connected to the b side in the playback mode to take out the audio data output from the data extraction unit 115.
- the video encoder 119 performs encoding such as MPEG4-AVC, MPEG2, or VC-1 on the stereoscopic image data extracted by the changeover switch 116 to generate a video data stream (video elementary stream).
- the audio encoder 120 performs encoding such as AC3 or AAC on the audio data extracted by the changeover switch 118 to generate an audio data stream (audio elementary stream).
- the subtitle generation unit 121 generates subtitle data that is DVB (Digital Video Broadcasting) subtitle data. This subtitle data is subtitle data for a two-dimensional image.
- the subtitle generation unit 121 constitutes a superimposition information data output unit.
- the disparity information creating unit 122 performs a downsizing process on the disparity vector (horizontal disparity vector) for each pixel extracted by the changeover switch 117 to create disparity information (a horizontal disparity vector) to be applied to the subtitle.
- the parallax information creation unit 122 constitutes a parallax information output unit. Note that the disparity information applied to the subtitle can be attached in units of pages, regions, or objects. The disparity information does not necessarily have to be generated by the disparity information creating unit 122, and a configuration in which the disparity information is separately supplied from the outside is also possible.
- FIG. 7 shows an example of the downsizing process performed by the parallax information creating unit 122.
- first, the disparity information creating unit 122 obtains a disparity vector for each block using the disparity vector for each pixel, as illustrated in FIG. 7.
- a block corresponds to an upper layer of pixels located at the lowest layer, and is configured by dividing an image (picture) region into a predetermined size in the horizontal direction and the vertical direction.
- the disparity vector of each block is obtained, for example, by selecting the disparity vector having the largest value from the disparity vectors of all the pixels (pixels) existing in the block.
- next, the disparity information creating unit 122 obtains a disparity vector for each group (Group Of Block) using the disparity vector for each block, as illustrated in FIG. 7.
- a group is an upper layer of a block, and is obtained by grouping a plurality of adjacent blocks together.
- each group is composed of four blocks bounded by a broken line frame.
- the disparity vector of each group is obtained, for example, by selecting the disparity vector having the largest value from the disparity vectors of all blocks in the group.
- then, the disparity information creating unit 122 obtains a disparity vector for each partition (Partition) using the disparity vector for each group, as illustrated in FIG. 7.
- the partition is an upper layer of the group and is obtained by grouping a plurality of adjacent groups together.
- each partition is configured by two groups bounded by a broken line frame.
- the disparity vector of each partition is obtained, for example, by selecting the disparity vector having the largest value from the disparity vectors of all groups in the partition.
- the disparity information creating unit 122 obtains a disparity vector of the entire picture (entire image) located in the highest layer using the disparity vector for each partition.
- the entire picture includes four partitions that are bounded by a broken line frame. Then, the disparity vector for the entire picture is obtained, for example, by selecting the disparity vector having the largest value from the disparity vectors for all partitions included in the entire picture.
- in this way, the disparity information creating unit 122 performs the downsizing process on the disparity vectors for the pixels located in the lowest layer, and can thereby obtain the disparity vectors of the respective regions in each layer of the block, group, partition, and entire picture.
- disparity vectors of four layers of blocks, groups, partitions, and entire pictures are obtained in addition to the pixel (pixel) layer.
- the number of hierarchies, the way the areas in each hierarchy are divided, and the number of areas are not limited to this example.
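A minimal sketch of the downsizing rule described above: each step merges tiles of the lower layer and keeps the maximum disparity in each tile. For simplicity this sketch assumes a uniform 2 × 2 grouping at every step, whereas the document groups four blocks into a group and two groups into a partition:

```python
import numpy as np

def downsize_disparity(disp_map, factor=2):
    """One downsizing step: split the map into factor x factor tiles and
    keep the disparity vector having the largest value in each tile.
    Assumes the map dimensions are divisible by `factor`."""
    h, w = disp_map.shape
    tiles = disp_map.reshape(h // factor, factor, w // factor, factor)
    return tiles.max(axis=(1, 3))

# Pixel layer -> block layer -> whole picture:
pixels = np.array([[1, 3, 0, 2],
                   [2, 0, 1, 1],
                   [5, 1, 2, 0],
                   [0, 0, 4, 1]])
blocks = downsize_disparity(pixels)    # per-block maxima
picture = downsize_disparity(blocks)   # disparity of the entire picture
```

Taking the maximum at each layer preserves the largest (nearest-appearing) disparity in each area, which is the selection rule the document states for blocks, groups, partitions, and the whole picture.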
- the subtitle processing unit 123 converts the subtitle data generated by the subtitle generation unit 121 into a stereoscopic image (three-dimensional image) subtitle corresponding to the transmission format of the stereoscopic image data extracted by the changeover switch 116. Convert to data.
- the subtitle processing unit 123 forms a superimposition information data processing unit, and the converted subtitle data for stereoscopic image data forms superimposition information data for transmission.
- the stereoscopic image subtitle data includes left eye subtitle data and right eye subtitle data.
- the left-eye subtitle data corresponds to the left-eye image data included in the above-described stereoscopic image data, and is data for generating, on the receiving side, display data of the left-eye subtitle to be superimposed on that left-eye image data. Likewise, the right-eye subtitle data corresponds to the right-eye image data included in the stereoscopic image data, and is data for generating, on the receiving side, display data of the right-eye subtitle to be superimposed on that right-eye image data.
- the subtitle processing unit 123 can shift at least one of the left-eye subtitle and the right-eye subtitle based on the disparity information (horizontal disparity vector) to be applied to the subtitle from the disparity information creating unit 122, thereby giving parallax between the left-eye subtitle and the right-eye subtitle.
- by giving parallax in this way, the consistency of perspective between the subtitle (caption) and each object in the image can be maintained in an optimum state in the display of the subtitle, without performing parallax processing on the receiving side.
- the subtitle processing unit 123 includes a display control information generation unit 124.
- the display control information generation unit 124 generates display control information related to the subregion.
- the sub-region is an area defined only within the region.
- This subregion includes a left eye subregion (left eye SR) and a right eye subregion (right eye SR).
- hereinafter, the left-eye subregion is referred to as the left eye SR, and the right-eye subregion is referred to as the right eye SR.
- the left-eye subregion is an area set corresponding to the display position of the left-eye subtitle in the region that is the display area of the transmission superimposition information data.
- the right-eye subregion is an area set corresponding to the display position of the right-eye subtitle in the region that is the display area of the transmission superimposition information data.
- the left eye subregion constitutes a first display area
- the right eye subregion constitutes a second display area.
- the regions of the left eye SR and the right eye SR are set for each subtitle data generated by the subtitle generation unit 121, for example, based on a user operation or automatically. In this case, the regions of the left eye SR and the right eye SR are set so that the left eye subtitle in the left eye SR and the right eye subtitle in the right eye SR correspond to each other.
- the display control information includes area information for the left eye SR and area information for the right eye SR.
- the display control information includes information on a target frame that displays a left-eye subtitle included in the left eye SR and information on a target frame that displays a right-eye subtitle included in the right eye SR.
- the information of the target frame displaying the left-eye subtitle included in the left eye SR indicates the frame of the left-eye image, and the information of the target frame displaying the right-eye subtitle included in the right eye SR indicates the frame of the right-eye image.
- the display control information includes disparity information (Disparity) for shifting the display position of the left-eye subtitle included in the left eye SR and disparity information for shifting the display position of the right-eye subtitle included in the right eye SR. These pieces of parallax information are for giving parallax between the left-eye subtitle included in the left eye SR and the right-eye subtitle included in the right eye SR.
- the display control information generation unit 124 obtains the disparity information for shift adjustment to be included in the above-described display control information based on, for example, the disparity information (horizontal disparity vector) to be applied to the subtitle created by the disparity information creating unit 122.
- the disparity information “Disparity1” of the left eye SR and the disparity information “Disparity2” of the right eye SR are determined to have the same absolute value, with the difference between them corresponding to the disparity information (Disparity) to be applied to the subtitle.
- when the transmission format of the stereoscopic image data is the side-by-side format, the value corresponding to the disparity information (Disparity) is “Disparity / 2”; for the other transmission formats, the value corresponding to the disparity information (Disparity) is “Disparity”.
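The rule above can be sketched as follows: the two subregion disparities have equal absolute values, and their difference equals the value derived from the subtitle disparity, halved for side-by-side because that format halves the horizontal resolution. The sign convention and names are assumptions for illustration:

```python
def subregion_disparities(disparity, side_by_side):
    """Split a subtitle disparity into Disparity1 (left eye SR) and
    Disparity2 (right eye SR) with equal absolute values whose
    difference corresponds to the disparity to apply."""
    d = disparity / 2 if side_by_side else disparity
    disparity1 = -d / 2   # shift applied to the left-eye subtitle
    disparity2 = +d / 2   # shift applied to the right-eye subtitle
    return disparity1, disparity2
```

Splitting the shift symmetrically between the two eyes is one way to satisfy the equal-absolute-value condition; the document does not fix the sign convention.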
- the display control information generation unit 124 generates parallax information included in the display control information so as to have sub-pixel accuracy.
- the parallax information includes an integer part and a decimal part, as illustrated in the figure.
- a subpixel is a subdivision of a pixel (integer pixel) constituting a digital image. Since the disparity information has sub-pixel accuracy, the receiving side can shift the display positions of the left-eye subtitle in the left eye SR and the right-eye subtitle in the right eye SR with sub-pixel accuracy.
- FIG. 8B schematically shows an example of shift adjustment with subpixel accuracy, in which the display position of the subtitle in the region partition is shifted from the solid-line frame position to the broken-line frame position.
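One way to represent such sub-pixel disparity with an integer part and a decimal part is a fixed-point encoding; the choice of 4 fractional bits here is an assumption for illustration, not taken from this document:

```python
def encode_subpixel_disparity(value, frac_bits=4):
    """Split a sub-pixel disparity into an integer part and a
    fractional part carried as a fixed-point value."""
    integer = int(value)                               # integer-pixel part
    fraction = round((value - integer) * (1 << frac_bits))  # sub-pixel part
    return integer, fraction

def decode_subpixel_disparity(integer, fraction, frac_bits=4):
    """Recover the sub-pixel disparity on the receiving side."""
    return integer + fraction / (1 << frac_bits)
```

With 4 fractional bits the receiver can position the subtitle in steps of 1/16 of a pixel, which is the kind of sub-pixel shift adjustment the figure illustrates.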
- DDS: display definition segment
- PCS: page composition segment
- RCS: region composition segment
- CDS: CLUT definition segment
- ODS: object data segment
- a segment of SCS (Subregion composition segment) is newly defined. Then, the display control information generated by the display control information generation unit 124 as described above is inserted into this SCS segment. Details of the processing of the subtitle processing unit 123 will be described later.
- the subtitle encoder 125 generates a subtitle data stream (subtitle elementary stream) including the subtitle data for stereoscopic images output from the subtitle processing unit 123 and display control information.
- the multiplexer 126 multiplexes the data streams from the video encoder 119, the audio encoder 120, and the subtitle encoder 125, and obtains a multiplexed data stream as bit stream data (transport stream) BSD.
- the operation of the transmission data generation unit 110 shown in FIG. 2 will be briefly described.
- the camera 111L captures a left eye image.
- the left eye image data for stereoscopic image display obtained by the camera 111L is supplied to the video framing unit 112.
- the camera 111R captures a right eye image.
- Right-eye image data for stereoscopic image display obtained by the camera 111R is supplied to the video framing unit 112.
- the left-eye image data and the right-eye image data are processed into a state corresponding to the transmission format, and stereoscopic image data is obtained (see FIGS. 4A to 4C).
- the stereoscopic image data obtained by the video framing unit 112 is supplied to the fixed terminal on the a side of the changeover switch 116.
- the stereoscopic image data obtained by the data extraction unit 115 is supplied to the fixed terminal on the b side of the changeover switch 116.
- in the live mode, the changeover switch 116 is connected to the a side, and the stereoscopic image data obtained by the video framing unit 112 is taken out from the changeover switch 116.
- in the playback mode, the changeover switch 116 is connected to the b side, and the stereoscopic image data output from the data extraction unit 115 is taken out from the changeover switch 116.
- the stereoscopic image data extracted by the changeover switch 116 is supplied to the video encoder 119.
- the stereoscopic image data is encoded by MPEG4-AVC, MPEG2, VC-1, or the like, and a video data stream including the encoded video data is generated. This video data stream is supplied to the multiplexer 126.
- the audio data obtained by the microphone 114 is supplied to the fixed terminal on the a side of the changeover switch 118. Also, the audio data obtained by the data extraction unit 115 is supplied to the fixed terminal on the b side of the changeover switch 118.
- in the live mode, the changeover switch 118 is connected to the a side, and the audio data obtained by the microphone 114 is extracted from the changeover switch 118.
- in the playback mode, the changeover switch 118 is connected to the b side, and the audio data output from the data extraction unit 115 is taken out from the changeover switch 118.
- the audio data extracted by the changeover switch 118 is supplied to the audio encoder 120.
- the audio encoder 120 performs encoding such as MPEG-2Audio AAC or MPEG-4 AAC on the audio data, and generates an audio data stream including the encoded audio data. This audio data stream is supplied to the multiplexer 126.
- Left eye image data and right eye image data obtained by the cameras 111L and 111R are supplied to the parallax vector detection unit 113 through the video framing unit 112.
- the disparity vector detection unit 113 detects disparity vectors for each pixel (pixel) based on the left eye image data and the right eye image data. This disparity vector is supplied to the fixed terminal on the a side of the changeover switch 117. Further, the disparity vector for each pixel (pixel) output from the data extraction unit 115 is supplied to a fixed terminal on the b side of the changeover switch 117.
- in the live mode, the changeover switch 117 is connected to the a side, and the disparity vector for each pixel obtained by the disparity vector detection unit 113 is extracted from the changeover switch 117.
- in the playback mode, the changeover switch 117 is connected to the b side, and the disparity vector for each pixel output from the data extraction unit 115 is extracted from the changeover switch 117.
- the subtitle generation unit 121 generates subtitle data (for two-dimensional images) that is DVB subtitle data. This subtitle data is supplied to the parallax information creation unit 122 and the subtitle processing unit 123.
- the disparity vector for each pixel (pixel) extracted by the changeover switch 117 is supplied to the disparity information creating unit 122.
- in the subtitle processing unit 123, the subtitle data for two-dimensional images generated by the subtitle generation unit 121 is converted into stereoscopic image subtitle data corresponding to the transmission format of the stereoscopic image data extracted by the changeover switch 116 described above.
- the stereoscopic image subtitle data includes left-eye subtitle data and right-eye subtitle data.
- the subtitle processing unit 123 shifts at least one of the left-eye subtitle and the right-eye subtitle based on the disparity information to be applied to the subtitle from the disparity information creation unit 122, so that parallax is given between the left-eye subtitle and the right-eye subtitle.
- the display control information generation unit 124 of the subtitle processing unit 123 generates display control information (region information, target frame information, parallax information) related to the subregion (Subregion).
- the subregion includes the left eye subregion (left eye SR) and the right eye subregion (right eye SR). Therefore, area information, target frame information, and parallax information of the left eye SR and right eye SR are generated as display control information.
- the left eye SR is set, for example based on a user operation or automatically, corresponding to the display position of the left-eye subtitle within the region that is the display area of the superimposition information data for transmission.
- the right eye SR is set corresponding to the display position of the right eye subtitle, for example, based on a user operation or automatically in a region that is a display area of the superimposed information data for transmission.
- the stereoscopic image subtitle data and display control information obtained by the subtitle processing unit 123 are supplied to the subtitle encoder 125.
- the subtitle encoder 125 generates a subtitle data stream including stereoscopic image subtitle data and display control information.
- the subtitle data stream includes a newly defined SCS segment including display control information, as well as segments such as DDS, PCS, RCS, CDS, and ODS into which stereoscopic image subtitle data is inserted.
- each data stream from the video encoder 119, the audio encoder 120, and the subtitle encoder 125 is supplied to the multiplexer 126.
- each data stream is packetized and multiplexed to obtain a multiplexed data stream as bit stream data (transport stream) BSD.
- FIG. 9 shows a configuration example of a transport stream (bit stream data).
- This transport stream includes PES packets obtained by packetizing each elementary stream.
- in this configuration example, a PES packet “Video PES” of the video elementary stream, a PES packet “Audio PES” of the audio elementary stream, and a PES packet “Subtitle PES” of the subtitle elementary stream are included.
- the subtitle elementary stream includes stereoscopic image subtitle data and display control information.
- This stream includes SCS segments including newly defined display control information, as well as conventionally known segments such as DDS, PCS, RCS, CDS, and ODS.
- FIG. 10 shows the structure of PCS (page_composition_segment).
- the segment type of this PCS is “0x10”. “Region_horizontal_address” and “region_vertical_address” indicate the start position of the region.
- the structure of other segments such as DDS, RCS, CDS, and ODS is not shown here. The DDS segment type is “0x14”, the RCS segment type is “0x11”, the CDS segment type is “0x12”, and the ODS segment type is “0x13”.
- the SCS segment type is “0x49”. The detailed structure of the SCS segment will be described later.
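The segment types listed above can be summarised in a small lookup table (a sketch for reference; only the type values stated in this document are included):

```python
# Segment-type values from the document, including the newly defined SCS.
SEGMENT_TYPES = {
    0x10: "PCS (page composition segment)",
    0x11: "RCS (region composition segment)",
    0x12: "CDS (CLUT definition segment)",
    0x13: "ODS (object data segment)",
    0x14: "DDS (display definition segment)",
    0x49: "SCS (subregion composition segment)",  # newly defined
}

def segment_name(segment_type):
    """Return a readable name for a segment-type byte."""
    return SEGMENT_TYPES.get(segment_type, "unknown")
```

A demultiplexer on the receiving side could use such a table to dispatch each segment of the subtitle data stream to the appropriate parser.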
- the transport stream also includes a PMT (Program Map Table) as PSI (Program Specific Information).
- This PSI is information describing to which program each elementary stream included in the transport stream belongs.
- the transport stream includes an EIT (Event Information Table) as SI (Service Information) for managing each event.
- the PMT has a program descriptor (ProgramDescriptor) that describes information related to the entire program.
- the PMT includes an elementary loop having information related to each elementary stream. In this configuration example, there are a video elementary loop, an audio elementary loop, and a subtitle elementary loop.
- information such as a packet identifier (PID) is arranged for each stream, and a descriptor (descriptor) describing information related to the elementary stream is also arranged, although not shown.
- the component descriptor (Component_Descriptor) is inserted under the EIT.
- the subtitle data stream includes the subtitle data for stereoscopic images.
- “stream_content” of the “component_descriptor” indicating the distribution content indicates a subtitle.
- the subtitle processing unit 123 converts the subtitle data for two-dimensional images into subtitle data for stereoscopic images.
- the subtitle processing unit 123 generates display control information (including region information of the left eye SR and right eye SR, target frame information, and parallax information) in the display control information generation unit 124.
- as the method of transmitting these segments, “Case A” or “Case B” can be considered.
- in Case A, a series of segments related to subtitle display (DDS, PCS, RCS, CDS, ODS, SCS, and EDS) is created before the start of the predetermined number of frame periods in which the subtitles are displayed, and is sent together with time information (PTS) added.
- PTS: Presentation Time Stamp
- hereinafter, the predetermined number of frame periods in which the subtitles are displayed is referred to as the subtitle display period.
- in Case B, a series of segments related to subtitle display (DDS, PCS, RCS, CDS, ODS, and SCS) is created before the start of the subtitle display period, and is transmitted at once with time information (PTS) added.
- SCS segments with updated disparity information are sequentially created, and time information (PTS) is added and transmitted.
- finally, an EDS segment is also created and transmitted with time information (PTS) added.
- this display on/off control is control that, when performing display based on the disparity information in the SCS segment of a certain frame, turns on (validates) that display and turns off (invalidates) the display based on the disparity information in the SCS segment of the previous frame.
- the SCS segment includes command information for controlling on / off of display (details will be described later).
- An example of display on / off control on the receiving side will be described with reference to FIGS. 14 and 15.
- FIG. 14 shows an example of SCS segments sequentially transmitted to the receiving side.
- SCSs corresponding to the T0 frame, T1 frame, and T2 frame are sequentially sent.
- FIG. 15 shows a shift example of the display position of the left eye subtitle in the left eye SR and the right eye subtitle in the right eye SR by the SCS corresponding to each of the T0 frame, the T1 frame, and the T2 frame.
- in the SCS corresponding to the T0 frame, disparity information (Disparity_0) for obtaining the display position SR0 of the left-eye subtitle in the left eye SR and command information (Display_ON) for turning on (validating) the display at the display position SR0 are included.
- likewise, disparity information (Disparity_1) for obtaining the display position SR1 of the right-eye subtitle in the right eye SR and command information (Display_ON) for turning on (validating) the display at the display position SR1 are included.
- the left eye subtitle in the left eye SR is displayed (superposed) at the display position SR0 on the left eye image.
- the right eye subtitle in the right eye SR is displayed (superposed) at the display position SR1 on the right eye image.
- the SCS of the T1 frame includes command information (Display_OFF) for turning off (invalidating) the display at the display positions SR0 and SR1. The SCS of the T1 frame also includes disparity information (Disparity_2) for obtaining the display position SR2 of the subtitle in the left eye SR and command information (Display_ON) for turning on (validating) the display at the display position SR2, as well as disparity information (Disparity_3) for obtaining the display position SR3 of the subtitle in the right eye SR and command information (Display_ON) for turning on (validating) the display at the display position SR3.
- Accordingly, the display at the display position SR0 on the left eye image and the display at the display position SR1 on the right eye image are turned off (disabled).
- the left eye subtitle in the left eye SR is displayed (superposed) at the display position SR2 on the left eye image.
- the right eye subtitle in the right eye SR is displayed (superposed) at the display position SR3 on the right eye image.
- the SCS of the T2 frame includes command information (Display_OFF) for turning off (invalidating) the display positions SR2 and SR3.
- The SCS of the T2 frame also includes disparity information (Disparity_4) for obtaining the display position SR4 of the subtitle in the left eye SR, and command information (Display_ON) for turning on (validating) the display at the display position SR4.
- Likewise, it includes disparity information (Disparity_5) for obtaining the display position SR5 of the subtitle in the right eye SR, and command information (Display_ON) for turning on (validating) the display at the display position SR5.
- Accordingly, the display at the display position SR2 on the left-eye image and the display at the display position SR3 on the right-eye image are turned off (disabled).
- the left eye subtitle in the left eye SR is displayed (superposed) at the display position SR4 on the left eye image.
- the right eye subtitle in the right eye SR is displayed (superposed) at the display position SR5 on the right eye image.
- FIG. 16 shows a display example of the left-eye subtitle and right-eye subtitle on the receiving side when, for example, the command information for controlling display on/off is not included in the SCS segments.
- In this case, the subtitle in the left eye SR remains displayed (superimposed) at the display positions SR0, SR2, and SR4 in an overlapping manner, and the subtitle in the right eye SR remains displayed (superimposed) at the display positions SR1, SR3, and SR5 in an overlapping manner.
- That is, the dynamic change of the display positions of the left eye subtitle in the left eye SR and the right eye subtitle in the right eye SR is not performed correctly.
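The per-frame on/off control walked through above (T0, T1, T2) can be sketched as follows. The `(command, position, disparity)` tuple model, the function name `apply_scs`, and all example values are illustrative assumptions, not the actual SCS syntax.

```python
# Sketch of the receiver-side display on/off control described above.
# The (command, position, disparity) tuple model and all names here are
# illustrative assumptions, not the actual SCS segment syntax.

def apply_scs(active, commands):
    """Apply one frame's SCS commands to the set of active displays.

    active   -- dict mapping display position id -> current disparity
    commands -- list of ("ON" | "OFF", position_id, disparity) tuples
    """
    for cmd, pos, disparity in commands:
        if cmd == "ON":        # Display_ON: show subtitle at this position
            active[pos] = disparity
        elif cmd == "OFF":     # Display_OFF: erase the previous position
            active.pop(pos, None)
    return active

# T0 frame: turn on SR0 (left eye SR) and SR1 (right eye SR).
active = apply_scs({}, [("ON", "SR0", 10), ("ON", "SR1", 10)])
# T1 frame: turn off SR0/SR1, turn on SR2/SR3 with updated disparity,
# so the old positions do not remain superimposed.
active = apply_scs(active, [("OFF", "SR0", 0), ("OFF", "SR1", 0),
                            ("ON", "SR2", 12), ("ON", "SR3", 12)])
```

Without the Display_OFF commands, SR0 and SR1 would stay in `active` and the subtitles would pile up, which is exactly the FIG. 16 failure case.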
- FIG. 17 conceptually shows a method for creating stereoscopic image subtitle data when the transmission format of stereoscopic image data is the side-by-side format.
- FIG. 17A shows a region based on subtitle data for a two-dimensional image. In this example, the region includes three objects.
- the subtitle processing unit 123 converts the size of the region based on the above-described subtitle data for a two-dimensional image into a size suitable for the side-by-side method, as illustrated in FIG. 17, and generates bitmap data of that size.
- the subtitle processing unit 123 uses the bitmap data after the size conversion as a component of the region in the stereoscopic image subtitle data. That is, the bitmap data after the size conversion is an object corresponding to the left eye subtitle in the region and an object corresponding to the right eye subtitle in the region.
- the subtitle processing unit 123 converts the subtitle data for the two-dimensional image into subtitle data for the stereoscopic image, and creates segments such as DDS, PCS, RCS, CDS, and ODS corresponding to the stereoscopic image subtitle data.
- the subtitle processing unit 123 sets the left eye SR and the right eye SR on the region of the stereoscopic image subtitle data, either automatically or based on a user operation.
- the left eye SR is set in an area including an object corresponding to the left eye subtitle.
- the right eye SR is set in an area including an object corresponding to the right eye subtitle.
- the subtitle processing unit 123 creates an SCS segment including the region information, target frame information, and disparity information of the left eye SR and right eye SR set as described above. For example, the subtitle processing unit 123 creates a single SCS that includes the region information, target frame information, and disparity information of the left eye SR and the right eye SR in common, or creates separate SCS segments, each including the region information, target frame information, and disparity information of one of the left eye SR and the right eye SR.
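The two packaging choices just described (one common SCS for both subregions, or one SCS per subregion) might be modeled as below. The dict layout, field names, and example values are assumptions for illustration, not the segment syntax.

```python
# Sketch of the two SCS packaging options described above. The dict
# layout and field names are illustrative assumptions.

def make_scs(subregions):
    """Build one SCS carrying display control info for the given SRs."""
    return {"segment_type": "SCS", "subregions": list(subregions)}

left_sr = {"region": "Position1", "target_frame": 0, "disparity": 8}
right_sr = {"region": "Position2", "target_frame": 1, "disparity": 8}

# Option 1: a single SCS holding both SRs in common.
common = make_scs([left_sr, right_sr])
# Option 2: separate SCS segments, one per SR.
separate = [make_scs([left_sr]), make_scs([right_sr])]
```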
- FIG. 18 shows an example of a region and an object based on stereoscopic image subtitle data created as described above.
- the start position of the region is “Region_address”.
- FIG. 18 shows an example of the left eye SR and the right eye SR set as described above.
- FIG. 19 shows a creation example (example 1) of each segment of stereoscopic image subtitle data when the transmission format of stereoscopic image data is the side-by-side format.
- the start position “object_position1” of the object on the left eye image side and the start position “object_position2” of the object on the right eye image side are designated.
- SCS (Subregion Composition Segment) segments for the left eye SR and the right eye SR are created separately.
- the start position (Subregion_Position1) of the left eye SR is specified.
- In the SCS for the right eye SR, the target frame information (Target_Frame1), the disparity information (Disparity2), and the display on/off command information (Command2) of the right eye SR are specified.
- FIG. 21 conceptually shows a method of creating stereoscopic image subtitle data when the transmission format of stereoscopic image data is the top-and-bottom method.
- FIG. 21A shows a region based on subtitle data for a two-dimensional image. In this example, the region includes three objects.
- the subtitle processing unit 123 uses the bitmap data after the size conversion as a component of the region of the stereoscopic image subtitle data. That is, the bitmap data after the size conversion is set as a region object on the left eye image (left view) side and a region object on the right eye image (right view) side.
- the subtitle processing unit 123 converts the subtitle data for the two-dimensional image into subtitle data for the stereoscopic image, and creates segments such as PCS, RCS, CDS, and ODS corresponding to the stereoscopic image subtitle data.
- the subtitle processing unit 123 sets the left eye SR and the right eye SR on the region of the stereoscopic image subtitle data, either automatically or based on a user operation.
- the left eye SR is set to an area including an object in the region on the left eye image side.
- the right eye SR is set to an area including an object in the region on the right eye image side.
- FIG. 22 shows an example of a region (region) and an object (object) based on stereoscopic image subtitle data created as described above.
- the start position of the region on the left eye image (left view) side is “Region_address1”
- the start position of the region on the right eye image (right view) side is “Region_address2”.
- FIG. 22 shows an example of the left eye SR and the right eye SR set as described above.
- FIG. 23 shows a creation example (Example 1) of each segment of stereoscopic image subtitle data when the transmission format of stereoscopic image data is the top-and-bottom method.
- PCS (page_composition_segment)
- SCS segments for the left eye SR and the right eye SR are created separately.
- the start position (Subregion_Position1) of the left eye SR is specified.
- In the SCS for the right eye SR, the target frame information (Target_Frame1), the disparity information (Disparity2), and the display on/off command information (Command2) of the right eye SR are specified.
- FIG. 24 shows another example of creation of each segment of stereoscopic image subtitle data when the transmission format of stereoscopic image data is the top-and-bottom method (example 2).
- PCS, RCS, CDS, and ODS segments are created as in the creation example (example 1) shown in FIG. 23.
- the SCS for the left eye SR and the right eye SR is created in common. That is, the various information of the left eye SR and the right eye SR is included in a common SCS.
- FIG. 25 conceptually shows a method of creating stereoscopic image subtitle data when the transmission format of stereoscopic image data is a frame sequential method.
- FIG. 25A shows a region based on subtitle data for a two-dimensional image. In this example, the region includes one object.
- the 2D image subtitle data is used as it is as the stereoscopic image subtitle data.
- segments such as DDS, PCS, RCS, and ODS corresponding to subtitle data for 2D images become segments such as DDS, PCS, RCS, and ODS corresponding to subtitle data for stereoscopic images as they are.
- the subtitle processing unit 123 sets the left eye SR and the right eye SR on the region of the stereoscopic image subtitle data, either automatically or based on a user operation.
- the left eye SR is set in an area including an object corresponding to the left eye subtitle.
- the right eye SR is set in an area including an object corresponding to the right eye subtitle.
- the subtitle processing unit 123 creates an SCS segment including the region information, target frame information, and disparity information of the left eye SR and right eye SR set as described above. For example, the subtitle processing unit 123 creates a single SCS that includes the region information, target frame information, and disparity information of the left eye SR and the right eye SR in common, or creates separate SCS segments, each including the region information, target frame information, and disparity information of one of the left eye SR and the right eye SR.
- FIG. 26 shows an example of the left eye SR and the right eye SR set as described above.
- FIG. 27 shows a creation example (example 1) of each segment of stereoscopic image subtitle data when the transmission format of stereoscopic image data is the frame sequential method.
- the start position “object_position1” of the object is designated.
- SCS segments for the left eye SR and the right eye SR are created separately.
- the start position (Subregion_Position1) of the left eye SR is specified.
- In the SCS for the right eye SR, the target frame information (Target_Frame1), the disparity information (Disparity2), and the display on/off command information (Command2) of the right eye SR are specified.
- FIG. 28 shows another example of creation of each segment of stereoscopic image subtitle data (example 2) when the transmission format of stereoscopic image data is the frame sequential method.
- PCS, RCS, and ODS segments are created as in the creation example (example 1) shown in FIG.
- the SCS of the left eye SR and the right eye SR is created in common. That is, the various information of the left eye SR and the right eye SR is included in the common SCS.
- FIG. 29 and FIG. 30 show a structure example (syntax) of SCS (Subregion Composition segment).
- FIG. 31 shows the main data definition contents (semantics) of the SCS.
- This structure includes information of “Sync_byte”, “segment_type”, “page_id”, and “segment_length”.
- “Segment_type” is 8-bit data indicating the segment type, and is “0x49” indicating SCS here (see FIG. 11).
- Segment_length is 8-bit data indicating the length (size) of the segment.
- FIG. 30 shows a part including substantial information of the SCS.
- display control information for the left eye SR and the right eye SR, that is, area information, target frame information, parallax information, and display on/off command information for each, can be transmitted.
- display control information for an arbitrary number of subregions can be held.
- “Subregion_disparity_integer_part” is 8-bit information indicating an integer pixel (pixel) precision part (integer part) of disparity information (disparity) for shifting the display position of the corresponding subregion in the horizontal direction.
- “Subregion_disparity_fractional_part” is 4-bit information indicating the sub-pixel precision part (fractional part) of the disparity information (disparity) for shifting the display position of the corresponding subregion in the horizontal direction.
- As described above, the disparity information (disparity) is information for shifting the display position of the corresponding subregion, thereby giving parallax between the display positions of the left-eye subtitle in the left eye SR and the right-eye subtitle in the right eye SR.
- “Subregion_horizontal_position” is 16-bit information indicating the position of the left end of the subregion, which is a rectangular region.
- “Subregion_vertical_position” is 16-bit information indicating the position of the upper end of the subregion, which is a rectangular region.
- “Subregion_width” is 16-bit information indicating the horizontal size (number of pixels) of the subregion, which is a rectangular region.
- “Subregion_height” is 16-bit information indicating the vertical size (number of pixels) of the subregion, which is a rectangular region.
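A decoding sketch for the disparity fields just listed. Treating the 8-bit integer part as two's complement, and weighting the 4-bit fractional part as sixteenths of a pixel, are assumptions made for illustration; the segment syntax shown here does not spell out the encoding.

```python
# Sketch: decoding the sub-pixel disparity carried in an SCS subregion,
# per the field widths described above. The two's-complement integer
# part and the 1/16-pixel weighting of the 4-bit fraction are
# assumptions, not taken from the segment syntax itself.

def subregion_disparity(integer_part: int, fractional_part: int) -> float:
    """Combine the 8-bit integer-pixel part and the 4-bit sub-pixel part."""
    if not 0 <= fractional_part < 16:
        raise ValueError("fractional part is a 4-bit field")
    # Assume two's complement so subtitles can be shifted left
    # (negative) as well as right (positive).
    if integer_part >= 128:
        integer_part -= 256
    return integer_part + fractional_part / 16.0
```

For example, an integer part of 3 with a fractional part of 8 would decode to a 3.5-pixel shift, which is what allows the smooth sub-pixel shift adjustment mentioned later in this description.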
- FIG. 32 shows a flow of stereoscopic image data and subtitle data (including display control information) from the broadcasting station 100 to the television receiver 300 via the set top box 200 or directly from the broadcasting station 100 to the television receiver 300.
- the broadcast station 100 generates stereoscopic image subtitle data in accordance with a side-by-side format.
- the stereoscopic image data is included in the video data stream and transmitted, and the stereoscopic image subtitle data is included in the subtitle data stream and transmitted.
- A case where stereoscopic image data and subtitle data (including display control information) are sent from the broadcasting station 100 to the set top box 200 and the set top box 200 is a legacy 2D-compatible device (Legacy 2D STB) will be described.
- the set top box 200 generates region display data for displaying the left-eye subtitle and the right-eye subtitle based on the subtitle data (excluding the subregion display control information), and superimposes the display data on the stereoscopic image data to obtain output stereoscopic image data. In this case, the superimposition position is the position of the region.
- the set top box 200 transmits the output stereoscopic image data to the television receiver 300 through, for example, an HDMI digital interface.
- the transmission format of the stereoscopic image data from the set top box 200 to the television receiver 300 is, for example, a side-by-side system.
- When the television receiver 300 is a 3D-compatible device (3D TV), it performs 3D signal processing on the side-by-side stereoscopic image data sent from the set-top box 200, and generates left-eye image data and right-eye image data on which the subtitle is superimposed. The television receiver 300 then displays binocular parallax images (a left-eye image and a right-eye image) for allowing the user to recognize a stereoscopic image on a display panel such as an LCD.
- A case where stereoscopic image data and subtitle data (including display control information) are sent from the broadcast station 100 to the set top box 200 and the set top box 200 is a 3D-compatible device (3D STB) will be described.
- the set top box 200 generates region display data for displaying the left-eye subtitle and the right-eye subtitle based on the subtitle data (excluding subregion display control information).
- the set top box 200 extracts display data corresponding to the left eye SR and display data corresponding to the right eye SR from the display data of this region.
- the set top box 200 obtains output stereoscopic image data by superimposing display data corresponding to the left eye SR and right eye SR on the stereoscopic image data.
- the display data corresponding to the left eye SR is superimposed on a frame portion (left eye image frame portion) indicated by frame0 which is target frame information of the left eye SR.
- the display data corresponding to the right eye SR is superimposed on a frame portion (right eye image frame portion) indicated by frame1 which is target frame information of the right eye SR.
- In this case, the display data corresponding to the left eye SR is superimposed at the position indicated by Position1, which is the region information of the left eye SR in the side-by-side stereoscopic image data, shifted by half of Disparity1, which is the disparity information of the left eye SR.
- Similarly, the display data corresponding to the right eye SR is superimposed at the position indicated by Position2, which is the region information of the right eye SR in the side-by-side stereoscopic image data, shifted by half of Disparity2, which is the disparity information of the right eye SR.
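The two superimposition positions just described can be sketched as below. The names and example values are illustrative; the disparity is halved because each half of a side-by-side picture carries half the full horizontal resolution.

```python
# Sketch of the side-by-side superimposition positions described above.
# Names and values are illustrative; the disparity is halved because a
# side-by-side half-picture has half the full horizontal resolution.

def sbs_overlay_x(position: int, disparity: int) -> int:
    """Horizontal overlay position inside one side-by-side half-picture."""
    return position + disparity // 2

# Left eye SR: superimposed at Position1 shifted by half of Disparity1.
left_x = sbs_overlay_x(100, 8)    # Position1 = 100, Disparity1 = 8
# Right eye SR: superimposed at Position2 shifted by half of Disparity2.
right_x = sbs_overlay_x(740, 8)   # Position2 = 740, Disparity2 = 8
```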
- the set top box 200 transmits the output stereoscopic image data obtained as described above to the television receiver 300 through, for example, an HDMI digital interface.
- the transmission format of the stereoscopic image data from the set top box 200 to the television receiver 300 is, for example, a side-by-side system.
- When the television receiver 300 is a 3D-compatible device (3D TV), it performs 3D signal processing on the side-by-side stereoscopic image data sent from the set-top box 200, and generates left-eye image data and right-eye image data on which the subtitle is superimposed. The television receiver 300 then displays binocular parallax images (a left-eye image and a right-eye image) for allowing the user to recognize a stereoscopic image on a display panel such as an LCD.
- the television receiver 300 generates region display data for displaying the left-eye subtitle and the right-eye subtitle based on the subtitle data (excluding subregion display control information). Then, the television receiver 300 extracts display data corresponding to the left eye SR and display data (right eye display data) corresponding to the right eye SR from the display data of this region.
- the television receiver 300 scales the display data corresponding to the left eye SR twice in the horizontal direction to obtain left-eye display data corresponding to full resolution. Then, the television receiver 300 superimposes this left-eye display data on the full-resolution left-eye image data corresponding to frame0, which is the target frame information of the left eye SR. That is, the television receiver 300 superimposes the left-eye display data on full-resolution left-eye image data obtained by scaling the left-eye image portion of the side-by-side stereoscopic image data twice in the horizontal direction, and generates left-eye image data on which the subtitle is superimposed.
- the television receiver 300 scales the display data corresponding to the right eye SR twice in the horizontal direction to obtain right-eye display data corresponding to full resolution. Then, the television receiver 300 superimposes this right-eye display data on the full-resolution right-eye image data corresponding to frame1, which is the target frame information of the right eye SR. That is, the television receiver 300 superimposes the right-eye display data on full-resolution right-eye image data obtained by scaling the right-eye image portion of the side-by-side stereoscopic image data twice in the horizontal direction, and generates right-eye image data on which the subtitle is superimposed.
- The left-eye display data is superimposed at the position obtained by doubling Position1, which is the region information of the left eye SR, in the full-resolution left-eye image data, and further shifting by Disparity1, which is the disparity information of the left eye SR.
- The right-eye display data is superimposed at the position obtained by subtracting H/2 from Position2, which is the region information of the right eye SR, doubling the result in the full-resolution right-eye image data, and further shifting by Disparity2, which is the disparity information of the right eye SR.
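The two full-resolution positions described above can be sketched as follows, with H the full picture width. All names and the 1920-pixel example are illustrative assumptions.

```python
# Sketch of the full-resolution superimposition positions described
# above. H is the full picture width; names and values are illustrative.
# The left-eye position is simply doubled; the right-eye position first
# drops the H/2 offset of the right half-picture before doubling. Each
# is then shifted by its own disparity.

def full_res_left_x(position1: int, disparity1: int) -> int:
    return 2 * position1 + disparity1

def full_res_right_x(position2: int, disparity2: int, h: int) -> int:
    return 2 * (position2 - h // 2) + disparity2

# Example with a 1920-pixel-wide picture (H = 1920):
lx = full_res_left_x(100, 8)            # 2*100 + 8
rx = full_res_right_x(1060, 8, 1920)    # (1060 - 960)*2 + 8
```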
- Based on the left-eye image data and the right-eye image data on which the subtitles generated as described above are superimposed, the television receiver 300 displays binocular parallax images (a left-eye image and a right-eye image) for allowing the user to recognize a stereoscopic image on a display panel such as an LCD.
- FIG. 33 shows a flow of stereoscopic image data and subtitle data (including display control information) from the broadcast station 100 to the television receiver 300 via the set top box 200 or directly from the broadcast station 100 to the television receiver 300.
- the broadcasting station 100 generates stereoscopic image subtitle data that conforms to the MVC (Multi-view Video Coding) method.
- stereoscopic image data is composed of base-view image data (left-eye image data) and non-base-view image data (right-eye image data).
- the stereoscopic image data is included in the video data stream and transmitted, and the stereoscopic image subtitle data is included in the subtitle data stream and transmitted.
- the set-top box 200 generates region display data for displaying the left-eye subtitle and the right-eye subtitle based on the subtitle data (excluding the subregion display control information), and superimposes the display data on the base-view (left-eye) image data to obtain output image data.
- the overlapping position is the position of the region.
- the set top box 200 transmits the output image data to the television receiver 300 through, for example, an HDMI digital interface.
- the television receiver 300 displays a 2D image on the display panel regardless of whether the device is a 2D compatible device (2D TV) or a 3D compatible device (3D TV).
- A case where stereoscopic image data and subtitle data (including display control information) are sent from the broadcast station 100 to the set top box 200 and the set top box 200 is a 3D-compatible device (3D STB) will be described.
- the set top box 200 generates region display data for displaying the left-eye subtitle and the right-eye subtitle based on the subtitle data (excluding subregion display control information).
- the set top box 200 extracts display data corresponding to the left eye SR and display data corresponding to the right eye SR from the display data of this region.
- the set top box 200 superimposes the display data corresponding to the left eye SR on the image data of the base view (left eye image) indicated by frame0, which is the target frame information of the left eye SR, thereby obtaining output image data of the base view (left eye image) on which the left-eye subtitle is superimposed.
- In this case, the display data corresponding to the left eye SR is superimposed at the position indicated by Position1, which is the region information of the left eye SR in the image data of the base view (left eye image), shifted by Disparity1, which is the disparity information of the left eye SR.
- the set-top box 200 superimposes the display data corresponding to the right eye SR on the image data of the non-base view (right eye image) indicated by frame1, which is the target frame information of the right eye SR, thereby obtaining output image data of the non-base view (right eye image) on which the right-eye subtitle is superimposed.
- In this case, the display data corresponding to the right eye SR is superimposed at the position indicated by Position2, which is the region information of the right eye SR in the image data of the non-base view (right eye image), shifted by Disparity2, which is the disparity information of the right eye SR.
- the set-top box 200 transmits the image data of the base view (left eye image) and the non-base view (right eye image) obtained as described above to the television receiver 300 through, for example, an HDMI digital interface.
- the transmission format of stereoscopic image data from the set top box 200 to the television receiver 300 is, for example, a frame packing method.
- When the television receiver 300 is a 3D-compatible device (3D TV), it performs 3D signal processing on the frame packing type stereoscopic image data transmitted from the set-top box 200, and generates left-eye image data and right-eye image data on which the subtitle is superimposed. The television receiver 300 then displays binocular parallax images (a left-eye image and a right-eye image) for allowing the user to recognize a stereoscopic image on a display panel such as an LCD.
- the television receiver 300 generates region display data for displaying the left-eye subtitle and the right-eye subtitle based on the subtitle data (excluding subregion display control information). Then, the television receiver 300 extracts display data corresponding to the left eye SR and display data corresponding to the right eye SR from the display data of this region.
- the television receiver 300 superimposes the display data corresponding to the left eye SR on the image data of the base view (left eye image) indicated by frame0, which is the target frame information of the left eye SR, thereby obtaining output image data of the base view (left eye image) on which the left-eye subtitle is superimposed.
- In this case, the display data corresponding to the left eye SR is superimposed at the position indicated by Position1, which is the region information of the left eye SR in the image data of the base view (left eye image), shifted by Disparity1, which is the disparity information of the left eye SR.
- the television receiver 300 superimposes the display data corresponding to the right eye SR on the image data of the non-base view (right eye image) indicated by frame1, which is the target frame information of the right eye SR, thereby obtaining output image data of the non-base view (right eye image) on which the right-eye subtitle is superimposed.
- In this case, the display data corresponding to the right eye SR is superimposed at the position indicated by Position2, which is the region information of the right eye SR in the image data of the non-base view (right eye image), shifted by Disparity2, which is the disparity information of the right eye SR.
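The target-frame routing in this MVC case can be sketched as follows: frame0 selects the base view (left eye), frame1 selects the non-base view (right eye), and each SR is shifted by its own full disparity. The dict-based view model and the example values are illustrative only.

```python
# Sketch of routing SR display data to target frames in the MVC case
# described above. The dict-based "views" model and all example values
# are illustrative assumptions.

def overlay_mvc(views, target_frame, position, disparity, data):
    """Record an overlay on the view chosen by the SR's target frame."""
    view = "base" if target_frame == 0 else "non_base"
    # Full-resolution views, so the full disparity is applied directly.
    views[view].append((position + disparity, data))
    return views

views = {"base": [], "non_base": []}
overlay_mvc(views, 0, 100, 8, "left-eye subtitle")    # left eye SR, frame0
overlay_mvc(views, 1, 100, -8, "right-eye subtitle")  # right eye SR, frame1
```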
- Based on the image data of the base view (left eye image) and the non-base view (right eye image) on which the subtitles generated as described above are superimposed, the television receiver 300 displays binocular parallax images (a left-eye image and a right-eye image) for allowing the user to recognize a stereoscopic image on a display panel such as an LCD.
- the bit stream data BSD output from the multiplexer 122 is a multiplexed data stream having a video data stream and a subtitle data stream.
- the video data stream includes stereoscopic image data.
- the subtitle data stream includes subtitle data for stereoscopic images (for 3D images) corresponding to the transmission format of the stereoscopic image data.
- the stereoscopic image subtitle data includes left eye subtitle data and right eye subtitle data. Therefore, on the receiving side, the display data of the left eye subtitle to be superimposed on the left eye image data included in the stereoscopic image data, and the display data of the right eye subtitle to be superimposed on the right eye image data included in the stereoscopic image data, can be generated easily. This facilitates processing.
- the bit stream data BSD output from the multiplexer 122 includes display control information in addition to stereoscopic image data and stereoscopic image subtitle data.
- This display control information includes display control information (region information, target frame information, parallax information) related to the left eye SR and the right eye SR. Therefore, on the receiving side, it is easy to superimpose only the subtitle in the left eye SR and the subtitle in the right eye SR on the target frames and display them. Then, parallax can be given to the display positions of the subtitle in the left eye SR and the subtitle in the right eye SR, so that in the display of the subtitle (caption), the consistency of perspective with each object in the image can be maintained in an optimum state.
- Since the subtitle processing unit 123 can transmit SCS segments in which the disparity information is sequentially updated during the subtitle display period, the display positions of the left-eye subtitle in the left eye SR and the right-eye subtitle in the right eye SR can be dynamically controlled.
- the parallax provided between the left eye subtitle and the right eye subtitle can be dynamically changed in conjunction with the change in the image content.
- the disparity information included in the SCS segments created by the subtitle processing unit 123 in the transmission data generating unit 110 described above can have sub-pixel precision. Therefore, on the receiving side, when the display positions of the left-eye subtitle in the left eye SR and the right-eye subtitle in the right eye SR are shift-adjusted based on the disparity information sequentially updated during the subtitle display period, the shift operation can be made smooth, which can contribute to improved image quality.
- the set-top box 200 receives bit stream data (transport stream) BSD transmitted from the broadcasting station 100 on broadcast waves.
- the bit stream data BSD includes stereoscopic image data and audio data including left eye image data and right eye image data.
- the bit stream data BSD includes stereoscopic image subtitle data (including display control information) for displaying a subtitle (caption).
- the set top box 200 has a bit stream processing unit 201.
- the bit stream processing unit 201 extracts stereoscopic image data, audio data, and subtitle data from the bit stream data BSD.
- the bit stream processing unit 201 generates stereoscopic image data on which the subtitle is superimposed using stereoscopic image data, subtitle data, and the like.
- parallax can be given between the left-eye subtitle superimposed on the left-eye image and the right-eye subtitle superimposed on the right-eye image.
- the subtitle data for stereoscopic images transmitted from the broadcast station 100 can be generated so that parallax is given between the left eye subtitle and the right eye subtitle.
- the display control information added to the stereoscopic image subtitle data transmitted from the broadcast station 100 includes disparity information, and based on this disparity information, parallax can be given between the left eye subtitle and the right eye subtitle. By providing parallax between the left eye subtitle and the right eye subtitle in this way, the user can recognize the subtitle (caption) in front of the image.
- FIG. 34 (a) shows a display example of a subtitle (caption) on an image.
- captions are superimposed on an image composed of a background and a foreground object.
- FIG. 34B shows the perspective of the background, the foreground object, and the caption, and indicates that the caption is recognized at the forefront.
- FIG. 35 (a) shows a display example of subtitles (captions) on the same image as FIG. 34 (a).
- FIG. 35(b) shows a left-eye caption LGI superimposed on the left-eye image and a right-eye caption RGI superimposed on the right-eye image.
- FIG. 35(c) shows that, since the caption should be recognized at the forefront, parallax is given between the left-eye caption LGI and the right-eye caption RGI.
- FIG. 36 shows a configuration example of the set top box 200.
- the set top box 200 includes a bit stream processing unit 201, an HDMI terminal 202, an antenna terminal 203, a digital tuner 204, a video signal processing circuit 205, an HDMI transmission unit 206, and an audio signal processing circuit 207.
- the set top box 200 includes a CPU 211, a flash ROM 212, a DRAM 213, an internal bus 214, a remote control receiving unit 215, and a remote control transmitter 216.
- the antenna terminal 203 is a terminal for inputting a television broadcast signal received by a receiving antenna (not shown).
- the digital tuner 204 processes the television broadcast signal input to the antenna terminal 203 and outputs predetermined bit stream data (transport stream) BSD corresponding to the user's selected channel.
- the bit stream processing unit 201 extracts stereoscopic image data, audio data, stereoscopic image subtitle data (including display control information), and the like from the bit stream data BSD.
- the bit stream processing unit 201 outputs audio data.
- the bit stream processing unit 201 combines the display data of the left eye subtitle and the right eye subtitle with the stereoscopic image data, and obtains output stereoscopic image data on which the subtitle is superimposed.
- the display control information includes left eye SR and right eye SR area information, target frame information, and parallax information.
- the bit stream processing unit 201 generates region display data for displaying the left-eye subtitle and the right-eye subtitle based on the subtitle data (excluding subregion display control information). Then, the bit stream processing unit 201 extracts display data corresponding to the left eye SR and display data corresponding to the right eye SR from the display data of this region based on the region information of the left eye SR and the right eye SR.
- the bit stream processing unit 201 superimposes display data corresponding to the left eye SR and right eye SR on the stereoscopic image data to obtain output stereoscopic image data (display stereoscopic image data).
- the display data corresponding to the left eye SR is superimposed on a frame portion (left eye image frame portion) indicated by frame0 which is target frame information of the left eye SR.
- the display data corresponding to the right eye SR is superimposed on a frame portion (right eye image frame portion) indicated by frame1 which is target frame information of the right eye SR.
- the bit stream processing unit 201 shift-adjusts the display positions (superimposition positions) of the left eye subtitle in the left eye SR and the right eye subtitle in the right eye SR based on the parallax information.
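As a rough sketch of this receiver-side processing, the toy Python below models the three steps just described: extracting the left eye SR and right eye SR from the region display data, superimposing each on its target frame, and applying opposite horizontal shifts derived from the disparity information. All data structures and names here are illustrative assumptions, not taken from the specification.

```python
def extract_subregion(display_data, sr):
    """Cut one subregion (SR) out of the region display data.

    display_data: 2-D list of pixels (rows of the region);
    sr: dict with 'x', 'y', 'width', 'height' region information.
    """
    return [row[sr["x"]:sr["x"] + sr["width"]]
            for row in display_data[sr["y"]:sr["y"] + sr["height"]]]

def superimpose(frame, data, x, y, disparity):
    """Overlay `data` on `frame` at (x + disparity, y); 0 means transparent."""
    for r, row in enumerate(data):
        for c, px in enumerate(row):
            tx = x + disparity + c
            if px != 0 and 0 <= tx < len(frame[0]):
                frame[y + r][tx] = px
    return frame

# Example: a 2x8 region, left eye SR covers columns 0-3, right eye SR 4-7.
region = [[1, 1, 1, 1, 2, 2, 2, 2] for _ in range(2)]
left_sr  = {"x": 0, "y": 0, "width": 4, "height": 2}
right_sr = {"x": 4, "y": 0, "width": 4, "height": 2}

frame0 = [[0] * 8 for _ in range(4)]   # left eye image frame portion
frame1 = [[0] * 8 for _ in range(4)]   # right eye image frame portion

# Opposite horizontal shifts give parallax between left and right subtitles.
superimpose(frame0, extract_subregion(region, left_sr), 2, 1, disparity=+1)
superimpose(frame1, extract_subregion(region, right_sr), 2, 1, disparity=-1)
```

Because the two subtitles land at horizontally offset positions in the two eye frames, the fused caption appears in front of (or behind) the screen plane depending on the sign of the disparity.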
- the video signal processing circuit 205 performs image quality adjustment processing on the output stereoscopic image data obtained by the bit stream processing unit 201 as necessary, and supplies the processed output stereoscopic image data to the HDMI transmission unit 206.
- the audio signal processing circuit 207 performs sound quality adjustment processing or the like on the audio data output from the bit stream processing unit 201 as necessary, and supplies the processed audio data to the HDMI transmission unit 206.
- the HDMI transmitting unit 206 transmits, for example, uncompressed image data and audio data from the HDMI terminal 202 by communication conforming to HDMI. In this case, since transmission is performed using an HDMI TMDS channel, image data and audio data are packed and output from the HDMI transmission unit 206 to the HDMI terminal 202.
- the TMDS transmission format is a side-by-side format (see FIG. 32).
- the TMDS transmission format is the top-and-bottom method.
- the TMDS transmission format is a frame packing method (see FIG. 33).
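The three TMDS 3D formats referenced above can be illustrated with a toy packing routine (frames modelled as 2-D lists of pixels; the halving by simple decimation is an assumption for brevity — real implementations typically filter before downsampling):

```python
def pack_side_by_side(left, right):
    # Halve the horizontal resolution of each view, place them side by side.
    return [l[::2] + r[::2] for l, r in zip(left, right)]

def pack_top_and_bottom(left, right):
    # Halve the vertical resolution of each view, stack them vertically.
    return left[::2] + right[::2]

def pack_frame_packing(left, right):
    # Keep both views at full resolution in one double-height frame.
    return left + right

left  = [[("L", y, x) for x in range(4)] for y in range(4)]
right = [[("R", y, x) for x in range(4)] for y in range(4)]

sbs = pack_side_by_side(left, right)      # 4 rows x 4 cols
tab = pack_top_and_bottom(left, right)    # 4 rows x 4 cols
fp  = pack_frame_packing(left, right)     # 8 rows x 4 cols
```

Note that only frame packing preserves full resolution for both eyes; the other two trade half of one dimension for compatibility with 2D frame sizes.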
- the CPU 211 controls the operation of each part of the set top box 200.
- the flash ROM 212 stores control software and data.
- the DRAM 213 constitutes a work area for the CPU 211.
- the CPU 211 develops software and data read from the flash ROM 212 on the DRAM 213 to activate the software, and controls each part of the set top box 200.
- the remote control receiving unit 215 receives the remote control signal (remote control code) transmitted from the remote control transmitter 216 and supplies it to the CPU 211.
- the CPU 211 controls each part of the set top box 200 based on the remote control code.
- the CPU 211, flash ROM 212 and DRAM 213 are connected to the internal bus 214.
- a television broadcast signal input to the antenna terminal 203 is supplied to the digital tuner 204.
- the digital tuner 204 processes the television broadcast signal and outputs predetermined bit stream data (transport stream) BSD corresponding to the user's selected channel.
- the bit stream data BSD output from the digital tuner 204 is supplied to the bit stream processing unit 201.
- the bit stream processing unit 201 extracts stereoscopic image data, audio data, stereoscopic image subtitle data (including display control information), and the like from the bit stream data BSD.
- display data (bitmap data) of the left eye subtitle and the right eye subtitle is combined with the stereoscopic image data, and output stereoscopic image data on which the subtitle is superimposed is obtained.
- the output stereoscopic image data obtained by the bit stream processing unit 201 is supplied to the video signal processing circuit 205.
- image quality adjustment processing or the like is performed on the output stereoscopic image data as necessary.
- the processed output stereoscopic image data output from the video signal processing circuit 205 is supplied to the HDMI transmission unit 206.
- the audio data obtained by the bit stream processing unit 201 is supplied to the audio signal processing circuit 207.
- the audio signal processing circuit 207 performs processing such as sound quality adjustment processing on the audio data as necessary.
- the processed audio data output from the audio signal processing circuit 207 is supplied to the HDMI transmission unit 206.
- the stereoscopic image data and audio data supplied to the HDMI transmission unit 206 are transmitted from the HDMI terminal 202 to the HDMI cable 400 via the HDMI TMDS channel.
- FIG. 37 shows a configuration example of the bit stream processing unit 201.
- the bit stream processing unit 201 has a configuration corresponding to the transmission data generation unit 110 shown in FIG.
- the bit stream processing unit 201 includes a demultiplexer 221, a video decoder 222, a subtitle decoder 223, a stereoscopic image subtitle generation unit 224, a display control unit 225, a video superimposing unit 226, and an audio decoder 227.
- the demultiplexer 221 extracts video, audio, and subtitle packets from the bit stream data BSD, and sends them to each decoder.
- the demultiplexer 221 extracts information such as PMT and EIT inserted in the bit stream data BSD, and sends the information to the CPU 211.
- based on this information, the CPU 211 can identify that the subtitle data stream includes subtitle data for stereoscopic images.
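Conceptually, the demultiplexer routes transport packets to per-elementary-stream queues by PID; the toy sketch below uses made-up PID values and ignores PSI (PMT/EIT) parsing, which would be handled separately and reported to the CPU.

```python
# Toy demultiplexer: route transport packets to decoder queues by PID.
# The PID assignments below are illustrative only.
VIDEO_PID, AUDIO_PID, SUBTITLE_PID = 0x100, 0x101, 0x102

def demultiplex(packets):
    queues = {VIDEO_PID: [], AUDIO_PID: [], SUBTITLE_PID: []}
    for pid, payload in packets:
        if pid in queues:          # PSI tables (PMT/EIT) handled elsewhere
            queues[pid].append(payload)
    return queues

bsd = [(0x100, b"video0"), (0x102, b"sub0"), (0x101, b"audio0"),
       (0x100, b"video1")]
q = demultiplex(bsd)
```

Each queue then feeds the corresponding decoder (video decoder 222, audio decoder 227, subtitle decoder 223).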
- the video decoder 222 performs processing reverse to that of the video encoder 119 of the transmission data generation unit 110 described above.
- a video data stream is reconstructed from the video packets extracted by the demultiplexer 221, and decoding processing is performed to obtain stereoscopic image data including left eye image data and right eye image data.
- the transmission format of the stereoscopic image data is, for example, a side-by-side method, a top-and-bottom method, a frame-sequential method, an MVC method, or the like.
- the subtitle decoder 223 performs a process reverse to that of the subtitle encoder 125 of the transmission data generation unit 110 described above. That is, the subtitle decoder 223 reconstructs a subtitle data stream from the subtitle packet extracted by the demultiplexer 221 and performs a decoding process to obtain stereoscopic image subtitle data (including display control information).
- the stereoscopic image subtitle generating unit 224 generates display data (bitmap data) of the left eye subtitle and the right eye subtitle to be superimposed on the stereoscopic image data based on the stereoscopic image subtitle data (excluding display control information).
- the stereoscopic image subtitle generating unit 224 constitutes a display data generating unit.
- the display control unit 225 controls display data to be superimposed on stereoscopic image data based on display control information (left eye SR, right eye SR area information, target frame information, and parallax information). That is, the display control unit 225 corresponds to the left eye SR from the display data (bitmap data) of the left eye subtitle and the right eye subtitle superimposed on the stereoscopic image data based on the region information of the left eye SR and the right eye SR. And display data corresponding to the right eye SR is extracted.
- the display control unit 225 supplies display data corresponding to the left eye SR and right eye SR to the video superimposing unit 226 and superimposes it on the stereoscopic image data.
- the display data corresponding to the left eye SR is superimposed on a frame portion (left eye image frame portion) indicated by frame0 which is target frame information of the left eye SR.
- the display data corresponding to the right eye SR is superimposed on a frame portion (right eye image frame portion) indicated by frame1 which is target frame information of the right eye SR.
- the display control unit 225 shifts and adjusts the display positions (superimposition positions) of the left eye subtitle in the left eye SR and the right eye subtitle in the right eye SR based on the parallax information.
- FIG. 38 schematically shows an example of interpolation processing in the case of shifting by 1/2 pixel in the horizontal direction.
- the black circles in FIG. 38 (a) indicate received data.
- the white circles in FIG. 38 (b) indicate a state where the received data is simply shifted by 1/2 pixel in the horizontal direction.
- the data indicated by the white circles does not lie at integer pixel positions. Therefore, the display control unit 225 performs interpolation processing on the data indicated by the white circles, generates data at the pixel positions indicated by the hatched circles in FIG. 38 (b), and uses it as the data after the shift adjustment.
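A minimal sketch of such a half-pixel shift, assuming simple linear interpolation (averaging the two received samples that straddle each integer pixel position after the shift); the actual interpolation filter used by a receiver is not specified here:

```python
def shift_half_pixel(row):
    """Shift `row` right by 1/2 pixel; each output pixel is the average of
    the two received samples that now straddle its integer position."""
    out = []
    for i in range(len(row)):
        left = row[i - 1] if i > 0 else row[0]   # replicate the edge sample
        out.append((left + row[i]) / 2)
    return out

received = [10, 20, 30, 40]
shifted = shift_half_pixel(received)
```

Interpolating at integer pixel positions in this way is what makes sub-pixel-accurate disparity values usable: the shift takes effect smoothly rather than snapping to whole pixels.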
- the video superimposing unit 226 obtains output stereoscopic image data Vout.
- the video superimposing unit 226 superimposes the display data (bitmap data) of the left eye SR and the right eye SR, shift-adjusted by the display control unit 225, on the corresponding target frame portions of the stereoscopic image data obtained by the video decoder 222. Then, the video superimposing unit 226 outputs the output stereoscopic image data Vout to the outside of the bit stream processing unit 201.
- the audio decoder 227 performs the reverse process of the audio encoder 120 of the transmission data generation unit 110 described above. That is, the audio decoder 227 reconstructs an audio elementary stream from the audio packet extracted by the demultiplexer 221 and performs a decoding process to obtain audio data Aout. The audio decoder 227 outputs the audio data Aout to the outside of the bit stream processing unit 201.
- the operation of the bit stream processing unit 201 shown in FIG. 37 will be briefly described.
- the bit stream data BSD output from the digital tuner 204 (see FIG. 36) is supplied to the demultiplexer 221.
- video, audio, and subtitle packets are extracted from the bit stream data BSD and supplied to each decoder.
- a video data stream is reconstructed from the video packets extracted by the demultiplexer 221, and further, decoding processing is performed to obtain stereoscopic image data including left eye image data and right eye image data.
- the stereoscopic image data is supplied to the video superimposing unit 226.
- the subtitle decoder 223 reconstructs a subtitle data stream from the subtitle packets extracted by the demultiplexer 221 and further performs decoding processing to obtain stereoscopic image subtitle data (including display control information). This subtitle data is supplied to the stereoscopic image subtitle generating unit 224.
- the stereoscopic image subtitle generating unit 224 generates display data (bitmap data) of the left eye subtitle and the right eye subtitle to be superimposed on the stereoscopic image data based on the stereoscopic image subtitle data (excluding display control information).
- this display data is supplied to the display control unit 225.
- the display control unit 225 controls superimposition of display data on stereoscopic image data based on display control information (left eye SR, right eye SR region information, target frame information, and parallax information).
- the display control unit 225 extracts the display data of the left eye SR and the right eye SR from the display data generated by the stereoscopic image subtitle generating unit 224, and performs shift adjustment. Thereafter, the shift-adjusted display data of the left eye SR and the right eye SR is supplied to the video superimposing unit 226 so as to be superimposed on the target frames of the stereoscopic image data.
- the display data shifted by the display control unit 225 is superimposed on the stereoscopic image data obtained by the video decoder 222, and output stereoscopic image data Vout is obtained.
- the output stereoscopic image data Vout is output to the outside of the bit stream processing unit 201.
- the audio decoder 227 reconstructs an audio elementary stream from the audio packets extracted by the demultiplexer 221, further performs decoding processing, and obtains audio data Aout corresponding to the display stereoscopic image data Vout described above.
- the audio data Aout is output to the outside of the bit stream processing unit 201.
- the bit stream data BSD output from the digital tuner 204 is a multiplexed data stream having a video data stream and a subtitle data stream.
- the video data stream includes stereoscopic image data.
- the subtitle data stream includes subtitle data for stereoscopic images (for 3D images) corresponding to the transmission format of the stereoscopic image data.
- the stereoscopic image subtitle data includes left eye subtitle data and right eye subtitle data. Therefore, the stereoscopic image subtitle generating unit 224 of the bitstream processing unit 201 can easily generate display data for the left eye subtitle to be superimposed on the left eye image data included in the stereoscopic image data. Further, the stereoscopic image subtitle generating unit 224 of the bitstream processing unit 201 can easily generate display data of the right eye subtitle to be superimposed on the right eye image data included in the stereoscopic image data. This facilitates processing.
- the bit stream data BSD output from the digital tuner 204 includes display control information in addition to stereoscopic image data and stereoscopic image subtitle data.
- this display control information includes display control information (region information, target frame information, parallax information) related to the left eye SR and the right eye SR. Therefore, it becomes easy to superimpose and display only the left eye subtitle in the left eye SR and the right eye subtitle in the right eye SR on the target frames. Further, parallax can be given to the display positions of the left eye subtitle in the left eye SR and the right eye subtitle in the right eye SR, so that in the display of the subtitle (caption), the consistency of perspective with each object in the image can be maintained in an optimum state.
- the television receiver 300 receives stereoscopic image data sent from the set top box 200 via the HDMI cable 400.
- the television receiver 300 includes a 3D signal processing unit 301.
- the 3D signal processing unit 301 performs processing (decoding processing) corresponding to the transmission format on the stereoscopic image data to generate left-eye image data and right-eye image data.
- FIG. 39 illustrates a configuration example of the television receiver 300.
- the television receiver 300 includes a 3D signal processing unit 301, an HDMI terminal 302, an HDMI receiving unit 303, an antenna terminal 304, a digital tuner 305, and a bit stream processing unit 306.
- the television receiver 300 includes a video / graphic processing circuit 307, a panel drive circuit 308, a display panel 309, an audio signal processing circuit 310, an audio amplification circuit 311, and a speaker 312.
- the television receiver 300 includes a CPU 321, a flash ROM 322, a DRAM 323, an internal bus 324, a remote control receiving unit 325, and a remote control transmitter 326.
- the antenna terminal 304 is a terminal for inputting a television broadcast signal received by a receiving antenna (not shown).
- the digital tuner 305 processes the television broadcast signal input to the antenna terminal 304 and outputs predetermined bit stream data (transport stream) BSD corresponding to the user's selected channel.
- the bit stream processing unit 306 extracts stereoscopic image data, audio data, stereoscopic image subtitle data (including display control information), and the like from the bit stream data BSD.
- the bit stream processing unit 306 is configured in the same manner as the bit stream processing unit 201 of the set top box 200.
- the bit stream processing unit 306 combines the display data of the left eye subtitle and the right eye subtitle with the stereoscopic image data, and generates and outputs output stereoscopic image data on which the subtitle is superimposed.
- when the transmission format of the stereoscopic image data is, for example, the side-by-side method or the top-and-bottom method, the bit stream processing unit 306 performs scaling processing and outputs full-resolution left eye image data and right eye image data (see the portion of the television receiver 300 in FIG. 32).
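The side-by-side case of this scaling can be sketched as follows (nearest-neighbour pixel doubling is assumed for simplicity; a real receiver would typically use a proper interpolation filter):

```python
def unpack_side_by_side(frame):
    """Split a side-by-side frame and restore each view to full width."""
    w = len(frame[0]) // 2
    left_half  = [row[:w] for row in frame]
    right_half = [row[w:] for row in frame]
    # Double each pixel horizontally to restore full resolution.
    stretch = lambda half: [[p for p in row for _ in (0, 1)] for row in half]
    return stretch(left_half), stretch(right_half)

frame = [[1, 2, 9, 8],
         [3, 4, 7, 6]]
left, right = unpack_side_by_side(frame)
```

The top-and-bottom case is analogous, with rows duplicated vertically instead of pixels duplicated horizontally.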
- the bit stream processing unit 306 outputs audio data.
- the HDMI receiving unit 303 receives uncompressed image data and audio data supplied to the HDMI terminal 302 via the HDMI cable 400 by communication conforming to HDMI.
- the HDMI receiving unit 303 conforms to, for example, HDMI 1.4a, and can handle stereoscopic image data.
- the 3D signal processing unit 301 performs a decoding process on the stereoscopic image data received by the HDMI receiving unit 303 to generate full-resolution left-eye image data and right-eye image data.
- the 3D signal processing unit 301 performs a decoding process corresponding to the TMDS transmission data format. Note that the 3D signal processing unit 301 performs no processing on the full-resolution left eye image data and right eye image data obtained by the bit stream processing unit 306.
- the video / graphic processing circuit 307 generates image data for displaying a stereoscopic image based on the left eye image data and right eye image data generated by the 3D signal processing unit 301.
- the video / graphic processing circuit 307 performs image quality adjustment processing on the image data as necessary. Further, the video / graphic processing circuit 307 synthesizes superimposition information data such as a menu and a program guide with the image data as necessary.
- the panel drive circuit 308 drives the display panel 309 based on the image data output from the video / graphic processing circuit 307.
- the display panel 309 includes, for example, an LCD (Liquid Crystal Display), a PDP (Plasma Display Panel), and the like.
- the audio signal processing circuit 310 performs necessary processing such as D / A conversion on the audio data received by the HDMI receiving unit 303 or obtained by the bit stream processing unit 306.
- the audio amplification circuit 311 amplifies the audio signal output from the audio signal processing circuit 310 and supplies the amplified audio signal to the speaker 312.
- the CPU 321 controls the operation of each unit of the television receiver 300.
- the flash ROM 322 stores control software and data.
- the DRAM 323 constitutes a work area for the CPU 321.
- the CPU 321 develops software and data read from the flash ROM 322 on the DRAM 323 to activate the software, and controls each unit of the television receiver 300.
- the remote control receiving unit 325 receives the remote control signal (remote control code) transmitted from the remote control transmitter 326 and supplies it to the CPU 321.
- the CPU 321 controls each part of the television receiver 300 based on the remote control code.
- the CPU 321, flash ROM 322, and DRAM 323 are connected to the internal bus 324.
- the HDMI receiving unit 303 receives stereoscopic image data and audio data transmitted from the set top box 200 connected to the HDMI terminal 302 via the HDMI cable 400.
- the stereoscopic image data received by the HDMI receiving unit 303 is supplied to the 3D signal processing unit 301.
- the audio data received by the HDMI receiving unit 303 is supplied to the audio signal processing circuit 310.
- the TV broadcast signal input to the antenna terminal 304 is supplied to the digital tuner 305.
- the digital tuner 305 processes the television broadcast signal and outputs predetermined bit stream data (transport stream) BSD corresponding to the user's selected channel.
- the bit stream data BSD output from the digital tuner 305 is supplied to the bit stream processing unit 306.
- the bit stream processing unit 306 extracts stereoscopic image data, audio data, stereoscopic image subtitle data (including display control information), and the like from the bit stream data BSD. Further, the bit stream processing unit 306 combines the display data of the left eye subtitle and the right eye subtitle with the stereoscopic image data, and generates output stereoscopic image data (full-resolution left eye image data and right eye image data) on which the subtitle is superimposed.
- the output stereoscopic image data is supplied to the video / graphic processing circuit 307 through the 3D signal processing unit 301.
- the stereoscopic image data received by the HDMI receiving unit 303 is decoded, and full-resolution left-eye image data and right-eye image data are generated.
- the left eye image data and right eye image data are supplied to the video / graphic processing circuit 307.
- image data for displaying a stereoscopic image is generated based on the left eye image data and the right eye image data, and image quality adjustment processing and superimposition of OSD (on-screen display) data are performed as necessary.
- the image data obtained by the video / graphic processing circuit 307 is supplied to the panel drive circuit 308. Therefore, a stereoscopic image is displayed on the display panel 309.
- the left eye image based on the left eye image data and the right eye image based on the right eye image data are alternately displayed on the display panel 309 in a time division manner.
- by wearing shutter glasses whose left eye shutter and right eye shutter open alternately in synchronization with the display on the display panel 309, the viewer sees only the left eye image with the left eye and only the right eye image with the right eye, and can thereby perceive a stereoscopic image.
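This time-division scheme can be modelled with a trivial sketch (frame contents and function name are illustrative only):

```python
def display_sequence(left_frames, right_frames):
    """Interleave left/right frames; the matching shutter opens each time."""
    seq = []
    for l, r in zip(left_frames, right_frames):
        seq.append(("left_open", l))    # left shutter open: left eye sees l
        seq.append(("right_open", r))   # right shutter open: right eye sees r
    return seq

seq = display_sequence(["L0", "L1"], ["R0", "R1"])
```

Because each eye receives only its own view, the brain fuses the two offset images into a single stereoscopic scene.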
- the audio data obtained by the bit stream processing unit 306 is supplied to the audio signal processing circuit 310.
- necessary processing such as D / A conversion is performed on the audio data received by the HDMI receiving unit 303 or obtained by the bit stream processing unit 306.
- the audio data is amplified by the audio amplification circuit 311 and then supplied to the speaker 312. Therefore, sound corresponding to the display image on the display panel 309 is output from the speaker 312.
- as described above, the broadcasting station 100 transmits to the set top box 200 or the television receiver 300 a multiplexed data stream having a video data stream and a subtitle data stream.
- the video data stream includes stereoscopic image data.
- the subtitle data stream includes subtitle data for stereoscopic images (for 3D images) corresponding to the transmission format of the stereoscopic image data.
- the stereoscopic image subtitle data includes left eye subtitle data and right eye subtitle data. Therefore, on the receiving side (the set top box 200, the television receiver 300), display data of the left eye subtitle to be superimposed on the left eye image data included in the stereoscopic image data can be easily generated. Likewise, display data of the right eye subtitle to be superimposed on the right eye image data can be easily generated. This facilitates the processing of the bit stream processing unit 201.
- the bit stream data BSD output from the transmission data generation unit 110 of the broadcast station 100 includes display control information in addition to the stereoscopic image data and the stereoscopic image subtitle data.
- This display control information includes display control information (region information, target frame information, parallax information) related to the left eye SR and the right eye SR. Therefore, on the receiving side, it is easy to superimpose and display only the left eye subtitle in the left eye SR and the subtitle in the right eye SR on the target frame.
- parallax can be given to the display positions of the left eye subtitle in the left eye SR and the right eye subtitle in the right eye SR, so that in the display of the subtitle (caption), the consistency of perspective with each object in the image can be maintained in an optimum state.
- since the transmission data generation unit 110 of the broadcast station 100 can transmit SCS segments in which the disparity information is sequentially updated within the subtitle display period, the display positions of the left eye subtitle in the left eye SR and the right eye subtitle in the right eye SR can be dynamically controlled.
- the parallax provided between the left eye subtitle and the right eye subtitle can be dynamically changed in conjunction with the change in the image content.
- the disparity information included in the SCS segments created by the transmission data generation unit 110 of the broadcast station 100 has subpixel accuracy. Therefore, on the receiving side, when the display positions of the left eye subtitle in the left eye SR and the right eye subtitle in the right eye SR are shift-adjusted based on the disparity information sequentially updated in the subtitle display period, the shift operation can be made smooth, which can contribute to improved image quality.
- the image transmission / reception system 10 includes the broadcasting station 100, the set-top box 200, and the television receiver 300.
- the television receiver 300 includes a bit stream processing unit 306 that functions in the same manner as the bit stream processing unit 201 in the set top box 200, as shown in FIG. 39. Therefore, as shown in FIG. 40, an image transmission/reception system 10A including the broadcasting station 100 and the television receiver 300 is also conceivable.
- the set-top box 200 and the television receiver 300 are connected by an HDMI digital interface.
- the present invention can be similarly applied even when these are connected by a digital interface (including wireless as well as wired) similar to the HDMI digital interface.
- in the above description, subtitle (caption) information is used as the superimposition information.
- the present invention can be similarly applied to other types of superimposition information, such as graphics information and text information.
- an SCS segment is newly defined, and display control information is supplied from the broadcasting station 100 to the set-top box 200 using this segment.
- the method of supplying the display control information to the set-top box 200 or the television receiver 300 is not limited to this; for example, the set-top box 200 may obtain it from the Internet as necessary.
- the present invention can be applied to an image transmission / reception system capable of displaying superimposition information such as a subtitle (caption) superimposed on a stereoscopic image.
Abstract
Description
A stereoscopic image data transmitting apparatus comprising:
an image data output unit that outputs stereoscopic image data of a predetermined transmission format having left eye image data and right eye image data;
a superimposition information data output unit that outputs data of superimposition information to be superimposed on an image based on the left eye image data and the right eye image data;
a superimposition information data processing unit that converts the superimposition information data output from the superimposition information data output unit into transmission superimposition information data having data of left eye superimposition information corresponding to the left eye image data included in the stereoscopic image data of the predetermined transmission format and data of right eye superimposition information corresponding to the right eye image data included in the stereoscopic image data of the predetermined transmission format;
a display control information generating unit that sets, within the display area of the transmission superimposition information data output from the superimposition information data processing unit, a first display area corresponding to the display position of the left eye superimposition information and a second display area corresponding to the display position of the right eye superimposition information, and generates display control information including area information of each of the first display area and the second display area, information of the target frames on which the superimposition information included in the first display area and the second display area is respectively displayed, and parallax information for respectively shift-adjusting the display positions of the superimposition information included in the first display area and the second display area; and
a data transmission unit that transmits a multiplexed data stream having a first data stream including the stereoscopic image data output from the image data output unit and a second data stream including the transmission superimposition information data output from the superimposition information data processing unit and the display control information generated by the display control information generating unit.
A stereoscopic image data receiving apparatus comprising:
a data receiving unit that receives a multiplexed data stream having a first data stream and a second data stream,
wherein the first data stream includes stereoscopic image data of a predetermined transmission format having left eye image data and right eye image data,
the second data stream includes transmission superimposition information data and display control information,
the transmission superimposition information data has data of left eye superimposition information corresponding to the left eye image data included in the stereoscopic image data of the predetermined transmission format and data of right eye superimposition information corresponding to the right eye image data, and
the display control information has area information of each of a first display area corresponding to the display position of the left eye superimposition information and a second display area corresponding to the display position of the right eye superimposition information, both set within the display area of the transmission superimposition information data, information of the target frames on which the superimposition information included in the first display area and the second display area is respectively displayed, and parallax information for respectively shift-adjusting the display positions of the superimposition information included in the first display area and the second display area;
the apparatus further comprising:
an image data acquisition unit that acquires the stereoscopic image data from the first data stream of the multiplexed data stream received by the data receiving unit;
a superimposition information data acquisition unit that acquires the transmission superimposition information data from the second data stream of the multiplexed data stream received by the data receiving unit;
a display control information acquisition unit that acquires the display control information from the second data stream of the multiplexed data stream received by the data receiving unit;
a display data generating unit that generates, based on the transmission superimposition information data acquired by the superimposition information data acquisition unit, display data for superimposing the left eye superimposition information and the right eye superimposition information on the left eye image and the right eye image, respectively;
a display data extraction unit that extracts, from the display data generated by the display data generating unit, the display data of the first display area and the second display area based on the area information of the first display area and the second display area included in the display control information acquired by the display control information acquisition unit;
a shift adjustment unit that shift-adjusts the positions of the display data of the first display area and the second display area extracted by the display data extraction unit, based on the parallax information included in the display control information acquired by the display control information acquisition unit; and
a data synthesis unit that superimposes the shift-adjusted display data of the first display area and the second display area on the target frames, indicated by the target frame information included in the display control information acquired by the display control information acquisition unit, of the stereoscopic image data acquired by the image data acquisition unit, to obtain output stereoscopic image data.
1. Embodiment
2. Modification
[Configuration example of the image transmission/reception system]
FIG. 1 shows a configuration example of an image transmission/reception system 10 as an embodiment. This image transmission/reception system 10 includes a broadcasting station 100, a set top box (STB) 200, and a television receiver (TV) 300.
The broadcasting station 100 transmits bit stream data BSD on a broadcast wave. The broadcasting station 100 includes a transmission data generation unit 110 that generates the bit stream data BSD. This bit stream data BSD includes stereoscopic image data, audio data, superimposition information data, and the like. The stereoscopic image data has a predetermined transmission format and includes left eye image data and right eye image data for displaying a stereoscopic image. The superimposition information is generally subtitles, graphics information, text information, or the like; in this embodiment, it is a subtitle (caption).
FIG. 2 shows a configuration example of the transmission data generation unit 110 in the broadcasting station 100. This transmission data generation unit 110 includes cameras 111L and 111R, a video framing unit 112, a disparity vector detection unit 113, a microphone 114, a data extraction unit 115, and changeover switches 116 to 118. The transmission data generation unit 110 also includes a video encoder 119, an audio encoder 120, a subtitle generating unit 121, a disparity information creation unit 122, a subtitle processing unit 123, a subtitle encoder 125, and a multiplexer 126.
The processing of the subtitle processing unit 123 of the transmission data generation unit 110 shown in FIG. 2 will be described in detail. As described above, the subtitle processing unit 123 converts subtitle data for two-dimensional images into subtitle data for stereoscopic images. Also, as described above, the display control information generation unit 124 of the subtitle processing unit 123 generates display control information (including area information of the left eye SR and the right eye SR, target frame information, and parallax information).
Returning to FIG. 1, the set top box 200 receives the bit stream data (transport stream) BSD transmitted from the broadcasting station 100 on a broadcast wave. This bit stream data BSD includes stereoscopic image data including left eye image data and right eye image data, and audio data. The bit stream data BSD also includes stereoscopic image subtitle data (including display control information) for displaying a subtitle (caption).
A configuration example of the set top box 200 will be described. FIG. 36 shows a configuration example of the set top box 200. This set top box 200 includes a bit stream processing unit 201, an HDMI terminal 202, an antenna terminal 203, a digital tuner 204, a video signal processing circuit 205, an HDMI transmission unit 206, and an audio signal processing circuit 207. The set top box 200 also includes a CPU 211, a flash ROM 212, a DRAM 213, an internal bus 214, a remote control receiving unit 215, and a remote control transmitter 216.
FIG. 37 shows a configuration example of the bit stream processing unit 201. This bit stream processing unit 201 has a configuration corresponding to the transmission data generation unit 110 shown in FIG. 2 described above. The bit stream processing unit 201 includes a demultiplexer 221, a video decoder 222, a subtitle decoder 223, a stereoscopic image subtitle generating unit 224, a display control unit 225, a video superimposing unit 226, and an audio decoder 227.
Returning to FIG. 1, the television receiver 300 receives the stereoscopic image data sent from the set top box 200 via the HDMI cable 400. This television receiver 300 includes a 3D signal processing unit 301. The 3D signal processing unit 301 performs processing (decoding processing) corresponding to the transmission format on the stereoscopic image data to generate left eye image data and right eye image data.
A configuration example of the television receiver 300 will be described. FIG. 39 shows a configuration example of the television receiver 300. This television receiver 300 includes a 3D signal processing unit 301, an HDMI terminal 302, an HDMI receiving unit 303, an antenna terminal 304, a digital tuner 305, and a bit stream processing unit 306.
In the above embodiment, the image transmission/reception system 10 is configured of the broadcasting station 100, the set top box 200, and the television receiver 300. However, the television receiver 300 includes, as shown in FIG. 39, a bit stream processing unit 306 that functions in the same manner as the bit stream processing unit 201 in the set top box 200. Therefore, as shown in FIG. 40, an image transmission/reception system 10A configured of the broadcasting station 100 and the television receiver 300 is also conceivable.
100・・・放送局
110・・・送信データ生成部
111L,111R・・・カメラ
112・・・ビデオフレーミング部
113・・・視差ベクトル検出部
114・・・マイクロホン
115・・・データ取り出し部
115a・・・データ記録媒体
116~118・・・切換スイッチ
119・・・ビデオエンコーダ
120・・・オーディオエンコーダ
121・・・サブタイトル発生部
122・・・視差情報作成部
123・・・サブタイトル処理部
124・・・表示制御情報生成部
125・・・サブタイトルエンコーダ
126・・・マルチプレクサ
200・・・セットトップボックス(STB)
201・・・ビットストリーム処理部
202・・・HDMI端子
203・・・アンテナ端子
204・・・デジタルチューナ
205・・・映像信号処理回路
206・・・HDMI送信部
207・・・音声信号処理回路
211・・・CPU
215・・・リモコン受信部
216・・・リモコン送信機
221・・・デマルチプレクサ
222・・・ビデオデコーダ
223・・・サブタイトルデコーダ
224・・・立体画像用サブタイトル発生部
225・・・表示制御部
226・・・ビデオ重畳部
227・・・オーディオデコーダ
300・・・テレビ受信機(TV)
301・・・3D信号処理部
302・・・HDMI端子
303・・・HDMI受信部
304・・・アンテナ端子
305・・・デジタルチューナ
306・・・ビットストリーム処理部
307・・・映像・グラフィック処理回路
308・・・パネル駆動回路
309・・・表示パネル
310・・・音声信号処理回路
311・・・音声増幅回路
312・・・スピーカ
321・・・CPU
325・・・リモコン受信部
326・・・リモコン送信機
400・・・HDMIケーブル
Claims (12)
- 左眼画像データおよび右眼画像データを持つ所定の伝送フォーマットの立体画像データを出力する画像データ出力部と、
上記左眼画像データおよび上記右眼画像データによる画像に重畳する重畳情報のデータを出力する重畳情報データ出力部と、
上記重畳情報データ出力部から出力される上記重畳情報のデータを、上記所定の伝送フォーマットの立体画像データに含まれる上記左眼画像データに対応した左眼重畳情報のデータおよび上記所定の伝送フォーマットの立体画像データに含まれる上記右眼画像データに対応した右眼重畳情報のデータを持つ送信用重畳情報データに変換する重畳情報データ処理部と、
上記重畳情報データ処理部から出力される上記送信用重畳情報データの表示領域内に、上記左眼重畳情報の表示位置に対応した第1の表示領域と、上記右眼重畳情報の表示位置に対応した第2の表示領域とを設定し、上記第1の表示領域および上記第2の表示領域のそれぞれの領域情報と、上記第1の表示領域および上記第2の表示領域に含まれる重畳情報をそれぞれ表示するターゲットフレームの情報と、上記第1の表示領域および上記第2の表示領域に含まれる重畳情報の表示位置をそれぞれシフト調整する視差情報とを含む表示制御情報を生成する表示制御情報生成部と、
上記画像データ出力部から出力される上記立体画像データを含む第1のデータストリームと、上記重畳情報データ処理部から出力される上記送信用重畳情報データおよび上記表示制御情報生成部で生成される上記表示制御情報を含む第2のデータストリームとを有する多重化データストリームを送信するデータ送信部と
を備える立体画像データ送信装置。
- 上記左眼画像データによる左眼画像と上記右眼画像データによる右眼画像との間の視差情報を出力する視差情報出力部をさらに備え、
上記重畳情報データ処理部は、上記視差情報出力部から出力される上記視差情報に基づいて、少なくとも、上記左眼重畳情報または上記右眼重畳情報をシフトさせて、該左眼重畳情報と該右眼重畳情報との間に視差を付与する
請求項1に記載の立体画像データ送信装置。
- 上記左眼画像データによる左眼画像と上記右眼画像データによる右眼画像との間の視差情報を出力する視差情報出力部をさらに備え、
上記表示制御情報生成部は、上記視差情報出力部から出力される上記視差情報に基づいて、上記第1の表示領域および上記第2の表示領域に含まれる重畳情報の表示位置をそれぞれシフト調整する視差情報を取得する
請求項1に記載の立体画像データ送信装置。
- 上記データ送信部は、上記第2のデータストリームに上記立体画像データの伝送フォーマットに対応した上記送信用重畳情報データが含まれることを識別する識別情報を、上記多重化データストリームに挿入する
請求項1に記載の立体画像データ送信装置。
- 上記表示制御情報生成部で生成される上記表示制御情報に含まれる上記視差情報はサブピクセルの精度を持つ
請求項1に記載の立体画像データ送信装置。
- 上記表示制御情報生成部で生成される上記表示制御情報には、さらに、
上記第1の表示領域および上記第2の表示領域に含まれる重畳情報のそれぞれの表示のオンオフを制御するコマンド情報が含まれる
請求項1に記載の立体画像データ送信装置。
- 上記重畳情報のデータはサブタイトルデータであり、
上記重畳情報データの表示領域はリージョンであり、
上記第1の表示領域および上記第2の表示領域は、上記リージョンに含まれるように設定されたサブリージョンである
請求項1に記載の立体画像データ送信装置。
- 左眼画像データおよび右眼画像データを持つ所定の伝送フォーマットの立体画像データを出力する画像データ出力ステップと、
上記左眼画像データおよび上記右眼画像データによる画像に重畳する重畳情報のデータを出力する重畳情報データ出力ステップと、
上記重畳情報データ出力ステップで出力される上記重畳情報のデータを、上記所定の伝送フォーマットの立体画像データに含まれる上記左眼画像データに対応した左眼重畳情報のデータおよび上記所定の伝送フォーマットの立体画像データに含まれる上記右眼画像データに対応した右眼重畳情報のデータを持つ送信用重畳情報データに変換する重畳情報データ処理ステップと、
上記重畳情報データ処理ステップで出力される上記送信用重畳情報データの表示領域内に、上記左眼重畳情報の表示位置に対応した第1の表示領域と、上記右眼重畳情報の表示位置に対応した第2の表示領域とを設定し、上記第1の表示領域および上記第2の表示領域のそれぞれの領域情報と、上記第1の表示領域および上記第2の表示領域に含まれる重畳情報をそれぞれ表示するターゲットフレームの情報と、上記第1の表示領域および上記第2の表示領域に含まれる重畳情報の表示位置をそれぞれシフト調整する視差情報とを含む表示制御情報を生成する表示制御情報生成ステップと、
上記画像データ出力ステップで出力される上記立体画像データを含む第1のデータストリームと、上記重畳情報データ処理ステップで出力される上記送信用重畳情報データおよび上記表示制御情報生成ステップで生成される上記表示制御情報を含む第2のデータストリームとを有する多重化データストリームを送信するデータ送信ステップと
を備える立体画像データ送信方法。
- 第1のデータストリームおよび第2のデータストリームを有する多重化データストリームを受信するデータ受信部を備え、
上記第1のデータストリームは、左眼画像データおよび右眼画像データを持つ所定の伝送フォーマットの立体画像データを含み、
上記第2のデータストリームは、送信用重畳情報データおよび表示制御情報を含み、
上記送信用重畳情報データは、上記所定の伝送フォーマットの立体画像データに含まれる上記左眼画像データに対応した左眼重畳情報のデータおよび上記右眼画像データに対応した右眼重畳情報のデータを持ち、
上記表示制御情報は、上記送信用重畳情報データの表示領域内に設定された上記左眼重畳情報の表示位置に対応した第1の表示領域と上記右眼重畳情報の表示位置に対応した第2の表示領域のそれぞれの領域情報と、上記第1の表示領域および上記第2の表示領域に含まれる重畳情報をそれぞれ表示するターゲットフレームの情報と、上記第1の表示領域および上記第2の表示領域に含まれる重畳情報の表示位置をそれぞれシフト調整する視差情報を持ち、
上記データ受信部で受信された上記多重化データストリームが有する上記第1のデータストリームから上記立体画像データを取得する画像データ取得部と、
上記データ受信部で受信された上記多重化データストリームが有する上記第2のデータストリームから上記送信用重畳情報データを取得する重畳情報データ取得部と、
上記データ受信部で受信された上記多重化データストリームが有する上記第2のデータストリームから上記表示制御情報を取得する表示制御情報取得部と、
上記重畳情報データ取得部で取得された上記送信用重畳情報データに基づいて、左眼画像および右眼画像にそれぞれ左眼重畳情報および右眼重畳情報を重畳表示するための表示データを発生する表示データ発生部と、
上記表示データ発生部で発生された上記表示データのうち、上記表示制御情報取得部で取得された上記表示制御情報が持つ上記第1の表示領域および上記第2の表示領域の領域情報に基づいて、上記第1の表示領域および上記第2の表示領域の表示データを抽出する表示データ抽出部と、
上記表示データ抽出部で抽出される上記第1の表示領域および上記第2の表示領域の表示データの位置を、上記表示制御情報取得部で取得された上記表示制御情報が持つ上記視差情報に基づいてシフト調整するシフト調整部と、
上記シフト調整部でシフト調整された上記第1の表示領域および上記第2の表示領域の表示データを、それぞれ、上記画像データ取得部で取得された上記立体画像データのうち、上記表示制御情報取得部で取得された上記表示制御情報が持つ上記ターゲットフレーム情報が示すターゲットフレームに重畳して出力立体画像データを得るデータ合成部とをさらに備える
立体画像データ受信装置。
- 上記データ合成部で得られた上記出力立体画像データを外部機器に送信するデジタルインタフェース部をさらに備える
請求項9に記載の立体画像データ受信装置。
- 上記データ受信部で受信される上記多重化データストリームは、上記第2のデータストリームに上記立体画像データの伝送フォーマットに対応した上記送信用重畳情報データが含まれることを識別する識別情報を含み、
上記データ受信部で受信される上記多重化データストリームから上記識別情報を取得する識別情報取得部と、
上記識別情報取得部で取得された上記識別情報に基づいて、上記第2のデータストリームに上記立体画像データの伝送フォーマットに対応した上記送信用重畳情報データが含まれることを識別する重畳情報データ識別部とをさらに備える
請求項9に記載の立体画像データ受信装置。
- 第1のデータストリームおよび第2のデータストリームを有する多重化データストリームを受信するデータ受信ステップを備え、
上記第1のデータストリームは、左眼画像データおよび右眼画像データを持つ所定の伝送フォーマットの立体画像データを含み、
上記第2のデータストリームは、送信用重畳情報データおよび表示制御情報を含み、
上記送信用重畳情報データは、上記所定の伝送フォーマットの立体画像データに含まれる上記左眼画像データに対応した左眼重畳情報のデータおよび上記右眼画像データに対応した右眼重畳情報のデータを持ち、
上記表示制御情報は、上記送信用重畳情報データの表示領域内に設定された上記左眼重畳情報の表示位置に対応した第1の表示領域と上記右眼重畳情報の表示位置に対応した第2の表示領域のそれぞれの領域情報と、上記第1の表示領域および上記第2の表示領域に含まれる重畳情報をそれぞれ表示するターゲットフレームの情報と、上記第1の表示領域および上記第2の表示領域に含まれる重畳情報の表示位置をそれぞれシフト調整する視差情報を持ち、
上記データ受信ステップで受信された上記多重化データストリームが有する上記第1のデータストリームから上記立体画像データを取得する画像データ取得ステップと、
上記データ受信ステップで受信された上記多重化データストリームが有する上記第2のデータストリームから上記送信用重畳情報データを取得する重畳情報データ取得ステップと、
上記データ受信ステップで受信された上記多重化データストリームが有する上記第2のデータストリームから上記表示制御情報を取得する表示制御情報取得ステップと、
上記重畳情報データ取得ステップで取得された上記送信用重畳情報データに基づいて、左眼画像および右眼画像にそれぞれ左眼重畳情報および右眼重畳情報を重畳表示するための表示データを発生する表示データ発生ステップと、
上記表示データ発生ステップで発生された上記表示データのうち、上記表示制御情報取得ステップで取得された上記表示制御情報が持つ上記第1の表示領域および上記第2の表示領域の領域情報に基づいて、上記第1の表示領域および上記第2の表示領域の表示データを抽出する表示データ抽出ステップと、
上記表示データ抽出ステップで抽出される上記第1の表示領域および上記第2の表示領域の表示データの位置を、上記表示制御情報取得ステップで取得された上記表示制御情報が持つ上記視差情報に基づいてシフト調整するシフト調整ステップと、
上記シフト調整ステップでシフト調整された上記第1の表示領域および上記第2の表示領域の表示データを、それぞれ、上記画像データ取得ステップで取得された上記立体画像データのうち、上記表示制御情報取得ステップで取得された上記表示制御情報が持つ上記ターゲットフレーム情報が示すターゲットフレームに重畳して出力立体画像データを得るデータ合成ステップとをさらに備える
立体画像データ受信方法。
Priority Applications (8)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
MX2012006228A MX2012006228A (es) | 2010-10-01 | 2011-09-22 | Dispositivo transmisor de datos de imagenes 3d, metodo para transmitir datos de imagenes 3d, dispositivo receptor de datos de imagenes 3d y metodo para recibir datos de imagenes 3d. |
EP11828909.9A EP2495981A4 (en) | 2010-10-01 | 2011-09-22 | 3d-image data transmitting device, 3d-image data transmitting method, 3d-image data receiving device and 3d-image data receiving method |
BR112012013542A BR112012013542A2 (pt) | 2010-10-01 | 2011-09-22 | dispositivos e métodos de transmissão e de recepção de dados de imagem estereoscópica |
RU2012122999/07A RU2012122999A (ru) | 2010-10-01 | 2011-09-22 | Устройство передачи данных стереоскопического изображения, способ передачи данных стереоскопического изображения, устройство приема данных стереоскопического изображения и способ приема данных стереоскопического изображения |
KR1020127014330A KR20130098133A (ko) | 2010-10-01 | 2011-09-22 | 입체 화상 데이터 송신 장치, 입체 화상 데이터 송신 방법, 입체 화상 데이터 수신 장치 및 입체 화상 데이터 수신 방법 |
US13/513,351 US20120242802A1 (en) | 2010-10-01 | 2011-09-22 | Stereoscopic image data transmission device, stereoscopic image data transmission method, stereoscopic image data reception device, and stereoscopic image data reception method |
CN201180006830.XA CN102726052B (zh) | 2010-10-01 | 2011-09-22 | 立体图像数据发送设备、立体图像数据发送方法、立体图像数据接收设备以及立体图像数据接收方法 |
AU2011309301A AU2011309301B2 (en) | 2010-10-01 | 2011-09-22 | 3D-image data transmitting device, 3D-image data transmitting method, 3D-image data receiving device and 3D-image data receiving method |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2010-224417 | 2010-10-01 | ||
JP2010224417A JP5454444B2 (ja) | 2010-10-01 | 2010-10-01 | 立体画像データ送信装置、立体画像データ送信方法、立体画像データ受信装置および立体画像データ受信方法 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2012043352A1 true WO2012043352A1 (ja) | 2012-04-05 |
Family
ID=45892809
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2011/071564 WO2012043352A1 (ja) | 2010-10-01 | 2011-09-22 | 立体画像データ送信装置、立体画像データ送信方法、立体画像データ受信装置および立体画像データ受信方法 |
Country Status (11)
Country | Link |
---|---|
US (1) | US20120242802A1 (ja) |
EP (1) | EP2495981A4 (ja) |
JP (1) | JP5454444B2 (ja) |
KR (1) | KR20130098133A (ja) |
CN (1) | CN102726052B (ja) |
AU (1) | AU2011309301B2 (ja) |
BR (1) | BR112012013542A2 (ja) |
MX (1) | MX2012006228A (ja) |
RU (1) | RU2012122999A (ja) |
TW (1) | TW201230769A (ja) |
WO (1) | WO2012043352A1 (ja) |
Families Citing this family (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20120091007A (ko) * | 2009-10-02 | 2012-08-17 | 파나소닉 주식회사 | 입체 시 영상을 재생할 수 있는 재생장치, 집적회로, 재생방법, 프로그램 |
JP2013051660A (ja) * | 2011-08-04 | 2013-03-14 | Sony Corp | 送信装置、送信方法および受信装置 |
CN104885474A (zh) * | 2012-12-26 | 2015-09-02 | 汤姆逊许可公司 | 用于内容呈现的方法和装置 |
KR101430985B1 (ko) * | 2013-02-20 | 2014-09-18 | 주식회사 카몬 | 2d-3d 복합 차원 콘텐츠 파일을 사용하는 복합 차원 콘텐츠 서비스 제공 시스템, 그 서비스 제공 방법 |
CN104427326A (zh) * | 2013-09-06 | 2015-03-18 | 北京三星通信技术研究有限公司 | 集成成像显示系统中的三维显示方法和设备 |
CN106254893A (zh) * | 2015-12-30 | 2016-12-21 | 深圳超多维科技有限公司 | 主播类互动平台客户端场景切换方法及其装置、客户端 |
CN106231349B8 (zh) * | 2015-12-30 | 2019-08-16 | 深圳超多维科技有限公司 | 主播类互动平台服务器场景切换方法及其装置、服务器 |
CN106254846B (zh) * | 2015-12-30 | 2018-06-29 | 深圳超多维科技有限公司 | 一种图像视差调整方法、装置及电子设备 |
CN106231411B (zh) * | 2015-12-30 | 2019-05-21 | 深圳超多维科技有限公司 | 主播类互动平台客户端场景切换、加载方法及装置、客户端 |
CN106231350B (zh) * | 2015-12-30 | 2019-03-26 | 深圳超多维科技有限公司 | 主播类互动平台场景切换方法及其装置 |
CN106231397B (zh) * | 2015-12-30 | 2019-03-26 | 深圳超多维科技有限公司 | 主播类互动平台主播端场景切换方法及其装置、主播端 |
US11159811B2 (en) | 2019-03-15 | 2021-10-26 | Tencent America LLC | Partitioning of coded point cloud data |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH11289555A (ja) * | 1998-04-02 | 1999-10-19 | Toshiba Corp | 立体映像表示装置 |
JP2004274125A (ja) * | 2003-03-05 | 2004-09-30 | Sony Corp | 画像処理装置および方法 |
JP2005006114A (ja) | 2003-06-12 | 2005-01-06 | Sharp Corp | 放送データ送信装置、放送データ送信方法および放送データ受信装置 |
JP2011030193A (ja) * | 2009-06-29 | 2011-02-10 | Sony Corp | 立体画像データ送信装置および立体画像データ受信装置 |
JP2011029849A (ja) * | 2009-07-23 | 2011-02-10 | Sony Corp | 受信装置、通信システム、立体画像への字幕合成方法、プログラム、及びデータ構造 |
Family Cites Families (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR100397511B1 (ko) * | 2001-11-21 | 2003-09-13 | 한국전자통신연구원 | 양안식/다시점 3차원 동영상 처리 시스템 및 그 방법 |
DE10344773B3 (de) * | 2003-09-26 | 2005-05-25 | Siemens Ag | Verfahren und Vorrichtung zum Ermitteln einer Phasenlage zwischen einer Kurbelwelle und einer Nockenwelle einer Brennkraftmaschine |
US8160149B2 (en) * | 2007-04-03 | 2012-04-17 | Gary Demos | Flowfield motion compensation for video compression |
JP2009135686A (ja) * | 2007-11-29 | 2009-06-18 | Mitsubishi Electric Corp | 立体映像記録方法、立体映像記録媒体、立体映像再生方法、立体映像記録装置、立体映像再生装置 |
US8306387B2 (en) * | 2008-07-24 | 2012-11-06 | Panasonic Corporation | Play back apparatus, playback method and program for playing back 3D video |
WO2010064853A2 (en) * | 2008-12-02 | 2010-06-10 | Lg Electronics Inc. | 3d caption display method and 3d display apparatus for implementing the same |
CA2749668C (en) * | 2009-02-12 | 2017-07-11 | Lg Electronics Inc. | Broadcast receiver and 3d subtitle data processing method thereof |
WO2010092823A1 (ja) * | 2009-02-13 | 2010-08-19 | パナソニック株式会社 | 表示制御装置 |
2010
- 2010-10-01 JP JP2010224417 patent/JP5454444B2/ja not_active Expired - Fee Related
2011
- 2011-09-21 TW TW100133966A patent/TW201230769A/zh unknown
- 2011-09-22 MX MX2012006228A patent/MX2012006228A/es active IP Right Grant
- 2011-09-22 WO PCT/JP2011/071564 patent/WO2012043352A1/ja active Application Filing
- 2011-09-22 AU AU2011309301A patent/AU2011309301B2/en not_active Ceased
- 2011-09-22 KR KR1020127014330A patent/KR20130098133A/ko not_active Application Discontinuation
- 2011-09-22 BR BR112012013542A patent/BR112012013542A2/pt not_active Application Discontinuation
- 2011-09-22 RU RU2012122999/07A patent/RU2012122999A/ru unknown
- 2011-09-22 US US13/513,351 patent/US20120242802A1/en not_active Abandoned
- 2011-09-22 CN CN201180006830.XA patent/CN102726052B/zh not_active Expired - Fee Related
- 2011-09-22 EP EP11828909.9A patent/EP2495981A4/en not_active Withdrawn
Non-Patent Citations (1)
Title |
---|
See also references of EP2495981A4 |
Also Published As
Publication number | Publication date |
---|---|
EP2495981A4 (en) | 2017-08-16 |
RU2012122999A (ru) | 2013-12-10 |
MX2012006228A (es) | 2012-08-15 |
KR20130098133A (ko) | 2013-09-04 |
BR112012013542A2 (pt) | 2016-08-02 |
CN102726052B (zh) | 2015-02-11 |
AU2011309301A1 (en) | 2012-06-21 |
EP2495981A1 (en) | 2012-09-05 |
US20120242802A1 (en) | 2012-09-27 |
CN102726052A (zh) | 2012-10-10 |
AU2011309301B2 (en) | 2014-10-30 |
TW201230769A (en) | 2012-07-16 |
JP2012080374A (ja) | 2012-04-19 |
JP5454444B2 (ja) | 2014-03-26 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5454444B2 (ja) | 立体画像データ送信装置、立体画像データ送信方法、立体画像データ受信装置および立体画像データ受信方法 | |
JP5429034B2 (ja) | 立体画像データ送信装置、立体画像データ送信方法、立体画像データ受信装置および立体画像データ受信方法 | |
WO2013031549A1 (ja) | 送信装置、送信方法および受信装置 | |
WO2012060198A1 (ja) | 立体画像データ送信装置、立体画像データ送信方法、立体画像データ受信装置および立体画像データ受信方法 | |
WO2013005571A1 (ja) | 送信装置、送信方法および受信装置 | |
JP5682149B2 (ja) | 立体画像データ送信装置、立体画像データ送信方法、立体画像データ受信装置および立体画像データ受信方法 | |
KR20130132241A (ko) | 영상 데이터 송신 장치, 영상 데이터 송신 방법, 영상 데이터 수신 장치 및 영상 데이터 수신 방법 | |
WO2012026342A1 (ja) | 立体画像データ送信装置、立体画像データ送信方法、立体画像データ受信装置および立体画像データ受信方法 | |
WO2013018490A1 (ja) | 送信装置、送信方法および受信装置 | |
WO2012057048A1 (ja) | 立体画像データ送信装置、立体画像データ送信方法、立体画像データ受信装置および立体画像データ受信方法 | |
WO2013011834A1 (ja) | 送信装置、送信方法および受信装置 | |
WO2013018489A1 (ja) | 送信装置、送信方法および受信装置 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 201180006830.X Country of ref document: CN |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 11828909 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: MX/A/2012/006228 Country of ref document: MX |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2011309301 Country of ref document: AU |
|
ENP | Entry into the national phase |
Ref document number: 20127014330 Country of ref document: KR Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 13513351 Country of ref document: US Ref document number: 2011828909 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2012122999 Country of ref document: RU |
|
ENP | Entry into the national phase |
Ref document number: 2011309301 Country of ref document: AU Date of ref document: 20110922 Kind code of ref document: A |
|
REG | Reference to national code |
Ref country code: BR Ref legal event code: B01A Ref document number: 112012013542 Country of ref document: BR |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
ENP | Entry into the national phase |
Ref document number: 112012013542 Country of ref document: BR Kind code of ref document: A2 Effective date: 20120605 |