WO2012081427A1 - Transmission device, transmission method, reception device, and reception method - Google Patents
Transmission device, transmission method, reception device, and reception method
- Publication number
- WO2012081427A1 (application PCT/JP2011/077994)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image data
- dimensional image
- data
- video stream
- stream
- Prior art date
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/30—Image reproducers
- H04N13/398—Synchronisation thereof; Control thereof
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/10—Processing, recording or transmission of stereoscopic or multi-view image signals
- H04N13/106—Processing image signals
- H04N13/172—Processing image signals comprising non-image signal components, e.g. headers or format information
- H04N13/178—Metadata, e.g. disparity information
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/30—Image reproducers
- H04N13/332—Displays for viewing with the aid of special glasses or head-mounted displays [HMD]
- H04N13/341—Displays for viewing with the aid of special glasses or head-mounted displays [HMD] using temporal multiplexing
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/30—Image reproducers
- H04N13/356—Image reproducers having separate monoscopic and stereoscopic modes
- H04N13/359—Switching between monoscopic and stereoscopic modes
Definitions
- The present invention relates to a transmission device, a transmission method, a reception device, and a reception method, and more particularly to a transmission device for transmitting a multiplexed stream that includes, in a time-division manner, a video stream into which two-dimensional (2D) image data is inserted and a video stream into which three-dimensional (3D) image data is inserted.
- the image data may be transmitted while switching between three-dimensional (3D) image data and two-dimensional (2D) image data.
- 3D image data is transmitted for the main program and 2D image data is transmitted for a commercial message (CM).
- In such a case, the broadcast wave carries a multiplexed stream that includes, in a time-division manner, a video stream into which 2D image data is inserted and a video stream into which 3D image data is inserted.
- The video stream includes signaling information (ES-layer signaling information) indicating whether it is a video stream into which 2D image data is inserted or a video stream into which 3D image data is inserted.
- For example, in the case of an H.264/AVC (Advanced Video Coding) stream, the ES-layer signaling information is included in the SEIs portion of the access unit.
- In the case of an MPEG2 video stream, the signaling information is included in the user data area of the picture header.
- With such ES-layer signaling, however, the switching timing between the video stream into which the 2D image data is inserted and the video stream into which the 3D image data is inserted is not known until the switch actually occurs.
- The viewer can perceive a stereoscopic image by viewing the left-eye image and the right-eye image through the shutter glasses. While a two-dimensional image is being displayed, keeping the shutters open even when the glasses are worn suppresses power consumption and is better for visual health.
- For that purpose, the switching information needs to be supplied to the shutter glasses before the actual switching timing.
- An object of the present invention is to enable appropriate automatic switching of the shutter-glasses mode when a multiplexed stream including, in a time-division manner, a video stream into which two-dimensional image data is inserted and a video stream into which three-dimensional image data is inserted is received and a two-dimensional or three-dimensional image is displayed.
- The concept of this invention resides in a transmission device comprising: a transmission unit for transmitting a multiplexed stream including, in a time-division manner, a first video stream into which two-dimensional image data is inserted and a second video stream into which three-dimensional image data is inserted; and an information insertion unit for inserting, into the multiplexed stream, advance information giving prior notice of switching between the first video stream and the second video stream.
- In the present invention, a multiplexed stream including, in a time-division manner, a first video stream into which two-dimensional (2D) image data is inserted and a second video stream into which three-dimensional (3D) image data is inserted is transmitted by the transmission unit.
- The information insertion unit inserts into the multiplexed stream advance information giving prior notice of switching between the first video stream and the second video stream. For example, timing information indicating the interval from the insertion timing of the advance information to the switching timing may be added to the advance information. In addition, information indicating whether the switch is to the first video stream or to the second video stream may be further added.
- For example, the advance information is inserted into a stream that constitutes the multiplexed stream and is independent of the video streams. Alternatively, the advance information is inserted into a layer higher than the picture layer of the video stream, into the header portion of the PES packets that constitute the video stream, or under the program map table included in the multiplexed stream.
- In this way, advance information giving prior notice of switching between the first video stream, into which the 2D image data is inserted, and the second video stream, into which the 3D image data is inserted, is placed in the multiplexed stream. Therefore, on the receiving side, by performing control based on this advance information, automatic switching of the shutter-glasses mode can be carried out appropriately at the actual switching timing between the two-dimensional and three-dimensional image display states.
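As a minimal sketch of the advance information described above (all names are hypothetical, not taken from the patent), it can be modeled as a record carrying the switching flag, the switching direction, and the interval to the switching timing on a 90 kHz time-stamp base, one of the timing bases described later in this document:

```python
from dataclasses import dataclass

@dataclass
class AdvanceInfo:
    switching: bool        # True: a stream switch occurs in the next period
    next_is_2d: bool       # True: switch to the first (2D) video stream
    ticks_to_switch: int   # interval from insertion timing to switching timing,
                           # here counted in 90 kHz time-stamp ticks

    def seconds_to_switch(self) -> float:
        # 90,000 ticks correspond to one second on the 90 kHz base
        return self.ticks_to_switch / 90_000

# advance information inserted 2 seconds before a switch to 2D display
info = AdvanceInfo(switching=True, next_is_2d=True, ticks_to_switch=180_000)
print(info.seconds_to_switch())  # 2.0
```

The receiving side would read such a record and schedule the shutter-glasses mode change before the interval elapses.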
- Another concept of the present invention resides in a reception device comprising: a receiving unit for receiving a multiplexed stream that includes, in a time-division manner, a first video stream into which two-dimensional image data is inserted and a second video stream into which three-dimensional image data is inserted, and into which advance information giving prior notice of switching between the first video stream and the second video stream is inserted; a display control unit for performing, on a display unit, two-dimensional image display or three-dimensional image display in which a left-eye image and a right-eye image are displayed alternately, based on the image data acquired from the video stream included in the multiplexed stream received by the receiving unit; and a shutter control unit for transmitting a shutter-mode switching command to shutter glasses before the switching timing between the two-dimensional image display and the three-dimensional image display, based on the advance information acquired from the multiplexed stream received by the receiving unit.
- The receiving unit receives a multiplexed stream including, in a time-division manner, the first video stream into which the two-dimensional image data is inserted and the second video stream into which the three-dimensional image data is inserted. In this multiplexed stream, advance information giving prior notice of switching between the first video stream and the second video stream is inserted.
- The display control unit performs, on the display unit, two-dimensional image display or three-dimensional image display in which the left-eye image and the right-eye image are displayed alternately. That is, when the video stream is the first video stream and contains 2D image data, 2D image display is performed. Conversely, when the video stream is the second video stream and contains 3D image data, 3D image display is performed.
- The shutter control unit sends a shutter-mode switching command to the shutter glasses before the switching timing between the two-dimensional image display and the three-dimensional image display.
- the shutter mode switching command may include information indicating the time until switching the shutter mode. Further, for example, information indicating whether to switch to a shutter mode corresponding to two-dimensional image display or to switch to a shutter mode corresponding to three-dimensional image display may be added to the shutter mode switching command.
- In the present invention, based on the advance information giving prior notice of switching between the first video stream, into which the two-dimensional image data included in the received multiplexed stream is inserted, and the second video stream, into which the three-dimensional image data is inserted, a shutter-mode switching command is transmitted to the shutter glasses before the switching timing between the two-dimensional image display and the three-dimensional image display. Therefore, automatic switching of the shutter-glasses mode can be carried out appropriately at the actual switching timing between the two-dimensional and three-dimensional image display states.
- A further concept of the present invention resides in a reception device comprising: a receiving unit for receiving a multiplexed stream that includes, in a time-division manner, a first video stream into which two-dimensional image data is inserted and a second video stream into which three-dimensional image data is inserted, and into which advance information giving prior notice of switching between the first video stream and the second video stream is inserted; and a data transmission unit for transmitting the image data acquired from the video stream included in the multiplexed stream received by the receiving unit, together with the advance information acquired from the multiplexed stream, to an external device via a transmission path.
- The receiving unit receives a multiplexed stream including, in a time-division manner, the first video stream into which the two-dimensional image data is inserted and the second video stream into which the three-dimensional image data is inserted. In this multiplexed stream, advance information giving prior notice of switching between the first video stream and the second video stream is inserted.
- The data transmission unit transmits the image data acquired from the video stream included in the received multiplexed stream, together with the advance information acquired from the multiplexed stream, to the external device via the transmission path. For example, the data transmission unit transmits the image data to the external device over a transmission path using a plurality of channels and differential signals, and sends the advance information to the external device by inserting it into the blanking period of the image data.
- In the present invention, the advance information giving prior notice of switching between the first video stream, into which the two-dimensional image data included in the received multiplexed stream is inserted, and the second video stream, into which the three-dimensional image data is inserted, is transmitted to the external device together with the image data over the transmission path. Therefore, by performing control based on this advance information, the external device can carry out automatic switching of the shutter-glasses mode appropriately at the actual switching timing between the two-dimensional and three-dimensional image display states.
- A still further concept of the present invention resides in a reception device comprising: a data receiving unit for receiving image data and advance information from an external device; a display control unit for performing, on a display unit, two-dimensional image display or three-dimensional image display in which the left-eye image and the right-eye image are displayed alternately, based on the image data received by the data receiving unit; and a shutter control unit that transmits a shutter-mode switching command to shutter glasses before the switching timing between the two-dimensional image display and the three-dimensional image display, based on the advance information received by the data receiving unit.
- the image data and the advance information are received by the data receiving unit from an external device via a transmission line such as an HDMI cable.
- the image data includes two-dimensional image data and three-dimensional image data in a time division manner.
- the advance information is information informing in advance of switching between the two-dimensional image data and the three-dimensional image data.
- the display control unit performs two-dimensional image display or three-dimensional image display in which the left eye image and the right eye image are alternately displayed on the display unit based on the image data received by the data receiving unit. That is, when the received image data is two-dimensional image data, two-dimensional image display is performed. On the other hand, when the received image data is 3D image data, 3D image display is performed.
- the shutter control unit transmits a shutter mode switching command to the shutter glasses before the switching timing between the two-dimensional image display and the three-dimensional image display.
- the shutter mode switching command may include information indicating the time until switching the shutter mode. Further, for example, information indicating whether to switch to a shutter mode corresponding to two-dimensional image display or to switch to a shutter mode corresponding to three-dimensional image display may be added to the shutter mode switching command.
- In the present invention, advance information giving prior notice of switching between the two-dimensional image data and the three-dimensional image data is received together with the image data. Based on this advance information, a shutter-mode switching command is transmitted to the shutter glasses before the switching timing between the two-dimensional image display and the three-dimensional image display. Therefore, automatic switching of the shutter-glasses mode can be carried out appropriately at the actual switching timing between the two-dimensional and three-dimensional image display states.
- According to the present invention, when a multiplexed stream including, in a time-division manner, a video stream into which two-dimensional image data is inserted and a video stream into which three-dimensional image data is inserted is received and two-dimensional or three-dimensional image display is performed, automatic switching of the shutter-glasses mode can be carried out appropriately.
- Brief description of the drawings: a block diagram showing a configuration example of the image transmission/reception system as a second embodiment of the invention; a block diagram showing a configuration example of the set-top box constituting the image transmission/reception system; a block diagram showing a configuration example of the receiver constituting the image transmission/reception system; a diagram schematically showing the transmission timing of the shutter-mode switching command from the shutter control unit in the receiver to the shutter glasses; and a block diagram showing a configuration example of the HDMI transmitting unit (HDMI source) and the HDMI receiving unit (HDMI sink).
- FIG. 1 shows a configuration example of an image transmission / reception system 10 according to the first embodiment.
- the image transmission / reception system 10 includes a broadcasting station 100 and a receiver 200.
- The broadcasting station 100 transmits, on broadcast waves, a transport stream TS as a multiplexed stream that includes a first video stream (first video elementary stream) and a second video stream (second video elementary stream) in a time-division manner.
- the broadcasting station 100 includes a transmission data generation unit 110 that generates the transport stream TS.
- Two-dimensional (2D) image data is inserted into the first video stream.
- three-dimensional (3D) image data is inserted into the second video stream.
- Each video stream is, for example, an H.264/AVC video stream or an MPEG2 video stream.
- Into this transport stream TS, advance information giving prior notice of switching between the first video stream and the second video stream is inserted.
- Timing information indicating an interval from the insertion timing of the advance information to the switching timing is added to the advance information.
- information indicating whether switching to the first video stream or switching to the second video stream is added to the advance information.
- For example, the advance information is inserted into a stream that constitutes the transport stream and is independent of the video streams. Alternatively, the advance information is inserted into a layer higher than the picture layer of the video stream, into the header portion of the PES packets that constitute the video stream, or under a program map table (PMT: Program Map Table) included in the multiplexed stream.
- FIG. 2 shows the relationship between time management on the transmission side, video stream, and advance information.
- As shown in FIG. 2, the advance information is inserted a predetermined time (for example, 2 seconds and 1 second) before each switching timing between the first video stream and the second video stream.
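The insertion scheme of FIG. 2 can be sketched as follows; the helper name and the 2-second/1-second offsets mirror the example values in the text and are not mandated by the scheme:

```python
def insertion_points(switch_time_s, offsets_s=(2.0, 1.0)):
    """For one switching timing (in seconds on the programme timeline),
    return (insertion time, remaining interval) pairs, one per advance
    information record inserted ahead of the switch."""
    return [(switch_time_s - off, off) for off in offsets_s]

# a stream switch at t = 30 s yields insertions at 28 s and 29 s
print(insertion_points(30.0))  # [(28.0, 2.0), (29.0, 1.0)]
```

Each pair's second element is the "interval to the switching timing" that the timing information in the advance information would carry.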
- the receiver 200 receives the transport stream TS transmitted from the broadcasting station 100 on a broadcast wave.
- the receiver 200 acquires image data from the video stream included in the received transport stream TS.
- The receiver 200 includes a display panel (display unit) 217. The receiver 200 performs, on the display panel 217, two-dimensional image display or three-dimensional image display in which the left-eye image and the right-eye image are displayed alternately. That is, when the video stream is the first video stream and contains 2D image data, 2D image display is performed. Conversely, when the video stream is the second video stream and contains 3D image data, 3D image display is performed.
- the receiver 200 acquires advance information from the received transport stream TS.
- The receiver 200 includes a shutter control unit 221 that controls the operation of the shutter glasses 300. Based on the advance information, the shutter control unit 221 transmits a shutter-mode switching command to the shutter glasses 300 before the switching timing between the two-dimensional image display and the three-dimensional image display.
- The shutter-mode switching command carries information indicating the time until the shutter mode is switched, and information indicating whether to switch to the shutter mode corresponding to 2D image display or to the one corresponding to 3D image display. Based on this command, the shutter glasses 300 switch the shutter mode appropriately at the actual switching timing between the two-dimensional and three-dimensional image display states.
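On the receiver side, the shutter control unit must issue the command early enough for the glasses to act on it. A minimal sketch of that scheduling decision, with a hypothetical transmission/processing margin (the function name and the margin value are assumptions, not values from the text):

```python
def command_send_time(advance_info_arrival_s: float,
                      interval_to_switch_s: float,
                      glasses_margin_s: float = 0.1) -> float:
    """Return the time at which the shutter-mode switching command should
    be transmitted so that it reaches the shutter glasses before the
    display switches.  glasses_margin_s is an assumed transmission and
    processing margin for the glasses."""
    switch_time = advance_info_arrival_s + interval_to_switch_s
    return switch_time - glasses_margin_s

# advance information arriving at 28 s announces a switch 2 s later,
# so the command goes out slightly before the 30 s switching timing
t = command_send_time(28.0, 2.0)
```

The point is only that the command time derives from the interval carried in the advance information, minus some headroom for the glasses.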
- FIG. 3 shows a correspondence relationship between the transport stream TS received by the receiver 200, that is, the video stream included in the bit stream data, and the advance information acquired from the transport stream TS.
- FIG. 3 schematically shows the transmission timing of a shutter mode switching command from the shutter control unit 221 of the receiver 200 to the shutter glasses 300.
- the advance information acquired from the transport stream TS is inserted with a margin of time T with respect to the switching timing from the second video stream to the first video stream on the transmission side.
- a shutter mode switching command is transmitted from the shutter control unit 221 of the receiver 200 to the shutter glasses 300 at a timing before the shutter mode switching timing.
- the shutter mode is switched in accordance with the switching timing from the three-dimensional image display to the two-dimensional image display based on the shutter mode switching command.
- The transmission data generation unit 110A illustrated in FIG. 4 is an example of the transmission data generation unit 110 that generates the above-described transport stream TS in the broadcasting station 100.
- the transmission data generation unit 110A shows an example in which the advance information is inserted into a stream that forms the transport stream TS and is independent of the video stream.
- The transmission data generation unit 110A includes a data extraction unit (archive unit) 111, a video encoder 112, and an audio encoder 113. The transmission data generation unit 110A also includes an advance information generation unit 114, an advance information encoder 115, and a multiplexer 116.
- the data extraction unit 111 extracts and outputs image data and audio data from the data recording medium 111a.
- the data recording medium 111a is, for example, a disc-shaped recording medium, a semiconductor memory, or the like that is detachably attached to the data extraction unit 111.
- audio data corresponding to the image data is recorded together with image data of a predetermined program transmitted by the transport stream TS.
- the image data is switched to three-dimensional (3D) image data or two-dimensional (2D) image data according to the program.
- the image data is switched to 3D image data or 2D image data in the program according to the contents of the main story and commercials.
- the 3D image data is composed of left eye image data and right eye image data.
- An example of a transmission system for 3D image data will be described. Here, the following first to third transmission methods are listed, but other transmission methods may be used.
- In the following, the left-eye (L) and right-eye (R) image data are assumed to be image data of a predetermined resolution, for example, a 1920 × 1080 pixel format.
- The first transmission method is the top-and-bottom method. As shown in FIG. 6(a), the data of each line of the left-eye image data is transmitted in the first half of the vertical direction, and the data of each line of the right-eye image data is transmitted in the latter half of the vertical direction. In this case, since the lines of the left-eye and right-eye image data are each thinned out to 1/2, the vertical resolution is halved with respect to the original signal.
- The second transmission method is the side-by-side method. The pixel data of the left-eye image data is transmitted in the first half of the horizontal direction, and the pixel data of the right-eye image data is transmitted in the latter half of the horizontal direction. In this case, since the pixel data of each of the left-eye and right-eye image data is thinned out to 1/2 in the horizontal direction, the horizontal resolution is halved with respect to the original signal.
- the third transmission method is a frame sequential method, in which left-eye image data and right-eye image data are sequentially switched for each frame and transmitted as shown in FIG. 6C.
- This frame sequential method may be referred to as a full frame method or a backward compatible method.
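As an illustration of the second method, the following sketch packs one scan line of full-resolution left- and right-eye pixels into a side-by-side line (a pure-Python toy, not broadcast code); the top-and-bottom method does the same thing with lines instead of pixels:

```python
def side_by_side(left_row, right_row):
    """Pack one scan line of left-eye and right-eye pixels into a single
    side-by-side line: every second pixel of each eye is dropped, so the
    horizontal resolution is halved, and the surviving left-eye pixels
    fill the first half of the line, the right-eye pixels the second half."""
    return left_row[::2] + right_row[::2]

line = side_by_side(list("LLLLLLLL"), list("RRRRRRRR"))
print("".join(line))  # LLLLRRRR
```

The frame sequential method, by contrast, keeps both eyes at full resolution and alternates whole frames instead.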
- The video encoder 112 performs H.264/AVC or MPEG2 encoding on the image data output from the data extraction unit 111 to obtain encoded video data.
- Using a stream formatter (not shown) provided in the subsequent stage, the video encoder 112 generates a video elementary stream (first video stream) into which the two-dimensional image data is inserted when the image data is two-dimensional image data, and a video elementary stream (second video stream) into which the three-dimensional image data is inserted when the image data is three-dimensional image data.
- The audio encoder 113 performs encoding such as MPEG-2 Audio AAC on the audio data output from the data extraction unit 111 to generate an audio elementary stream.
- The advance information generation unit 114 generates G_data as advance information giving prior notice of switching between the first video stream, into which the 2D image data is inserted, and the second video stream, into which the 3D image data is inserted.
- This G_data includes timing information indicating the interval from the insertion timing of advance information to the switching timing, information indicating whether switching to the first video stream or switching to the second video stream, and the like.
- the advance information generating unit 114 generates a predetermined number of G_data used at a predetermined number of insertion timings of the advance information corresponding to each of the switching timings between the first video stream and the second video stream.
- the advance information encoder 115 performs predetermined encoding on a predetermined number of G_data output from the advance information generation unit 114, and generates an elementary stream of advance information.
- A PTS (Presentation Time Stamp) is added to each G_data in order to synchronize it with the video stream.
- the multiplexer 116 packetizes and multiplexes the elementary streams generated by the video encoder 112, the audio encoder 113, and the advance information encoder 115, and generates a transport stream TS as a multiplexed stream.
- The transport stream TS includes the first video stream, into which the two-dimensional image data is inserted, during a predetermined program period, or a predetermined period within a program, in which the two-dimensional image data is output from the data extraction unit 111. Similarly, the transport stream TS includes the second video stream, into which the three-dimensional image data is inserted, during a predetermined program period, or a predetermined period within a program, in which the three-dimensional image data is output from the data extraction unit 111.
- The transport stream TS further includes an advance information stream that is independent of the video streams (the first video stream and the second video stream).
- FIG. 7 shows the correspondence between the video stream and the advance information stream.
- Image data (3D image data or 2D image data) output from the data extraction unit 111 is supplied to the video encoder 112.
- In the video encoder 112, H.264/AVC or MPEG2 encoding is performed on the image data, and a video elementary stream (video stream) including the encoded video data is generated.
- When the image data is 2D image data, a video elementary stream (first video stream) into which the 2D image data is inserted is generated. When the image data is 3D image data, a video elementary stream (second video stream) into which the 3D image data is inserted is generated.
- the video elementary stream generated by the video encoder 112 is supplied to the multiplexer 116.
- audio data corresponding to the image data is also output from the data extracting unit 111.
- This audio data is supplied to the audio encoder 113.
- The audio encoder 113 performs encoding such as MPEG-2 Audio AAC on the audio data, and generates an audio elementary stream including the encoded audio data.
- This audio elementary stream is supplied to the multiplexer 116.
- The advance information generation unit 114 generates, for each switching timing between the first video stream and the second video stream, a predetermined number of G_data (advance information) used at a predetermined number of insertion timings of the advance information.
- This G_data includes timing information indicating the interval from the insertion timing of advance information to the switching timing, information indicating whether switching to the first video stream or switching to the second video stream, and the like.
- G_data generated by the advance information generation unit 114 is supplied to the advance information encoder 115.
- the advance information encoder 115 performs predetermined encoding on this G_data, and generates an advance information elementary stream. In this case, a PTS is added for each G_data in order to synchronize with the video stream.
- This advance information elementary stream is supplied to the multiplexer 116.
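The PTS values attached to each G_data record use the 90 kHz clock base of MPEG-2 PES timestamps; a sketch of the conversion (the helper name is hypothetical):

```python
def pts_90khz(seconds: float) -> int:
    """Convert a presentation time in seconds into a 90 kHz PTS value,
    the clock base used for PES timestamps in MPEG-2 systems."""
    return round(seconds * 90_000)

# G_data records inserted 2 s and 1 s before a switch at t = 30 s each get
# the PTS of the picture they accompany, so the demultiplexer can align
# the advance information stream with the video stream
g_data_pts = [pts_90khz(t) for t in (28.0, 29.0)]
print(g_data_pts)  # [2520000, 2610000]
```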
- the elementary streams supplied from the encoders are packetized and multiplexed to generate a transport stream TS.
- the transport stream TS includes a first video stream in which the two-dimensional image data is inserted during a predetermined program period in which the two-dimensional image data is output from the data extraction unit 111 or a predetermined period in the program.
- The transport stream TS also includes the second video stream, into which the 3D image data is inserted, during a predetermined program period, or a predetermined period within a program, in which the 3D image data is output from the data extraction unit 111.
- the transport stream TS further includes an advance information stream independent of the video stream.
- FIG. 8 shows a configuration example of a transport stream (multiplexed data stream) including a video elementary stream, an audio elementary stream, and an advance information elementary stream.
- This transport stream includes PES packets obtained by packetizing each elementary stream.
- the PES packet “Video PES” of the video elementary stream is included.
- the PES packet “Audio PES” of the audio elementary stream and the PES packet “Pre-Switching Control PES” of the advance information elementary stream are included.
- G_data as advance information is inserted in the advance information elementary stream.
- the transport stream includes a PMT (Program Map Table) as PSI (Program Specific Information).
- This PSI is information describing to which program each elementary stream included in the transport stream belongs.
- The transport stream also includes an EIT (Event Information Table) as SI (Service Information) for managing each event.
- the PMT has a program descriptor (Program Descriptor) that describes information related to the entire program.
- the PMT includes an elementary loop having information related to each elementary stream. In this configuration example, there are a video elementary loop, an audio elementary loop, and a private elementary loop.
- In each elementary loop, information such as a packet identifier (PID) and a stream type (Stream_Type) is arranged for each stream, and, although not shown, descriptors describing information related to each elementary stream are also placed.
- FIG. 9 shows a structure example (Syntax) of “G_data”.
- FIG. 10 shows main data definition contents (semantics) of “G_data” and “G_descriptor” described later.
- The 1-bit field "switching_bit" indicates whether there is a switch from the first video stream to the second video stream, or from the second video stream to the first video stream, in the next video stream period. "1" indicates that there is switching, and "0" indicates that there is no switching.
- The 1-bit field "next_sequence_2D" indicates whether the video stream after switching is the first video stream, into which the 2D image data is inserted, or the second video stream, into which the 3D image data is inserted. "1" indicates the first video stream, and "0" indicates the second video stream. When the above-described "switching_bit" field is "0", the "next_sequence_2D" field is skipped.
- the 14-bit field of “timing_information” indicates timing information, that is, the interval from the insertion timing of the advance information to the switching timing. This timing information is based on the video frame frequency of interest.
- the 2-bit field of “timing_type” indicates the data type of “timing_information” described above. “00” indicates that the timing information is an STC (System Time Clock) value on the time stamp grid (90 kHz base). “01” indicates that the timing information is a video frame count. “10” indicates that the timing information is a count in 0.5-second units. “11” indicates that the timing information is a count in 1-second units.
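The G_data field layout described above can be sketched as a small bit-level parser. This is an illustrative sketch only: the field order and the helper names (`parse_g_data`, `timing_info_to_seconds`) are assumptions based on the order in which the fields are described here; the normative syntax is the one in FIG. 9.

```python
def parse_g_data(data: bytes) -> dict:
    """Decode a G_data payload, assuming the fields appear in the
    order described above (an assumption; see FIG. 9)."""
    bits = int.from_bytes(data, "big")
    nbits = len(data) * 8
    pos = 0

    def take(n: int) -> int:
        nonlocal pos
        val = (bits >> (nbits - pos - n)) & ((1 << n) - 1)
        pos += n
        return val

    result = {"switching_bit": take(1)}
    if result["switching_bit"] == 1:
        # "next_sequence_2D" is skipped when "switching_bit" is "0".
        result["next_sequence_2D"] = take(1)
    result["timing_information"] = take(14)
    result["timing_type"] = take(2)
    return result


def timing_info_to_seconds(timing_information: int, timing_type: int,
                           frame_rate: float = 60.0) -> float:
    """Interpret "timing_information" according to "timing_type"."""
    if timing_type == 0b00:      # STC on the 90 kHz time-stamp grid
        return timing_information / 90000.0
    if timing_type == 0b01:      # video frame count
        return timing_information / frame_rate
    if timing_type == 0b10:      # count in 0.5-second units
        return timing_information * 0.5
    return timing_information * 1.0  # "11": count in 1-second units
```

For example, `timing_type` “01” with `timing_information` 120 at a 60 Hz frame frequency corresponds to a 2-second lead time before the switch.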
- the transmission data generation unit 110B illustrated in FIG. 11 is an example of the transmission data generation unit 110 in the broadcast station 100 that generates the above-described transport stream TS.
- this transmission data generation unit 110B shows an example in which the advance information is inserted in a layer higher than the picture layer of the video stream.
- in FIG. 11, portions corresponding to those in FIG. 4 are denoted by the same reference numerals, and detailed description thereof is omitted as appropriate.
- the transmission data generation unit 110B includes a data extraction unit (archive unit) 111, a video encoder 112B, an audio encoder 113, an advance information generation unit 114, and a multiplexer 116.
- the video encoder 112B performs H.264/AVC or MPEG2 encoding on the image data output from the data extraction unit 111 to obtain encoded video data.
- using a stream formatter (not shown) provided in the subsequent stage, the video encoder 112B generates a video elementary stream (first video stream) into which the two-dimensional image data is inserted when the image data is two-dimensional image data, and generates a video elementary stream (second video stream) into which the three-dimensional image data is inserted when the image data is three-dimensional image data. At this time, the video encoder 112B inserts G_data, as the advance information generated by the advance information generation unit 114, into a layer higher than the picture layer of the video stream. In this case, the video encoder 112B inserts a predetermined number of G_data, used at a predetermined number of insertion timings of the advance information, at the video stream positions corresponding to those insertion timings.
- for example, when the encoding is H.264/AVC, the video encoder 112B inserts G_data into the “SEIs” portion of the access unit.
- for example, when the encoding is MPEG2, the video encoder 112B inserts G_data into the user data area of the picture header.
- alternatively, when G_data is inserted into the header portion of the PES of the video stream, the multiplexer 116 inserts the G_data. That is, the multiplexer 116 inserts G_data into PES_private_data (128-bit fixed length) of the PES layer.
- FIG. 12 shows the correspondence between a video stream and advance information inserted in a layer higher than the picture layer of the video stream.
- advance information (G_data) is inserted into a layer higher than the picture layer of the video stream a predetermined time before the switching timing between the first video stream and the second video stream, for example, 2 seconds before and 1 second before.
- FIG. 13 shows a structural example (Syntax) of PES_private_data when G_data is inserted into PES_private_data (fixed length of 128 bits) of the PES layer.
- a specific value indicating G_data is defined in the 8-bit field of “private_data_type”.
- the 8-bit field of “G_data_length” indicates the size of the subsequent G_data.
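The 128-bit PES_private_data layout just described can be sketched as a small packer. This is a sketch under stated assumptions: the excerpt says only that a specific value of “private_data_type” indicates G_data, so the code point `PRIVATE_DATA_TYPE_G_DATA` and the `0xFF` stuffing value are hypothetical.

```python
# Hypothetical code point; the text only says a specific value of
# "private_data_type" indicates G_data, not which value it is.
PRIVATE_DATA_TYPE_G_DATA = 0xA0


def build_pes_private_data(g_data: bytes) -> bytes:
    """Pack G_data into the 128-bit (16-byte) PES_private_data field:
    8-bit private_data_type, 8-bit G_data_length, the G_data itself,
    then stuffing to the fixed length (stuffing value is an assumption)."""
    if len(g_data) > 14:
        raise ValueError("G_data must fit in 16 bytes minus the 2-byte header")
    payload = bytes([PRIVATE_DATA_TYPE_G_DATA, len(g_data)]) + g_data
    return payload + b"\xff" * (16 - len(payload))
```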
- FIG. 14 shows the correspondence between the video stream and the advance information inserted in PES_private_data of the PES layer.
- the multiplexer 116 packetizes and multiplexes the elementary streams generated by the video encoder 112B and the audio encoder 113, and generates a transport stream TS as a multiplexed stream.
- the other parts of the transmission data generation unit 110B shown in FIG. 11 are configured in the same manner as the transmission data generation unit 110A shown in FIG.
- the operation of the transmission data generation unit 110B shown in FIG. 11 will be briefly described.
- the image data (3D image data or 2D image data) output from the data extraction unit 111 is supplied to the video encoder 112B.
- in the video encoder 112B, H.264/AVC or MPEG2 encoding is performed on the image data, and a video elementary stream (video stream) including the encoded video data is generated.
- when the image data is two-dimensional image data, a video elementary stream (first video stream) into which the two-dimensional image data is inserted is generated.
- when the image data is three-dimensional image data, a video elementary stream (second video stream) into which the three-dimensional image data is inserted is generated.
- the video elementary stream generated by the video encoder 112B in this way is supplied to the multiplexer 116.
- audio data corresponding to the image data is also output from the data extracting unit 111.
- This audio data is supplied to the audio encoder 113.
- the audio encoder 113 performs encoding such as MPEG-2Audio AAC on the audio data, and generates an audio elementary stream including the encoded audio data.
- This audio elementary stream is supplied to the multiplexer 116.
- the advance information generation unit 114 generates, corresponding to each switching timing between the first video stream and the second video stream, a predetermined number of G_data (advance information) used at a predetermined number of insertion timings of the advance information.
- this G_data includes timing information indicating the interval from the insertion timing of the advance information to the switching timing, information indicating whether the switch is to the first video stream or to the second video stream, and the like.
- G_data generated by the advance information generating unit 114 is supplied to the video encoder 112B.
- in the video encoder 112B, G_data is inserted into a layer higher than the picture layer of the video stream.
- in this case, a predetermined number of G_data used at a predetermined number of insertion timings of the advance information are inserted at the video stream positions corresponding to those insertion timings.
- as a result, synchronization with the video stream is established for each G_data.
- when G_data is inserted into the header portion of the PES (Packetized Elementary Stream) of the video stream, the G_data generated by the advance information generation unit 114 is supplied to the multiplexer 116, and the multiplexer 116 inserts the G_data there.
- the transport stream TS includes the first video stream, into which the two-dimensional image data is inserted, during a predetermined program period, or a predetermined period within a program, in which the two-dimensional image data is output from the data extraction unit 111.
- G_data as advance information is inserted in a layer higher than the picture layer of this video stream.
- FIG. 15 shows a configuration example of a transport stream (multiplexed data stream) including a video elementary stream and an audio elementary stream.
- This transport stream includes PES packets obtained by packetizing each elementary stream.
- the PES packet “Video PES” of the video elementary stream is included.
- the PES packet “Audio PES” of the audio elementary stream is included.
- G_data as advance information is inserted in the video elementary stream.
- the transmission data generation unit 110C illustrated in FIG. 16 is an example of the transmission data generation unit 110 in the broadcasting station 100 that generates the above-described transport stream TS.
- the transmission data generation unit 110C shows an example in which the advance information is inserted under the program map table included in the multiplexed stream.
- in FIG. 16, portions corresponding to those in FIG. 4 are denoted by the same reference numerals, and detailed description thereof is omitted as appropriate.
- the transmission data generation unit 110C includes a data extraction unit (archive unit) 111, a video encoder 112, an audio encoder 113, an advance information generation unit 114, and a multiplexer 116C.
- the multiplexer 116C packetizes and multiplexes the elementary streams generated by the video encoder 112 and the audio encoder 113, and generates a transport stream TS as a multiplexed stream.
- the multiplexer 116C generates a G_descriptor as a descriptor including the advance information (G_data) generated by the advance information generation unit 114, and inserts this descriptor under the program map table (PMT). In this case, the multiplexer 116C inserts a predetermined number of G_descriptors, used at a predetermined number of insertion timings of the advance information, in accordance with those insertion timings.
- FIG. 17 shows the correspondence between the video stream and the advance information inserted under the program map table (PMT) of the transport stream TS.
- a descriptor including the advance information (G_data) is inserted a predetermined time before the switching timing between the first video stream and the second video stream, for example, 2 seconds before or 1 second before.
- the transmission data generation unit 110C shown in FIG. 16 is otherwise configured in the same manner as the transmission data generation unit 110A shown in FIG.
- Image data (3D image data or 2D image data) output from the data extraction unit 111 is supplied to the video encoder 112.
- in the video encoder 112, H.264/AVC or MPEG2 encoding is performed on the image data, and a video elementary stream (video stream) including the encoded video data is generated.
- when the image data is two-dimensional image data, a video elementary stream (first video stream) into which the two-dimensional image data is inserted is generated.
- when the image data is three-dimensional image data, a video elementary stream (second video stream) into which the three-dimensional image data is inserted is generated.
- the video elementary stream thus generated by the video encoder 112 is supplied to the multiplexer 116C.
- audio data corresponding to the image data is also output from the data extracting unit 111.
- This audio data is supplied to the audio encoder 113.
- the audio encoder 113 performs encoding such as MPEG-2Audio AAC on the audio data, and generates an audio elementary stream including the encoded audio data.
- This audio elementary stream is supplied to the multiplexer 116C.
- in the multiplexer 116C, the elementary streams generated by the video encoder 112 and the audio encoder 113 are packetized and multiplexed to generate a transport stream TS as a multiplexed stream.
- the advance information generation unit 114 generates, corresponding to each switching timing between the first video stream and the second video stream, a predetermined number of G_data (advance information) used at a predetermined number of insertion timings of the advance information.
- this G_data includes timing information indicating the interval from the insertion timing of the advance information to the switching timing, information indicating whether the switch is to the first video stream or to the second video stream, and the like.
- G_data generated by the advance information generating unit 114 is supplied to the multiplexer 116C.
- in the multiplexer 116C, a descriptor (G_descriptor) including the contents of the G_data generated by the advance information generation unit 114 as the advance information is created, and this descriptor is inserted under the program map table (PMT).
- in this case, the multiplexer 116C inserts a predetermined number of G_descriptors, used at a predetermined number of insertion timings of the advance information, in accordance with those insertion timings. Accordingly, synchronization with the video stream is achieved for each G_descriptor.
- FIG. 18 shows a configuration example of a transport stream (multiplexed data stream) including a video elementary stream and an audio elementary stream.
- This transport stream includes PES packets obtained by packetizing each elementary stream.
- the PES packet “Video PES” of the video elementary stream is included.
- the PES packet “Audio PES” of the audio elementary stream is included.
- the transport stream TS includes a PMT (Program Map Table) as PSI (Program Specific Information).
- This PSI is information describing to which program each elementary stream included in the transport stream belongs.
- the transport stream TS includes an EIT (Event Information Table) as SI (Service Information) for managing each event.
- FIG. 19 shows a structural example (Syntax) of “G_descriptor”.
- the 8-bit field of “descriptor_tag” indicates that this descriptor is “G_descriptor”.
- the 8-bit field of “descriptor_length” indicates the number of subsequent data bytes. Note that the “switching_bit”, “next_sequence_2D”, and “timing_information” fields are the same as described in the structure example of “G_data” above, and their description is therefore omitted here (see FIGS. 9 and 10).
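The G_descriptor layout can be sketched as a serializer/parser pair. This is an illustrative sketch: the tag code point `G_DESCRIPTOR_TAG`, the exact bit packing, and the reserved-bit placement are assumptions; the excerpt gives only the field names and widths (see FIG. 19).

```python
# Hypothetical tag value; the text only says "descriptor_tag" identifies
# the descriptor as "G_descriptor", not which code point is used.
G_DESCRIPTOR_TAG = 0xF0


def build_g_descriptor(switching_bit: int, next_sequence_2D: int,
                       timing_information: int, timing_type: int) -> bytes:
    """Serialize a G_descriptor: descriptor_tag, descriptor_length, then
    the G_data fields packed as switching_bit(1) next_sequence_2D(1)
    timing_information(14) timing_type(2) plus 6 reserved bits
    (the packing order is an assumption)."""
    body = ((switching_bit & 0x1) << 23) | ((next_sequence_2D & 0x1) << 22) \
         | ((timing_information & 0x3FFF) << 8) | ((timing_type & 0x3) << 6) \
         | 0x3F  # reserved bits set to '1'
    payload = body.to_bytes(3, "big")
    return bytes([G_DESCRIPTOR_TAG, len(payload)]) + payload


def parse_g_descriptor(desc: bytes) -> dict:
    assert desc[0] == G_DESCRIPTOR_TAG and desc[1] == len(desc) - 2
    body = int.from_bytes(desc[2:5], "big")
    return {
        "switching_bit": (body >> 23) & 0x1,
        "next_sequence_2D": (body >> 22) & 0x1,
        "timing_information": (body >> 8) & 0x3FFF,
        "timing_type": (body >> 6) & 0x3,
    }
```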
- as described above, advance information notifying in advance of switching between the first video stream, into which the two-dimensional image data is inserted, and the second video stream, into which the three-dimensional image data is inserted, is inserted into the transport stream TS as the multiplexed stream. Therefore, on the receiving side, by performing control based on this advance information, automatic switching of the shutter glasses mode can be performed appropriately at the actual switching timing between the two-dimensional image display state and the three-dimensional image display state.
- FIG. 20 shows a configuration example of the receiver 200.
- the receiver 200 includes a CPU 201, a flash ROM 202, a DRAM 203, an internal bus 204, a remote control receiver 205, and a remote controller transmitter 206.
- the receiver 200 also includes an antenna terminal 211, a digital tuner 212, a bit stream processing unit 213, a 3D signal processing unit 214, and a video signal control unit 215.
- the receiver 200 includes a panel drive unit 216, a display panel 217, an audio signal processing unit 218, an audio amplification unit 219, a speaker 220, and a shutter control unit 221.
- the CPU 201 controls the operation of each unit of receiver 200.
- the flash ROM 202 stores control software and data.
- the DRAM 203 constitutes a work area for the CPU 201.
- the CPU 201 develops software and data read from the flash ROM 202 on the DRAM 203, activates the software, and controls each unit of the receiver 200.
- the remote control receiving unit 205 receives the remote control signal (remote control code) transmitted from the remote control transmitter 206 and supplies it to the CPU 201.
- the CPU 201 controls each unit of the receiver 200 based on this remote control code.
- the CPU 201, flash ROM 202, and DRAM 203 are connected to each other via an internal bus 204.
- the antenna terminal 211 is a terminal for inputting a television broadcast signal received by a receiving antenna (not shown).
- the digital tuner 212 processes the television broadcast signal input to the antenna terminal 211 and outputs a predetermined transport stream (bit stream data) TS corresponding to the user's selected channel.
- the bit stream processing unit 213 extracts image data, audio data, and advance information (G_data) as content data from the transport stream TS.
- when the output image data of the bit stream processing unit 213 is three-dimensional image data, the 3D signal processing unit 214 performs processing corresponding to the transmission method and generates and outputs left eye image data and right eye image data for each frame. Note that when the output image data of the bit stream processing unit 213 is two-dimensional image data, the 3D signal processing unit 214 outputs the two-dimensional image data as it is.
- the video signal control unit 215 generates image data for displaying a three-dimensional image when the left eye image data and the right eye image data of each frame are output from the 3D signal processing unit 214. That is, the video signal control unit 215 generates image data for time-division display on the display panel 217 in the order left eye image → right eye image → left eye image → right eye image → .... The video signal control unit 215 generates image data for displaying a two-dimensional image when the two-dimensional image data is output from the 3D signal processing unit 214.
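The time-division ordering performed by the video signal control unit can be sketched as a simple interleaver. This is an illustrative sketch of the frame ordering only; the function name and list-of-frames representation are assumptions, not part of the disclosed apparatus.

```python
def interleave_stereo(left_frames, right_frames):
    """Arrange per-frame left/right image data in the time-division
    display order: left -> right -> left -> right -> ..."""
    ordered = []
    for left, right in zip(left_frames, right_frames):
        ordered.append(left)
        ordered.append(right)
    return ordered
```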
- the video signal control unit 215 outputs the generated image data to the panel drive unit 216.
- the panel drive unit 216 drives the display panel 217 based on the image data from the video signal control unit 215, and performs 3D image display or 2D image display on the display panel 217. That is, in the three-dimensional image display, the left eye image and the right eye image are displayed on the display panel 217 in a time division manner. In the two-dimensional image display, the two-dimensional image is continuously displayed on the display panel 217.
- the audio signal processing unit 218 performs necessary processing such as D / A conversion on the audio data obtained by the bit stream processing unit 213.
- the audio amplification unit 219 amplifies the audio signal output from the audio signal processing unit 218 and supplies the amplified audio signal to the speaker 220.
- the shutter control unit 221 generates a shutter control signal for controlling the shutter operation of the shutter glasses 300, based on the advance information acquired by the bit stream processing unit 213, a predetermined signal generated by the signal processing of the video signal control unit 215, and the like.
- the wireless communication unit 221a included in the shutter control unit 221 performs wireless communication with the shutter glasses 300 based on, for example, IEEE 802.15.4.
- the wireless communication unit 221 a transmits the shutter control signal generated by the shutter control unit 221 to the shutter glasses 300.
- the shutter control signal includes, for example, shutter opening / closing cycle and L / R phase information thereof, information on shutter opening time, and the like.
- the shutter control unit 221 generates a shutter mode switching command as one of the shutter control signals.
- This shutter mode switching command is a control signal for switching the shutter mode of the shutter glasses 300 to the normal mode (Normal Mode) or the open mode (Open Mode).
- in the normal mode, the left eye shutter and the right eye shutter are alternately brought into the open state in correspondence with the display of the left eye image and the right eye image on the display panel 217, respectively.
- in the open mode, both the left eye shutter and the right eye shutter are kept in the open state.
- information indicating the time until the shutter mode is switched, and further information indicating whether the switch is to the open mode or to the normal mode, are added to this shutter mode switching command.
- the shutter control unit 221 generates this shutter mode switching command based on the advance information and transmits it to the shutter glasses 300 before the switching timing between the two-dimensional image display and the three-dimensional image display. An example of the transmission timing of the shutter mode switching command from the shutter control unit 221 to the shutter glasses 300 will be described later.
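A shutter mode switching command carrying the two pieces of information above can be sketched as a small payload structure. This is a sketch under stated assumptions: the mode code points and the 3-byte wire format are hypothetical; the excerpt says only that the command carries the time until switching and the target mode.

```python
from dataclasses import dataclass

# Hypothetical code points; not specified in the source text.
NORMAL_MODE = 0  # alternate L/R shutters (three-dimensional viewing)
OPEN_MODE = 1    # both shutters held open (two-dimensional viewing)


@dataclass
class ShutterModeSwitchCommand:
    time_until_switch_ms: int  # time until the mode change takes effect
    target_mode: int           # NORMAL_MODE or OPEN_MODE

    def encode(self) -> bytes:
        # Hypothetical 3-byte format: 1 mode byte + 16-bit big-endian time.
        return bytes([self.target_mode]) + \
               self.time_until_switch_ms.to_bytes(2, "big")

    @classmethod
    def decode(cls, payload: bytes) -> "ShutterModeSwitchCommand":
        return cls(time_until_switch_ms=int.from_bytes(payload[1:3], "big"),
                   target_mode=payload[0])
```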
- FIG. 21 shows a configuration example of the shutter glasses 300.
- the shutter glasses 300 include a control unit 301, a wireless communication unit 302, a shutter drive unit 303, and a glasses unit 304.
- the control unit 301 controls the operation of each unit of the shutter glasses 300.
- the wireless communication unit 302 performs wireless communication with the receiver 200 based on, for example, IEEE 802.15.4.
- the wireless communication unit 302 receives a shutter control signal transmitted from the wireless communication unit 221 a in the shutter control unit 221 of the receiver 200.
- the shutter driving unit 303 drives the left eye shutter 300L and the right eye shutter 300R of the glasses unit 304 under the control of the control unit 301 based on the shutter control signal received by the wireless communication unit 302. As described above, there is a shutter mode switching command as one of the shutter control signals. Information indicating the time until the shutter mode is switched and information indicating whether the mode is switched to the open mode or the normal mode are added to the shutter mode switching command.
- the shutter driving unit 303 switches the shutter driving from the normal mode to the open mode or from the open mode to the normal mode based on the shutter mode switching command.
- in the normal mode, the shutter driving unit 303 opens the left eye shutter 300L at the timing when the left eye image is displayed on the display panel 217 of the receiver 200, and opens the right eye shutter 300R at the timing when the right eye image is displayed on the display panel 217.
- in the open mode, the shutter drive unit 303 keeps both the left-eye shutter 300L and the right-eye shutter 300R open, corresponding to the two-dimensional image display on the display panel 217 of the receiver 200. This reduces power consumption and contributes to improved visual health.
- a television broadcast signal input to the antenna terminal 211 is supplied to the digital tuner 212.
- in the digital tuner 212, the television broadcast signal is processed, and bit stream data as a predetermined transport stream TS corresponding to the user's selected channel is obtained.
- the bit stream data output from the digital tuner 212 is supplied to the bit stream processing unit 213.
- the bit stream processing unit 213 extracts image data, audio data, and advance information (G_data) from the bit stream data.
- the image data extracted by the bit stream processing unit 213 is supplied to the 3D signal processing unit 214.
- in the 3D signal processing unit 214, when the output image data of the bit stream processing unit 213 is three-dimensional image data, processing corresponding to the transmission method is performed, and left eye image data and right eye image data for each frame are generated and output. Further, when the output image data of the bit stream processing unit 213 is two-dimensional image data, the 3D signal processing unit 214 outputs the two-dimensional image data as it is. The image data output from the 3D signal processing unit 214 is supplied to the video signal control unit 215.
- in the video signal control unit 215, when the left eye image data and the right eye image data of each frame are output from the 3D signal processing unit 214, image data for displaying a three-dimensional image is generated. That is, the video signal control unit 215 generates image data for time-division display on the display panel 217 in the order left eye image → right eye image → left eye image → right eye image → .... In addition, when the two-dimensional image data is output from the 3D signal processing unit 214, the video signal control unit 215 generates image data for displaying a two-dimensional image.
- the image data generated by the video signal control unit 215 is supplied to the panel drive unit 216.
- in the panel drive unit 216, the display panel 217 is driven based on the image data from the video signal control unit 215, and three-dimensional image display or two-dimensional image display is performed on the display panel 217. That is, in the three-dimensional image display, the left eye image and the right eye image are displayed on the display panel 217 in a time-division manner. In the two-dimensional image display, the two-dimensional image is continuously displayed on the display panel 217.
- the audio data extracted by the bit stream processing unit 213 is supplied to the audio signal processing unit 218.
- in the audio signal processing unit 218, necessary processing such as D/A conversion is performed on the audio data.
- the audio data is amplified by the audio amplification unit 219 and then supplied to the speaker 220. Therefore, sound corresponding to the display image on the display panel 217 is output from the speaker 220.
- a predetermined signal generated based on signal processing is supplied from the video signal control unit 215 to the shutter control unit 221.
- the advance information extracted by the bit stream processing unit 213 is supplied to the shutter control unit 221.
- the shutter control unit 221 generates a shutter control signal such as a shutter mode switching command for controlling the shutter operation of the shutter glasses 300 based on the advance information and the like.
- the shutter control signal is transmitted to the shutter glasses 300 through the wireless communication unit 221a.
- the shutter control signal is transmitted intermittently from the viewpoint of suppressing power consumption in the shutter glasses 300.
- the shutter mode switching command is transmitted to the shutter glasses 300 before the switching timing between the two-dimensional image display and the three-dimensional image display.
- the wireless communication unit 302 of the shutter glasses 300 receives the shutter control signal transmitted from the wireless communication unit 221a in the shutter control unit 221 of the receiver 200.
- in the shutter driving unit 303, the left eye shutter 300L and the right eye shutter 300R of the glasses unit 304 are driven under the control of the control unit 301, based on the shutter control signal received by the wireless communication unit 302.
- the shutter mode of the glasses unit 304 is switched to the normal mode at the switching timing to the three-dimensional image display state in which the left eye image and the right eye image are sequentially displayed on the display panel 217 of the receiver 200.
- the shutter mode of the glasses unit 304 is switched to the open mode at the switching timing to the two-dimensional image display state in which the two-dimensional image is displayed on the display panel 217 of the receiver 200.
- in the normal mode, the left eye shutter 300L is opened at the timing when the left eye image is displayed on the display panel 217 of the receiver 200, and the right eye shutter 300R is opened at the timing when the right eye image is displayed on the display panel 217. Therefore, the user wearing the shutter glasses 300 can perceive only the left eye image with the left eye and only the right eye image with the right eye, and can perceive a stereoscopic three-dimensional (3D) image based on the left eye image and the right eye image displayed on the display panel 217.
- in the open mode, both the left-eye shutter 300L and the right-eye shutter 300R remain open. Therefore, the user wearing the shutter glasses 300 can observe the two-dimensional image displayed on the display panel 217 of the receiver 200 in a natural state.
- the shutter mode switching command is transmitted from the shutter control unit 221 of the receiver 200 to the shutter glasses 300 before the switching timing between the two-dimensional image display and the three-dimensional image display. Therefore, in the shutter glasses 300, switching between the normal mode and the open mode is appropriately performed at the switching timing between the 3D image display and the 2D image display.
- FIG. 22 shows the correspondence between the video stream included in the transport stream TS received by the receiver 200, that is, in the bit stream data, and the advance information extracted from the bit stream data by the bit stream processing unit 213.
- FIG. 22 schematically shows the transmission timing of the shutter mode switching command from the shutter control unit 221 of the receiver 200 to the shutter glasses 300.
- FIG. 22 shows an example in which G_data as advance information is inserted into an advance information stream that constitutes the transport stream TS and is independent of the video stream (see FIGS. 7 and 8).
- the shutter control unit 221 of the receiver 200 immediately measures and records its own clock at the timing A when the G_data is received.
- the time delay T from the time position of the G_data to the shutter mode switching time is obtained from the PTS and the value of “timing_information” described in the G_data.
- the shutter control unit 221 of the receiver 200 measures the elapsed time U from the timing A described above, and obtains a value S by subtracting U from the time delay T, as shown in equation (1) below. Further, a value T3 is obtained by subtracting R from S, as in equation (2).
- S = T − U   (1)
- T3 = S − R   (2)
- T3 (= S − R) equals the time W from when the command is received to when the shutter mode is switched, for the shutter mode switching command (shutter control signal) sent on the temporal grid of the transmission cycle Tp. That is, the T3 (= S − R) information is added, as information indicating the time until the shutter mode is switched, to the shutter mode switching command transmitted from the shutter control unit 221 of the receiver 200 on the temporal grid having the transmission period Tp.
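Equations (1) and (2) above can be sketched directly in code. This is a straightforward transcription of the arithmetic; the function name and the choice of milliseconds as the time unit are illustrative.

```python
def compute_t3(T: int, U: int, R: int) -> int:
    """Time value added to the shutter mode switching command.

    All times are in the same unit (e.g. milliseconds):
      T: delay from the recorded reception timing A to the switching timing
      U: time elapsed since timing A
      R: time remaining until the next transmission-grid point of period Tp
    """
    S = T - U    # equation (1): time still left until the switch
    T3 = S - R   # equation (2): time left as counted from the next grid point
    return T3
```

For example, T = 2000 ms, U = 500 ms, and R = 300 ms give T3 = 1200 ms.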
- the shutter glasses 300 detect the parameters added to the shutter mode switching command sent from the shutter control unit 221 of the receiver 200. That is, in the shutter glasses 300, information indicating the time until the shutter mode is switched (time information until switching) and information indicating whether the switch is to the normal mode or to the open mode (mode information after switching) are detected.
- the shutter mode switching timing is controlled based on the time information until switching.
- the shutter mode after switching is controlled based on the mode information after switching. Therefore, in the shutter glasses 300, switching between the normal mode and the open mode is appropriately performed at the switching timing between the three-dimensional image display and the two-dimensional image display.
- the advance information (G_data) inserted into the received transport stream TS is extracted.
- This advance information is information informing in advance of switching between the first video stream in which the 2D image data is inserted and the second video stream in which the 3D image data is inserted.
- the receiver 200 can transmit a shutter mode switching command to the shutter glasses 300 before the switching timing between the two-dimensional image display and the three-dimensional image display. Therefore, automatic switching of the shutter glasses mode can be appropriately performed at the actual switching timing of the two-dimensional image display state and the three-dimensional image display state.
- FIG. 23 shows a configuration example of an image transmission / reception system 10A as the second embodiment.
- the image transmission / reception system 10A includes a broadcasting station 100, a set top box (STB) 400, and a receiver 200A.
- FIG. 23 parts corresponding to those in FIG. 1 are denoted by the same reference numerals, and detailed description thereof will be omitted as appropriate.
- the set-top box 400 and the receiver 200A are connected via an HDMI (High Definition Multimedia Interface) cable 500.
- the set top box 400 is provided with an HDMI terminal 402.
- the receiver 200A is provided with an HDMI terminal 231.
- One end of the HDMI cable 500 is connected to the HDMI terminal 402 of the set top box 400, and the other end of the HDMI cable 500 is connected to the HDMI terminal 231 of the receiver 200A.
- the broadcasting station 100 transmits, on broadcast waves, a transport stream TS as a multiplexed stream that includes a first video stream (first video elementary stream) and a second video stream (second video elementary stream) in a time-division manner.
- the broadcast station 100 includes a transmission data generation unit 110 that generates the transport stream TS.
- Two-dimensional (2D) image data is inserted into the first video stream.
- three-dimensional (3D) image data is inserted into the second video stream.
- the video stream is, for example, an H.264/AVC video stream or an MPEG2 video stream.
- advance information that gives prior notice of switching between the first video stream and the second video stream is inserted into the transport stream TS.
- Timing information indicating an interval from the insertion timing of the advance information to the switching timing is added to the advance information.
- information indicating whether switching to the first video stream or switching to the second video stream is added to the advance information.
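As a reading aid, the advance information described above can be modeled as a small record; this is a hypothetical sketch (the field names `to_3d` and `frames_until_switch` are illustrative, not taken from the specification — the actual bit syntax is defined later with the Vendor Specific InfoFrame):

```python
from dataclasses import dataclass

# Hypothetical model of the advance information ("G_data").
# Field names are illustrative; the real syntax uses switching_bit,
# timing_type, timing_information, etc. (see the packet layout later).
@dataclass
class AdvanceInfo:
    to_3d: bool               # True: switch to the second (3D) stream; False: to the first (2D) stream
    frames_until_switch: int  # interval from the insertion timing to the switching timing

def describe(info: AdvanceInfo) -> str:
    """Human-readable summary of a pending stream switch."""
    target = "3D" if info.to_3d else "2D"
    return f"switch to {target} stream in {info.frames_until_switch} frames"
```

A receiver holding such a record has everything it needs to schedule the shutter mode switching command ahead of the actual stream change.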
- the advance information is inserted into a stream that constitutes the transport stream and is independent of the video stream. Further, for example, the advance information is inserted in a layer higher than the picture layer of the video stream. In this case, for example, the advance information is inserted into the header part of the PES constituting the video stream. Further, for example, the advance information is inserted under a program map table (PMT: Program Map Table) included in the multiplexed stream.
- the set top box 400 receives the transport stream TS transmitted from the broadcasting station 100 on the broadcast wave.
- the set top box 400 acquires image data from the video stream included in the received transport stream TS.
- the image data is two-dimensional image data when the video stream is the first video stream containing 2D image data, and three-dimensional image data when the video stream is the second video stream containing 3D image data.
- the set top box 400 acquires advance information (G_data) from the received transport stream TS. Then, the set top box 400 transmits the image data and the advance information to the receiver 200A through the HDMI digital interface, that is, through the HDMI cable 500 as a transmission path.
- FIG. 24 shows a configuration example of the set top box 400.
- the set top box 400 includes a bit stream processing unit 401, an HDMI terminal 402, an antenna terminal 403, a digital tuner 404, a video signal processing circuit 405, an HDMI transmission unit 406, and an audio signal processing circuit 407.
- the set top box 400 includes a CPU 411, a flash ROM 412, a DRAM 413, an internal bus 414, a remote control receiving unit 415, and a remote control transmitter 416.
- the antenna terminal 403 is a terminal for inputting a television broadcast signal received by a receiving antenna (not shown).
- the digital tuner 404 processes the television broadcast signal input to the antenna terminal 403 and outputs a predetermined transport stream (bit stream data) TS corresponding to the user's selected channel.
- the bit stream processing unit 401 has the same configuration as the bit stream processing unit 213 in the receiver 200 shown in FIG. 20 described above, and extracts image data, audio data, advance information (G_data), and the like as content data from the transport stream TS.
- when the video stream included in the transport stream TS is the first video stream (containing 2D image data), two-dimensional image data is extracted as the image data.
- when the video stream is the second video stream (containing 3D image data), three-dimensional image data is extracted as the image data.
- the video signal processing circuit 405 performs scaling processing, image quality adjustment processing, and the like on the image data (two-dimensional image data, three-dimensional image data) output from the bit stream processing unit 401 as necessary.
- the image data is supplied to the HDMI transmission unit 406.
- the audio signal processing circuit 407 performs sound quality adjustment processing or the like on the audio data output from the bit stream processing unit 401 as necessary, and supplies the processed audio data to the HDMI transmission unit 406.
- the HDMI transmission unit 406 transmits the image data and audio data from the HDMI terminal 402 by communication conforming to HDMI. At this time, the HDMI transmission unit 406 adds the advance information (G_data) extracted by the bit stream processing unit 401 as described above to the image data and transmits it. Since transmission uses the HDMI TMDS channels, the image and audio data are packed and output from the HDMI transmission unit 406 to the HDMI terminal 402.
- the HDMI transmission unit 406 conforms to, for example, HDMI 1.4 and can handle 3D image data. Details of the HDMI transmission unit 406 will be described later.
- the CPU 411 controls the operation of each part of the set top box 400.
- the flash ROM 412 stores control software and data.
- the DRAM 413 constitutes a work area for the CPU 411.
- the CPU 411 develops software and data read from the flash ROM 412 on the DRAM 413 to activate the software, and controls each unit of the set top box 400.
- the remote control receiving unit 415 receives the remote control signal (remote control code) transmitted from the remote control transmitter 416 and supplies it to the CPU 411.
- the CPU 411 controls each part of the set top box 400 based on the remote control code.
- the CPU 411, flash ROM 412 and DRAM 413 are connected to the internal bus 414.
- a television broadcast signal input to the antenna terminal 403 is supplied to the digital tuner 404.
- the digital tuner 404 processes the television broadcast signal and outputs a predetermined transport stream (bit stream data) TS corresponding to the user's selected channel.
- the transport stream TS output from the digital tuner 404 is supplied to the bit stream processing unit 401.
- the bit stream processing unit 401 extracts image data (two-dimensional image data, three-dimensional image data), audio data, advance information (G_data), and the like from the transport stream TS.
- the image data extracted by the bit stream processing unit 401 is subjected to scaling processing, image quality adjustment processing, and the like as necessary by the video signal processing circuit 405, and then supplied to the HDMI transmission unit 406.
- the audio data extracted by the bit stream processing unit 401 is supplied to the HDMI transmission unit 406 after the audio signal processing circuit 407 performs sound quality adjustment processing or the like as necessary.
- the advance information (G_data) extracted by the bit stream processing unit 401 is supplied to the HDMI transmission unit 406.
- the image data and audio data supplied to the HDMI transmission unit 406 and the advance information (G_data) are transmitted from the HDMI terminal 402 to the HDMI cable 500 via the HDMI TMDS channel.
- the receiver 200A receives image data and audio data sent from the set top box 400 via the HDMI cable 500, and further, advance information (G_data).
- the receiver 200A has a display panel 217. Based on the image data (two-dimensional image data or three-dimensional image data), the receiver 200A performs on the display panel 217 either two-dimensional image display or three-dimensional image display in which a left-eye image and a right-eye image are displayed alternately. That is, when 2D image data is sent from the set top box 400, 2D image display is performed, and when 3D image data is sent from the set top box 400, 3D image display is performed.
- the receiver 200A includes a shutter control unit 221. Based on the advance information (G_data), the receiver 200A uses the shutter control unit 221 to transmit a shutter mode switching command to the shutter glasses 300 before the switching timing between the two-dimensional image display and the three-dimensional image display.
- information indicating the time until the shutter mode is switched, and information indicating whether the switch is to the shutter mode corresponding to 2D image display or to the shutter mode corresponding to 3D image display, are added to the shutter mode switching command.
- the shutter glasses 300 appropriately switch the shutter mode at the actual switching timing between the two-dimensional image display state and the three-dimensional image display state based on the shutter mode switching command.
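The behavior described above — a command received early, taking effect only at the actual switching timing — can be sketched as a simple frame-by-frame simulation. This is an illustrative model, not the actual control logic of the shutter glasses 300:

```python
def shutter_mode_at(frame, commands, initial="normal"):
    """Return the shutter mode in effect at a given frame.

    commands: list of (received_frame, delay_frames, new_mode) tuples.
    A command received at received_frame takes effect delay_frames
    later, mirroring the "time until switching" information carried
    in the shutter mode switching command.
    """
    mode = initial
    for received, delay, new_mode in sorted(commands):
        if frame >= received + delay:
            mode = new_mode
    return mode
```

For example, a command received at frame 0 with a 5-frame delay leaves the glasses in the normal mode through frame 4 and switches them to the open mode exactly at frame 5, the actual display switching timing.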
- the configuration of the shutter glasses 300 is the same as described above (see FIG. 21).
- FIG. 25 illustrates a configuration example of the receiver 200A.
- the receiver 200A includes a CPU 201, a flash ROM 202, a DRAM 203, an internal bus 204, a remote control receiving unit 205, and a remote control transmitter 206.
- the receiver 200A also includes an antenna terminal 211, a digital tuner 212, a bit stream processing unit 213, a 3D signal processing unit 214, and a video signal control unit 215.
- the receiver 200A further includes a panel driving unit 216, a display panel 217, an audio signal processing unit 218, an audio amplification unit 219, a speaker 220, and a shutter control unit 221. The receiver 200A also includes an HDMI terminal 231 and an HDMI receiving unit 232.
- the HDMI receiving unit 232 receives image data and audio data and further advance information (G_data) sent to the HDMI terminal 231 via the HDMI cable 500 by communication conforming to HDMI.
- the HDMI receiving unit 232 conforms to, for example, HDMI 1.4 and can handle 3D image data. Details of the HDMI receiving unit 232 will be described later.
- the 3D signal processing unit 214 selectively performs processing on the image data received by the HDMI receiving unit 232 or extracted by the bit stream processing unit 213.
- the processing contents in the 3D signal processing unit 214 are the same as those of the 3D signal processing unit 214 of the receiver 200 shown in FIG. 20.
- the audio signal processing unit 218 selectively performs processing on the audio data received by the HDMI receiving unit 232 or extracted by the bit stream processing unit 213.
- the processing content in the audio signal processing unit 218 is the same as that of the audio signal processing unit 218 of the receiver 200 shown in FIG. 20.
- the shutter control unit 221 generates a shutter control signal for controlling the shutter operation of the shutter glasses 300, based on the advance information (G_data) received by the HDMI receiving unit 232 or extracted by the bit stream processing unit 213, a predetermined signal generated through the signal processing by the video signal control unit 215, and the like.
- when the image data received by the HDMI receiving unit 232 is used, the advance information (G_data) received by the HDMI receiving unit 232 is used.
- when the image data extracted by the bit stream processing unit 213 is used, the advance information (G_data) extracted by the bit stream processing unit 213 is used.
- the processing content in the shutter control unit 221 is the same as that of the shutter control unit 221 of the receiver 200 shown in FIG. 20.
- the operation of the receiver 200A shown in FIG. 25 will be described. The operations using the image data, audio data, and advance information (G_data) extracted by the bit stream processing unit 213 are the same as those of the receiver 200 shown in FIG. 20. Here, only the operation using the image data, audio data, and advance information (G_data) received by the HDMI receiving unit 232 will be described.
- the HDMI receiving unit 232 receives image data and audio data and further advance information (G_data) transmitted from the set top box 400 connected to the HDMI terminal 231 via the HDMI cable 500.
- the image data (two-dimensional image data, three-dimensional image data) received by the HDMI receiving unit 232 is supplied to the 3D signal processing unit 214.
- the audio data received by the HDMI receiving unit 232 is supplied to the audio signal processing unit 218.
- the advance information (G_data) received by the HDMI receiving unit 232 is supplied to the shutter control unit 221.
- in the 3D signal processing unit 214, when the image data received by the HDMI receiving unit 232 is three-dimensional image data, processing corresponding to the transmission method is performed, and left-eye image data and right-eye image data for each frame are generated and output. When the image data received by the HDMI receiving unit 232 is two-dimensional image data, the two-dimensional image data is output as it is. The image data output from the 3D signal processing unit 214 is supplied to the video signal control unit 215.
- in the video signal control unit 215, when the left-eye image data and the right-eye image data of each frame are output from the 3D signal processing unit 214, image data for stereoscopic image display is generated. That is, the video signal control unit 215 generates image data for time-division display on the display panel 217 in the order left-eye image → right-eye image → left-eye image → right-eye image → .... In addition, when 2D image data is output from the 3D signal processing unit 214, the video signal control unit 215 generates image data for two-dimensional image display.
- the image data generated by the video signal control unit 215 is supplied to the panel drive unit 216.
- in the panel driving unit 216, the display panel 217 is driven based on the image data from the video signal control unit 215, and three-dimensional image display or two-dimensional image display is performed on the display panel 217. That is, in the three-dimensional image display, the left-eye image and the right-eye image are displayed on the display panel 217 in a time-division manner. In the two-dimensional image display, the two-dimensional image is displayed continuously on the display panel 217.
- the audio data received by the HDMI receiving unit 232 is supplied to the audio signal processing unit 218.
- in the audio signal processing unit 218, necessary processing such as D/A conversion is performed on the audio data.
- the audio data is amplified by the audio signal amplification circuit 219 and then supplied to the speaker 220. Therefore, sound corresponding to the display image on the display panel 217 is output from the speaker 220.
- a predetermined signal generated based on signal processing is supplied from the video signal control unit 215 to the shutter control unit 221.
- the advance information (G_data) received by the HDMI receiving unit 232 is supplied to the shutter control unit 221.
- the shutter control unit 221 generates a shutter control signal such as a shutter mode switching command for controlling the shutter operation of the shutter glasses 300 based on the advance information and the like.
- the shutter control signal is transmitted to the shutter glasses 300 through the wireless communication unit 221a.
- the shutter control signal is transmitted intermittently from the viewpoint of suppressing power consumption in the shutter glasses 300.
- the shutter mode switching command is transmitted to the shutter glasses 300 before the switching timing between the two-dimensional image display and the three-dimensional image display.
- FIG. 26 shows a correspondence relationship between the video stream included in the transport stream TS received by the set top box 400 and the advance information (G_data) extracted by the bit stream processing unit 401 from this bit stream data.
- FIG. 26 schematically shows the transmission timing of the shutter mode switching command from the shutter controller 221 of the receiver 200A to the shutter glasses 300 based on the advance information (G_data) sent from the set top box 400.
- This FIG. 26 is the same as FIG. 22 except that the HDMI transmission unit 406 and the HDMI reception unit 232 are interposed, and thus detailed description thereof is omitted.
- FIG. 27 illustrates a configuration example of the HDMI transmission unit (HDMI source) 406 of the set-top box 400 and the HDMI reception unit (HDMI sink) 232 of the receiver 200A in the image transmission / reception system 10A of FIG.
- the HDMI transmission unit 406 transmits a differential signal corresponding to pixel data of an uncompressed image for one screen in an effective image section (hereinafter, also referred to as an active video section as appropriate) using a plurality of channels.
- the effective image section is a section obtained by removing the horizontal blanking section and the vertical blanking section from the section from one vertical synchronization signal to the next vertical synchronization signal.
- the HDMI transmission unit 406 transmits differential signals corresponding to at least the audio data, control data, and other auxiliary data accompanying the image to the HDMI receiving unit 232 in one direction on a plurality of channels in the horizontal blanking interval or the vertical blanking interval.
- the transmission channels of the HDMI system including the HDMI transmission unit 406 and the HDMI reception unit 232 include the following. There are three TMDS channels #0 to #2 as transmission channels for serially transmitting pixel data and audio data in one direction from the HDMI transmission unit 406 to the HDMI reception unit 232 in synchronization with the pixel clock. There is also a TMDS clock channel as a transmission channel for transmitting the pixel clock.
- the HDMI transmission unit 406 includes an HDMI transmitter 81.
- the HDMI transmitter 81 converts, for example, pixel data of an uncompressed image into the corresponding differential signals, and serially transmits them in one direction over the three TMDS channels #0, #1, and #2 to the HDMI receiving unit 232 connected via the HDMI cable 500.
- the transmitter 81 also converts the audio data accompanying the uncompressed image, as well as necessary control data and other auxiliary data, into the corresponding differential signals, and serially transmits them in one direction over the three TMDS channels #0, #1, and #2 to the HDMI receiving unit 232.
- the transmitter 81 transmits the pixel clock synchronized with the pixel data transmitted through the three TMDS channels # 0, # 1, and # 2 to the HDMI receiving unit 232 connected via the HDMI cable 500 using the TMDS clock channel. Send.
- the HDMI receiving unit 232 receives the differential signal corresponding to the pixel data transmitted from the HDMI transmitting unit 406 in one direction through a plurality of channels in the active video section. Further, the HDMI receiving unit 232 transmits differential signals corresponding to audio data and control data transmitted in one direction from the HDMI transmitting unit 406 through a plurality of channels in a horizontal blanking interval or a vertical blanking interval. Receive.
- the HDMI receiving unit 232 includes the HDMI receiver 82.
- this HDMI receiver 82 receives, over TMDS channels #0, #1, and #2, the differential signals corresponding to the pixel data and the differential signals corresponding to the audio data and control data transmitted in one direction from the HDMI transmission unit 406. In this case, reception is performed in synchronization with the pixel clock transmitted from the HDMI transmission unit 406 over the TMDS clock channel.
- the transmission channels of the HDMI system include transmission channels called DDC (Display Data Channel) 83 and CEC line 84 in addition to the above-described TMDS channels # 0 to # 2 and the TMDS clock channel.
- the DDC 83 includes two signal lines (not shown) included in the HDMI cable 500.
- the DDC 83 is used by the HDMI transmitting unit 406 to read E-EDID (Enhanced Extended Display Identification Data) from the HDMI receiving unit 232.
- the HDMI receiving unit 232 has, in addition to the HDMI receiver 82, an EDID ROM (Read Only Memory) 85 that stores E-EDID, which is performance information describing its own configuration/capability.
- the HDMI transmission unit 406 reads E-EDID from the HDMI reception unit 232 connected via the HDMI cable 500 via the DDC 83.
- the HDMI transmission unit 406 sends the read E-EDID to the CPU 411.
- the CPU 411 stores this E-EDID in the flash ROM 412 or the DRAM 413.
- the CPU 411 can recognize the performance settings of the HDMI receiving unit 232 based on the E-EDID. For example, the CPU 411 recognizes whether the receiver 200A having the HDMI receiving unit 232 can handle three-dimensional image data and, if so, which TMDS transmission data structures it supports.
- the CEC line 84 is made up of one signal line (not shown) included in the HDMI cable 500, and is used for bidirectional communication of control data between the HDMI transmission unit 406 and the HDMI reception unit 232.
- the CEC line 84 constitutes a control data line.
- the HDMI cable 500 includes a line (HPD line) 86 connected to a pin called HPD (Hot Plug Detect).
- HPD line 86 is also used as a HEAC-line constituting a bidirectional communication path.
- the HDMI cable 500 includes a line (power line) 87 used for supplying power from the source device to the sink device.
- the HDMI cable 500 includes a utility line 88.
- the utility line 88 is also used as a HEAC + line constituting a bidirectional communication path.
- FIG. 28 shows an example of the structure of TMDS transmission data.
- FIG. 28 shows sections of various transmission data when image data of horizontal ⁇ vertical 1920 pixels ⁇ 1080 lines is transmitted in TMDS channels # 0, # 1, and # 2.
- in a video field (Video Field) in which transmission data is transmitted over the three HDMI TMDS channels #0, #1, and #2, there are three types of sections according to the type of transmission data. These three types of sections are the video data period (Video Data period), the data island period (Data Island period), and the control period (Control period).
- the video field period is a period from the rising edge (active edge) of a certain vertical synchronizing signal to the rising edge of the next vertical synchronizing signal.
- This video field period is divided into a horizontal blanking period (horizontal blanking), a vertical blanking period (vertical blanking), and an active video period (Active Video).
- This active video section is a section obtained by removing the horizontal blanking period and the vertical blanking period from the video field section.
- the video data section is assigned to the active video section.
- in the video data section, data of active pixels (Active pixels) for 1920 pixels × 1080 lines constituting uncompressed image data for one screen is transmitted.
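A quick numeric check of how the video field splits between the active video section and the blanking periods. The total timing of 2200 × 1125 is the common 1080p raster assumed here for illustration; it is not stated in this document:

```python
# Active video vs. blanking for 1920x1080 transmission over TMDS.
# Totals 2200x1125 are the usual 1080p timing, assumed for illustration.
h_total, v_total = 2200, 1125
h_active, v_active = 1920, 1080

active_pixels = h_active * v_active              # carried in the video data section
total_pixels = h_total * v_total                 # one full video field
blanking_pixels = total_pixels - active_pixels   # carries data islands and control periods
```

Under this assumption, 2,073,600 pixel slots per field carry image data and 401,400 slots fall in blanking, where the data island and control sections are placed.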
- Data island section and control section are assigned to horizontal blanking period and vertical blanking period.
- in the data island section and the control section, auxiliary data (Auxiliary data) is transmitted. That is, the data island section is assigned to parts of the horizontal blanking period and the vertical blanking period.
- in this data island section, audio data packets, which are the auxiliary data not related to control, are transmitted.
- the control section is assigned to other parts of the horizontal blanking period and the vertical blanking period.
- in this control period, vertical synchronization signals, horizontal synchronization signals, control packets, and other auxiliary data related to control are transmitted.
- FIG. 29 shows the packet structure of the HDMI Vendor Specific InfoFrame. Since this HDMI Vendor Specific InfoFrame is defined in CEA-861-D, detailed description is omitted.
- 3-bit information “HDMI_Video_Format”, indicating the type of image data, is arranged from the 7th bit to the 5th bit of the 4th byte (PB4).
- “3D_Meta_present” is arranged in the third bit of the fifth byte (PB5); when the Vendor Specific InfoFrame extension is specified, this one bit is set to “1”.
- “3D_Metadata_type” is arranged from the 7th bit to the 5th bit of the 7th byte (PB7). For the advance information (G_data), an unused value of this 3-bit information, for example “100”, is used.
- 3D_Metadata_length is arranged from the 4th bit to the 0th bit of the 7th byte (PB7).
- This 5-bit information indicates the length of the 3D_Metadata area to be arranged thereafter.
- switching_bit is arranged at the seventh bit of the eighth byte (PB8)
- no_sequence_2D is arranged at the sixth bit
- timing_type is arranged from the fifth bit to the fourth bit.
- timing_information is arranged from the third bit to the 0th bit of the eighth byte (PB8) and from the seventh bit to the 0th bit of the eighth + 1 byte (PB8 + 1).
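The bit layout above can be turned into a small parser. The byte and bit positions follow the description in this document; the function name and the returned dictionary are illustrative:

```python
def parse_vsif_3d_metadata(pb4, pb5, pb7, pb8, pb8_1):
    """Extract the fields laid out above from HDMI Vendor Specific
    InfoFrame payload bytes. Positions follow the text; names are
    illustrative."""
    return {
        "HDMI_Video_Format": (pb4 >> 5) & 0x07,   # PB4 bits 7..5
        "3D_Meta_present":   (pb5 >> 3) & 0x01,   # PB5 bit 3
        "3D_Metadata_type":  (pb7 >> 5) & 0x07,   # PB7 bits 7..5 ("100" = G_data here)
        "3D_Metadata_length": pb7 & 0x1F,         # PB7 bits 4..0
        "switching_bit":     (pb8 >> 7) & 0x01,   # PB8 bit 7
        "no_sequence_2D":    (pb8 >> 6) & 0x01,   # PB8 bit 6
        "timing_type":       (pb8 >> 4) & 0x03,   # PB8 bits 5..4
        # PB8 bits 3..0 form the high nibble of the 12-bit timing value
        "timing_information": ((pb8 & 0x0F) << 8) | pb8_1,
    }
```

For instance, PB7 = 0x82 decodes to 3D_Metadata_type = 4 (binary “100”) with a metadata length of 2, and PB8 = 0xB1 followed by 0x23 yields switching_bit = 1 and timing_information = 0x123.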
- In the embodiments described above, an example was shown in which the container is a transport stream (MPEG-2 TS). However, the present invention can be similarly applied to a system configured to distribute content to receiving terminals using a network such as the Internet. In Internet distribution, content is often delivered in MP4 or other container formats. That is, containers of various formats, such as the transport stream (MPEG-2 TS) adopted in digital broadcasting standards and the MP4 used in Internet distribution, correspond to the container.
- The present invention can be applied to an image transmission/reception system that transmits, on broadcast waves or via a network such as the Internet, a multiplexed stream including, in a time-division manner, a first video stream into which two-dimensional image data is inserted and a second video stream into which three-dimensional image data is inserted.
Abstract
Description
The present invention resides in a transmitting apparatus comprising: a transmission unit that transmits a multiplexed stream including, in a time-division manner, a first video stream into which two-dimensional image data is inserted and a second video stream into which three-dimensional image data is inserted; and an information insertion unit that inserts, into the multiplexed stream, advance information giving prior notice of switching between the first video stream and the second video stream.
The present invention also resides in a receiving apparatus comprising: a receiving unit that receives a multiplexed stream that includes, in a time-division manner, a first video stream into which two-dimensional image data is inserted and a second video stream into which three-dimensional image data is inserted, and into which advance information giving prior notice of switching between the first video stream and the second video stream is inserted; a display control unit that, based on image data acquired from the video stream included in the multiplexed stream received by the receiving unit, performs on a display unit either two-dimensional image display or three-dimensional image display in which a left-eye image and a right-eye image are displayed alternately; and a shutter control unit that, based on the advance information acquired from the multiplexed stream received by the receiving unit, transmits a shutter mode switching command to shutter glasses before the switching timing between the two-dimensional image display and the three-dimensional image display.
The present invention also resides in a receiving apparatus comprising: a receiving unit that receives a multiplexed stream that includes, in a time-division manner, a first video stream into which two-dimensional image data is inserted and a second video stream into which three-dimensional image data is inserted, and into which advance information giving prior notice of switching between the first video stream and the second video stream is inserted; and a data transmission unit that transmits the image data acquired from the video stream included in the multiplexed stream received by the receiving unit, together with the advance information acquired from the multiplexed stream, to an external device via a transmission path.
The present invention also resides in a receiving apparatus comprising: a data receiving unit that receives, from an external device via a transmission path, image data that includes two-dimensional image data and three-dimensional image data in a time-division manner, together with advance information giving prior notice of switching between the two-dimensional image data and the three-dimensional image data; a display control unit that, based on the image data received by the data receiving unit, performs on a display unit either two-dimensional image display or three-dimensional image display in which a left-eye image and a right-eye image are displayed alternately; and a shutter control unit that, based on the advance information received by the data receiving unit, transmits a shutter mode switching command to shutter glasses before the switching timing between the two-dimensional image display and the three-dimensional image display.
1. First embodiment
2. Second embodiment
3. Modification
[Image Transmission/Reception System]
FIG. 1 shows a configuration example of an image transmission/reception system 10 as the first embodiment. This image transmission/reception system 10 includes a broadcasting station 100 and a receiver 200. The broadcasting station 100 transmits, on broadcast waves, a transport stream TS as a multiplexed stream including, in a time-division manner, a first video stream (first video elementary stream) and a second video stream (second video elementary stream).
[First Configuration Example]
The transmission data generation unit 110A shown in FIG. 4 is an example of the transmission data generation unit 110 that generates the above-described transport stream TS in the broadcasting station 100. This transmission data generation unit 110A shows an example in which the advance information is inserted into a stream that constitutes the transport stream TS and is independent of the video streams. The transmission data generation unit 110A includes a data extraction unit (archive unit) 111, a video encoder 112, and an audio encoder 113. It also includes an advance information generation unit 114, an advance information encoder 115, and a multiplexer 116.
The transmission data generation unit 110B shown in FIG. 11 is an example of the transmission data generation unit 110 that generates the above-described transport stream TS in the broadcasting station 100. This transmission data generation unit 110B shows an example in which the advance information is inserted into a layer at or above the picture layer of the video stream. In FIG. 11, parts corresponding to those in FIG. 4 are denoted by the same reference numerals, and detailed description thereof is omitted as appropriate.
The transmission data generation unit 110C shown in FIG. 16 is an example of the transmission data generation unit 110 that generates the above-described transport stream TS in the broadcasting station 100. This transmission data generation unit 110C shows an example in which the advance information is inserted under the program map table included in the multiplexed stream. In FIG. 16, parts corresponding to those in FIG. 4 are denoted by the same reference numerals, and detailed description thereof is omitted as appropriate.
FIG. 20 shows a configuration example of the receiver 200. The receiver 200 includes a CPU 201, a flash ROM 202, a DRAM 203, an internal bus 204, a remote control receiving unit 205, and a remote control transmitter 206. It also includes an antenna terminal 211, a digital tuner 212, a bit stream processing unit 213, a 3D signal processing unit 214, and a video signal control unit 215. It further includes a panel driving unit 216, a display panel 217, an audio signal processing unit 218, an audio amplification unit 219, a speaker 220, and a shutter control unit 221.
The shutter glasses 300 will be described. FIG. 21 shows a configuration example of the shutter glasses 300. The shutter glasses 300 include a control unit 301, a wireless communication unit 302, a shutter driving unit 303, and a glasses unit 304.
S = T - U ... (1)
T3 = S - R ... (2)
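Equations (1) and (2) above relate the signaled switching interval to the time carried in the command. A sketch with assumed variable meanings, since the surrounding definitions are not part of this excerpt (T: interval from insertion of the advance information to the switch; U: time already elapsed when the command is issued; S: time-to-switch written into the command; R: delay until the shutter glasses receive the command):

```python
# Assumed meanings (not explicit in this excerpt):
#   T : interval from insertion of the advance information to the switch
#   U : time already elapsed when the command is issued
#   S : time-to-switch carried in the shutter mode switching command
#   R : delay until the shutter glasses receive the command
def time_to_switch_in_command(T, U):
    return T - U          # S = T - U   ... (1)

def time_left_at_glasses(S, R):
    return S - R          # T3 = S - R  ... (2)
```

Chaining the two: if the advance information signals a 100 ms interval, the command is issued 30 ms later, and transmission takes 10 ms, the glasses switch their shutter mode 60 ms after receiving the command, exactly at the display switching timing.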
[Image Transmission/Reception System]
FIG. 23 shows a configuration example of an image transmission/reception system 10A as the second embodiment. This image transmission/reception system 10A includes a broadcasting station 100, a set top box (STB) 400, and a receiver 200A. In FIG. 23, parts corresponding to those in FIG. 1 are denoted by the same reference numerals, and detailed description thereof is omitted as appropriate.
FIG. 24 shows a configuration example of the set top box 400. The set top box 400 includes a bit stream processing unit 401, an HDMI terminal 402, an antenna terminal 403, a digital tuner 404, a video signal processing circuit 405, an HDMI transmission unit 406, and an audio signal processing circuit 407. It also includes a CPU 411, a flash ROM 412, a DRAM 413, an internal bus 414, a remote control receiving unit 415, and a remote control transmitter 416.
FIG. 25 shows a configuration example of the receiver 200A. In FIG. 25, parts corresponding to those in FIG. 20 are denoted by the same reference numerals, and detailed description thereof is omitted as appropriate. The receiver 200A includes a CPU 201, a flash ROM 202, a DRAM 203, an internal bus 204, a remote control receiving unit 205, and a remote control transmitter 206. It also includes an antenna terminal 211, a digital tuner 212, a bit stream processing unit 213, a 3D signal processing unit 214, and a video signal control unit 215. It further includes a panel driving unit 216, a display panel 217, an audio signal processing unit 218, an audio amplification unit 219, a speaker 220, and a shutter control unit 221. The receiver 200A also includes an HDMI terminal 231 and an HDMI receiving unit 232.
FIG. 27 shows a configuration example of the HDMI transmission unit (HDMI source) 406 of the set top box 400 and the HDMI receiving unit (HDMI sink) 232 of the receiver 200A in the image transmission/reception system 10A of FIG. 23.
A method of transmitting the advance information (G_data) over the HDMI interface will be described. A method using the HDMI Vendor Specific InfoFrame is described here, but the method is not limited to this. In this method, in the HDMI Vendor Specific InfoFrame packet, for example, 3D_Meta_present is set to 1 and the Vendor Specific InfoFrame extension is specified. In that case, 3D_Metadata_type is defined as an unused value, for example “100”, and the advance information (G_data) is specified.
In the above-described embodiments, an example was shown in which the container is a transport stream (MPEG-2 TS). However, the present invention can be similarly applied to a system configured to distribute content to receiving terminals via a network such as the Internet. In Internet distribution, content is often delivered in MP4 or other container formats. That is, containers of various formats, such as the transport stream (MPEG-2 TS) adopted in digital broadcasting standards and the MP4 used in Internet distribution, correspond to the container.
100 ... Broadcasting station
110, 110A to 110C ... Transmission data generation unit
111 ... Data extraction unit
111a ... Data recording medium
112, 112B ... Video encoder
113 ... Audio encoder
114 ... Advance information generation unit
115 ... Advance information encoder
116, 116C ... Multiplexer
200, 200A ... Receiver
201 ... CPU
202 ... Flash ROM
203 ... DRAM
211 ... Antenna terminal
212 ... Digital tuner
213 ... Bit stream processing unit
214 ... 3D signal processing unit
215 ... Video signal control unit
216 ... Panel driving unit
217 ... Display panel
218 ... Audio signal processing unit
219 ... Audio signal amplification unit
220 ... Speaker
221 ... Shutter control unit
221a ... Wireless communication unit
231 ... HDMI terminal
232 ... HDMI receiving unit
300 ... Shutter glasses
300L ... Left-eye shutter
300R ... Right-eye shutter
301 ... Control unit
302 ... Wireless communication unit
303 ... Shutter driving unit
304 ... Glasses unit
400 ... Set top box
401 ... Bit stream processing unit
402 ... HDMI terminal
403 ... Antenna terminal
404 ... Digital tuner
405 ... Video signal processing circuit
406 ... HDMI transmission unit
407 ... Audio signal processing circuit
411 ... CPU
412 ... Flash ROM
413 ... DRAM
500 ... HDMI cable
Claims (18)
- A transmission device comprising:
a transmission unit that transmits a multiplexed stream containing, on a time-division basis, a first video stream into which two-dimensional image data is inserted and a second video stream into which three-dimensional image data is inserted; and
an information insertion unit that inserts, into the multiplexed stream, advance notification information that announces in advance a switchover between the first video stream and the second video stream.
- The transmission device according to claim 1, wherein timing information indicating the interval from the insertion timing of the advance notification information to the timing of the switchover is added to the advance notification information.
- The transmission device according to claim 2, wherein information indicating whether the switchover is to the first video stream or to the second video stream is added to the advance notification information.
- The transmission device according to claim 1, wherein the information insertion unit inserts the advance notification information into a stream that constitutes the multiplexed stream and is independent of the video streams.
- The transmission device according to claim 1, wherein the information insertion unit inserts the advance notification information into a layer at or above the picture layer of the video stream.
- The transmission device according to claim 5, wherein the information insertion unit inserts the advance notification information into the header portion of a PES constituting the video stream.
- The transmission device according to claim 1, wherein the information insertion unit inserts the advance notification information under a program map table included in the multiplexed stream.
- A transmission method comprising:
a transmission step of transmitting a multiplexed stream containing, on a time-division basis, a first video stream into which two-dimensional image data is inserted and a second video stream into which three-dimensional image data is inserted; and
an information insertion step of inserting, into the multiplexed stream, advance notification information that announces in advance a switchover between the first video stream and the second video stream.
- A reception device comprising:
a reception unit that receives a multiplexed stream which contains, on a time-division basis, a first video stream into which two-dimensional image data is inserted and a second video stream into which three-dimensional image data is inserted, and into which advance notification information announcing in advance a switchover between the first video stream and the second video stream is further inserted;
a display control unit that, on the basis of image data acquired from the video streams contained in the multiplexed stream received by the reception unit, causes a display unit to perform two-dimensional image display, or three-dimensional image display in which a left-eye image and a right-eye image are displayed alternately; and
a shutter control unit that, on the basis of the advance notification information acquired from the multiplexed stream received by the reception unit, transmits a shutter mode switch command to shutter glasses before the switchover timing between the two-dimensional image display and the three-dimensional image display.
- The reception device according to claim 9, wherein the shutter mode switch command includes information indicating the time until the shutter mode is switched.
- The reception device according to claim 10, wherein information indicating whether the switch is to a shutter mode corresponding to the two-dimensional image display or to a shutter mode corresponding to the three-dimensional image display is added to the shutter mode switch command.
- A reception method comprising:
a reception step of receiving a multiplexed stream which contains, on a time-division basis, a first video stream into which two-dimensional image data is inserted and a second video stream into which three-dimensional image data is inserted, and into which advance notification information announcing in advance a switchover between the first video stream and the second video stream is further inserted;
a display control step of causing a display unit, on the basis of image data acquired from the video streams contained in the multiplexed stream received in the reception step, to perform two-dimensional image display, or three-dimensional image display in which a left-eye image and a right-eye image are displayed alternately; and
a shutter control step of transmitting, on the basis of the advance notification information acquired from the multiplexed stream received in the reception step, a shutter mode switch command to shutter glasses before the switchover timing between the two-dimensional image display and the three-dimensional image display.
- A reception device comprising:
a reception unit that receives a multiplexed stream which contains, on a time-division basis, a first video stream into which two-dimensional image data is inserted and a second video stream into which three-dimensional image data is inserted, and into which advance notification information announcing in advance a switchover between the first video stream and the second video stream is further inserted; and
a data transmission unit that transmits the image data acquired from the video streams contained in the multiplexed stream received by the reception unit, together with the advance notification information acquired from the multiplexed stream, to an external device via a transmission path.
- The reception device according to claim 13, wherein the data transmission unit transmits the image data to the external device via the transmission path by differential signals on a plurality of channels, and transmits the advance notification information to the external device by inserting it into a blanking period of the image data.
- A reception device comprising:
a data reception unit that receives, from an external device via a transmission path, image data containing two-dimensional image data and three-dimensional image data on a time-division basis, together with advance notification information announcing in advance a switchover between the two-dimensional image data and the three-dimensional image data;
a display control unit that, on the basis of the image data received by the data reception unit, causes a display unit to perform two-dimensional image display, or three-dimensional image display in which a left-eye image and a right-eye image are displayed alternately; and
a shutter control unit that, on the basis of the advance notification information received by the data reception unit, transmits a shutter mode switch command to shutter glasses before the switchover timing between the two-dimensional image display and the three-dimensional image display.
- The reception device according to claim 15, wherein the shutter mode switch command includes information indicating the time until the shutter mode is switched.
- The reception device according to claim 16, wherein information indicating whether the switch is to a shutter mode corresponding to the two-dimensional image display or to a shutter mode corresponding to the three-dimensional image display is added to the shutter mode switch command.
- A reception method comprising:
a data reception step of receiving, from an external device via a transmission path, image data containing two-dimensional image data and three-dimensional image data on a time-division basis, together with advance notification information announcing in advance a switchover between the two-dimensional image data and the three-dimensional image data;
a display control step of causing a display unit, on the basis of the image data received in the data reception step, to perform two-dimensional image display, or three-dimensional image display in which a left-eye image and a right-eye image are displayed alternately; and
a shutter control step of transmitting, on the basis of the advance notification information received in the data reception step, a shutter mode switch command to shutter glasses before the switchover timing between the two-dimensional image display and the three-dimensional image display.
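The receiver-side behaviour recited in claims 9 through 11 can be sketched as follows. This is a minimal illustration, not the patent's implementation: the class and field names, the frame-based timing model, and the glasses latency value are all assumptions introduced for the example.

```python
# Minimal sketch (illustrative, not from the patent) of the shutter control in
# claims 9-11: on receiving advance notification information that a 2D<->3D
# switchover will occur after a given interval, the receiver schedules a
# shutter-mode switch command to the glasses before the display switches.
from dataclasses import dataclass


@dataclass
class AdvanceNotice:
    frames_until_switch: int  # interval from insertion timing to switchover timing (claim 2)
    target_is_3d: bool        # whether the switchover is to the 3D stream (claim 3)


class ShutterController:
    """Issues the switch command to the glasses ahead of the display switchover."""

    def __init__(self, glasses_latency_frames: int = 2):
        # Frames the glasses are assumed to need to change mode; the command
        # must lead the switchover by at least this much.
        self.glasses_latency = glasses_latency_frames
        self.sent_commands = []

    def on_advance_notice(self, notice: AdvanceNotice, current_frame: int) -> dict:
        # Schedule the command early enough that the glasses have finished
        # switching by the time the display shows the other stream.
        command_frame = current_frame + notice.frames_until_switch - self.glasses_latency
        command = {
            "frame": max(command_frame, current_frame),
            "mode": "alternate_lr" if notice.target_is_3d else "both_open",
            # Claim 10: the command carries the time remaining until the switch.
            "switch_in_frames": notice.frames_until_switch,
        }
        self.sent_commands.append(command)
        return command


ctrl = ShutterController(glasses_latency_frames=2)
cmd = ctrl.on_advance_notice(
    AdvanceNotice(frames_until_switch=30, target_is_3d=True), current_frame=100
)
print(cmd["frame"], cmd["mode"])  # command at frame 128, ahead of the switch at frame 130
```

The same scheduling logic applies in the claim 15 variant, where the advance notification arrives from an external device (for example over an HDMI blanking period, as in claim 14) rather than from the multiplexed stream directly.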
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
BR112012019870A BR112012019870A2 (pt) | 2010-12-15 | 2011-12-02 | Transmitter, transmission and reception methods, and receiver |
US13/574,708 US20120293621A1 (en) | 2010-12-15 | 2011-12-02 | Transmission device, transmission method, reception device, and reception method |
CN2011800226182A CN102884800A (zh) | 2010-12-15 | 2011-12-02 | Transmission device, transmission method, reception device, and reception method |
EP11849318.8A EP2521360A4 (en) | 2010-12-15 | 2011-12-02 | SENDING DEVICE, TRANSMISSION PROCEDURE, RECEPTION DEVICE AND RECEPTION PROCEDURE |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2010279949A JP2012129827A (ja) | 2010-12-15 | 2010-12-15 | Transmission device, transmission method, reception device, and reception method |
JP2010-279949 | 2010-12-15 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2012081427A1 true WO2012081427A1 (ja) | 2012-06-21 |
Family
ID=46244535
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2011/077994 WO2012081427A1 (ja) | 2010-12-15 | 2011-12-02 | Transmission device, transmission method, reception device, and reception method |
Country Status (6)
Country | Link |
---|---|
US (1) | US20120293621A1 (ja) |
EP (1) | EP2521360A4 (ja) |
JP (1) | JP2012129827A (ja) |
CN (1) | CN102884800A (ja) |
BR (1) | BR112012019870A2 (ja) |
WO (1) | WO2012081427A1 (ja) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103888748A (zh) * | 2014-03-24 | 2014-06-25 | National University of Defense Technology | Video frame synchronization method for a multi-view three-dimensional display system |
Families Citing this family (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2012147121A (ja) * | 2011-01-07 | 2012-08-02 | Sony Corp | Image display system, display device, and shutter glasses |
JP2012178783A (ja) * | 2011-02-28 | 2012-09-13 | Sony Corp | Image display system, display device, and shutter glasses |
JP5641090B2 (ja) | 2013-03-14 | 2014-12-17 | Sony Corp | Transmission device, transmission method, reception device, and reception method |
GB2513111A (en) | 2013-04-08 | 2014-10-22 | Sony Corp | Data encoding and decoding |
KR20150031994A (ko) * | 2013-09-17 | 2015-03-25 | Samsung Electronics Co Ltd | Display device and control method of display device |
CN108141641B (zh) | 2015-10-22 | 2020-09-22 | Mitsubishi Electric Corp | Video distribution device, video distribution system, and video distribution method |
JP6610273B2 (ja) * | 2016-01-08 | 2019-11-27 | Sony Corp | Transmission device, transmission method, reception device, and reception method |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPS62231578A (ja) * | 1986-03-31 | 1987-10-12 | Sanyo Electric Co Ltd | Stereoscopic television system |
JPH09138384A (ja) | 1995-11-15 | 1997-05-27 | Sanyo Electric Co Ltd | Method of controlling glasses for stereoscopic image viewing |
JP2000036969A (ja) | 1998-07-21 | 2000-02-02 | Nippon Hoso Kyokai <Nhk> | Stereoscopic image display method and device |
JP2003045343A (ja) | 2001-08-03 | 2003-02-14 | Nippon Hoso Kyokai <Nhk> | Stereoscopic image display device |
JP2005006114A (ja) * | 2003-06-12 | 2005-01-06 | Sharp Corp | Broadcast data transmission device, broadcast data transmission method, and broadcast data reception device |
Family Cites Families (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3789794B2 (ja) * | 2001-09-26 | 2006-06-28 | Sanyo Electric Co Ltd | Stereoscopic image processing method, device, and system |
KR101488199B1 (ko) * | 2008-03-12 | 2015-01-30 | Samsung Electronics Co Ltd | Image processing method, image reproduction method, and device and recording medium therefor |
JP5469911B2 (ja) * | 2009-04-22 | 2014-04-16 | Sony Corp | Transmission device and method of transmitting stereoscopic image data |
JP2011066871A (ja) * | 2009-08-21 | 2011-03-31 | Sony Corp | Content transmission method and display device |
KR101789636B1 (ко) * | 2010-04-30 | 2017-10-25 | LG Electronics Inc | Image processing method and device |
WO2011151958A1 (ja) * | 2010-06-02 | 2011-12-08 | Hitachi Consumer Electronics Co Ltd | Reception device and output method |
- 2010
- 2010-12-15 JP JP2010279949A patent/JP2012129827A/ja active Pending
- 2011
- 2011-12-02 CN CN2011800226182A patent/CN102884800A/zh active Pending
- 2011-12-02 BR BR112012019870A patent/BR112012019870A2/pt not_active Application Discontinuation
- 2011-12-02 US US13/574,708 patent/US20120293621A1/en not_active Abandoned
- 2011-12-02 WO PCT/JP2011/077994 patent/WO2012081427A1/ja active Application Filing
- 2011-12-02 EP EP11849318.8A patent/EP2521360A4/en not_active Withdrawn
Non-Patent Citations (1)
Title |
---|
See also references of EP2521360A4 * |
Also Published As
Publication number | Publication date |
---|---|
JP2012129827A (ja) | 2012-07-05 |
EP2521360A1 (en) | 2012-11-07 |
EP2521360A4 (en) | 2014-03-12 |
BR112012019870A2 (pt) | 2016-04-26 |
CN102884800A (zh) | 2013-01-16 |
US20120293621A1 (en) | 2012-11-22 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5446913B2 (ja) | Stereoscopic image data transmission device and stereoscopic image data transmission method | |
JP5531972B2 (ja) | Stereoscopic image data transmission device, stereoscopic image data transmission method, stereoscopic image data reception device, and stereoscopic image data reception method | |
WO2012081427A1 (ja) | Transmission device, transmission method, reception device, and reception method | |
TWI437873B (zh) | Three-dimensional image data transmission device, three-dimensional image data transmission method, three-dimensional image data receiving device and three-dimensional image data receiving method | |
WO2011001858A1 (ja) | Stereoscopic image data transmission device and stereoscopic image data reception device | |
JP5633259B2 (ja) | Stereoscopic image data transmission device, stereoscopic image data transmission method, and stereoscopic image data reception device | |
US20110141233A1 (en) | Three-dimensional image data transmission device, three-dimensional image data transmission method, three-dimensional image data reception device, and three-dimensional image data reception method | |
KR20120036789A (ко) | Stereoscopic image data transmission device and stereoscopic image data reception device | |
US20140063187A1 (en) | Reception device, reception method, and electronic device | |
KR20120111909A (ко) | Stereoscopic image data transmission device, stereoscopic image data transmission method, stereoscopic image data reception device, and stereoscopic image data reception method | |
JP2011166757A (ja) | Transmission device, transmission method, and reception device | |
WO2012060198A1 (ja) | Stereoscopic image data transmission device, stereoscopic image data transmission method, stereoscopic image data reception device, and stereoscopic image data reception method | |
WO2014050447A1 (ja) | Transmission device, transmission method, reception device, and reception method | |
JP6809450B2 (ja) | Transmission device, transmission method, reception device, and reception method | |
WO2012063675A1 (ja) | Stereoscopic image data transmission device, stereoscopic image data transmission method, and stereoscopic image data reception device | |
JP2011010255A (ja) | Stereoscopic image data transmission method, stereoscopic image data reception device, and stereoscopic image data reception method | |
JP2013176141A (ja) | Stereoscopic image data reception device and stereoscopic image data reception method | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 201180022618.2 Country of ref document: CN |
|
WWE | Wipo information: entry into national phase |
Ref document number: 13574708 Country of ref document: US |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2011849318 Country of ref document: EP |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 11849318 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
REG | Reference to national code |
Ref country code: BR Ref legal event code: B01A Ref document number: 112012019870 Country of ref document: BR |
|
ENP | Entry into the national phase |
Ref document number: 112012019870 Country of ref document: BR Kind code of ref document: A2 Effective date: 20120808 |