WO2010008012A1 - Transmission device, stereoscopic image data transmission method, reception device, and stereoscopic image data reception method - Google Patents
Transmission device, stereoscopic image data transmission method, reception device, and stereoscopic image data reception method
- Publication number
- WO2010008012A1 (application PCT/JP2009/062788, JP2009062788W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- data
- transmission
- image data
- transmission method
- unit
- Prior art date
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/10—Processing, recording or transmission of stereoscopic or multi-view image signals
- H04N13/194—Transmission of image signals
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G3/00—Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
- G09G3/001—Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes using specific devices not provided for in groups G09G3/02 - G09G3/36, e.g. using an intermediate record carrier such as a film slide; Projection systems; Display of non-alphanumerical information, solely or in combination with alphanumerical information, e.g. digital display on projected diapositive as background
- G09G3/003—Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes using specific devices not provided for in groups G09G3/02 - G09G3/36, e.g. using an intermediate record carrier such as a film slide; Projection systems; Display of non-alphanumerical information, solely or in combination with alphanumerical information, e.g. digital display on projected diapositive as background to produce spatial visual effects
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G5/00—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
- G09G5/003—Details of a display terminal, the details relating to the control arrangement of the display terminal and to the interfaces thereto
- G09G5/005—Adapting incoming signals to the display format of the display terminal
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G5/00—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
- G09G5/003—Details of a display terminal, the details relating to the control arrangement of the display terminal and to the interfaces thereto
- G09G5/006—Details of the interface to the display terminal
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/10—Processing, recording or transmission of stereoscopic or multi-view image signals
- H04N13/106—Processing image signals
- H04N13/139—Format conversion, e.g. of frame-rate or size
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/10—Processing, recording or transmission of stereoscopic or multi-view image signals
- H04N13/106—Processing image signals
- H04N13/15—Processing image signals for colour aspects of image signals
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/10—Processing, recording or transmission of stereoscopic or multi-view image signals
- H04N13/106—Processing image signals
- H04N13/161—Encoding, multiplexing or demultiplexing different image signal components
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/10—Processing, recording or transmission of stereoscopic or multi-view image signals
- H04N13/106—Processing image signals
- H04N13/167—Synchronising or controlling image signals
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/10—Processing, recording or transmission of stereoscopic or multi-view image signals
- H04N13/106—Processing image signals
- H04N13/172—Processing image signals image signals comprising non-image signal components, e.g. headers or format information
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/30—Image reproducers
- H04N13/324—Colour aspects
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/41—Structure of client; Structure of client peripherals
- H04N21/4104—Peripherals receiving signals from specially adapted client devices
- H04N21/4122—Peripherals receiving signals from specially adapted client devices additional display device, e.g. video projector
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/41—Structure of client; Structure of client peripherals
- H04N21/414—Specialised client platforms, e.g. receiver in car or embedded in a mobile appliance
- H04N21/4147—PVR [Personal Video Recorder]
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/436—Interfacing a local distribution network, e.g. communicating with another STB or one or more peripheral devices inside the home
- H04N21/4363—Adapting the video or multiplex stream to a specific local network, e.g. a IEEE 1394 or Bluetooth® network
- H04N21/43632—Adapting the video or multiplex stream to a specific local network, e.g. a IEEE 1394 or Bluetooth® network involving a wired protocol, e.g. IEEE 1394
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/436—Interfacing a local distribution network, e.g. communicating with another STB or one or more peripheral devices inside the home
- H04N21/4363—Adapting the video or multiplex stream to a specific local network, e.g. a IEEE 1394 or Bluetooth® network
- H04N21/43632—Adapting the video or multiplex stream to a specific local network, e.g. a IEEE 1394 or Bluetooth® network involving a wired protocol, e.g. IEEE 1394
- H04N21/43635—HDMI
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/44—Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream, rendering scenes according to MPEG-4 scene graphs
- H04N21/4402—Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream, rendering scenes according to MPEG-4 scene graphs involving reformatting operations of video signals for household redistribution, storage or real-time display
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/04—Synchronising
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/38—Transmitter circuitry for the transmission of television signals according to analogue transmission standards
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/025—Systems for the transmission of digital non-picture data, e.g. of text during the active part of a television frame
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/16—Analogue secrecy systems; Analogue subscription systems
- H04N7/173—Analogue secrecy systems; Analogue subscription systems with two-way working, e.g. subscriber sending a programme selection signal
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/64—Circuits for processing colour signals
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2340/00—Aspects of display data processing
- G09G2340/02—Handling of images in compressed format, e.g. JPEG, MPEG
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2370/00—Aspects of data communication
- G09G2370/04—Exchange of auxiliary data, i.e. other than image data, between monitor and graphics controller
- G09G2370/045—Exchange of auxiliary data, i.e. other than image data, between monitor and graphics controller using multiple communication channels, e.g. parallel and serial
- G09G2370/047—Exchange of auxiliary data, i.e. other than image data, between monitor and graphics controller using multiple communication channels, e.g. parallel and serial using display data channel standard [DDC] communication
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2370/00—Aspects of data communication
- G09G2370/10—Use of a protocol of communication by packets in interfaces along the display data pipeline
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2370/00—Aspects of data communication
- G09G2370/12—Use of DVI or HDMI protocol in interfaces along the display data pipeline
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2370/00—Aspects of data communication
- G09G2370/16—Use of wireless transmission of display information
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2370/00—Aspects of data communication
- G09G2370/22—Detection of presence or absence of input display information or of connection or disconnection of a corresponding information source
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G5/00—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
- G09G5/12—Synchronisation between the display unit and other units, e.g. other display units, video-disc players
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N2213/00—Details of stereoscopic systems
- H04N2213/003—Aspects relating to the "2D+depth" image format
Definitions
- The present invention relates to a transmission device, a stereoscopic image data transmission method, a reception device, and a stereoscopic image data reception method. More specifically, when transmitting stereoscopic image data to an external device, the transmission device receives from the external device information on the stereoscopic image data transmission methods that the external device can support, and determines the transmission method of the stereoscopic image data to be transmitted accordingly.
- The present invention thus relates to a transmission apparatus and the like that can determine the transmission method of the stereoscopic image data to be transmitted and notify the external device of it, so that stereoscopic image data can be transmitted between devices satisfactorily.
- Non-Patent Document 1 describes details of the HDMI standard.
- FIG. 42 shows a configuration example of the AV (Audio Visual) system 10.
- the AV system 10 includes a disc player 11 as a source device and a television receiver 12 as a sink device.
- the disc player 11 and the television receiver 12 are connected via an HDMI cable 13.
- The disc player 11 is provided with an HDMI terminal 11a to which an HDMI transmission unit (HDMI TX) 11b is connected.
- the television receiver 12 is provided with an HDMI terminal 12a to which an HDMI receiving unit (HDMI RX) 12b is connected.
- One end of the HDMI cable 13 is connected to the HDMI terminal 11a of the disc player 11, and the other end of the HDMI cable 13 is connected to the HDMI terminal 12a of the television receiver 12.
- In the AV system 10, uncompressed image data reproduced by the disc player 11 is transmitted to the television receiver 12 via the HDMI cable 13, and the television receiver 12 displays an image based on the image data transmitted from the disc player 11. Likewise, uncompressed audio data reproduced by the disc player 11 is transmitted to the television receiver 12 via the HDMI cable 13, and the television receiver 12 outputs audio based on the audio data transmitted from the disc player 11.
- FIG. 43 shows a configuration example of the HDMI transmitting unit (HDMI source) 11b of the disc player 11 and the HDMI receiving unit (HDMI sink) 12b of the television receiver 12 in the AV system 10 of FIG.
- In the effective image section (hereinafter also referred to as the active video section as appropriate), which is the section from one vertical synchronization signal to the next with the horizontal and vertical blanking sections removed, the HDMI transmission unit 11b transmits the differential signals corresponding to the pixel data of one screen of the uncompressed image to the HDMI receiving unit 12b in one direction over a plurality of channels. In the horizontal or vertical blanking interval, it transmits at least the differential signals corresponding to the audio data, control data, and other auxiliary data associated with the image to the HDMI receiving unit 12b in one direction over a plurality of channels.
- The HDMI transmission unit 11b includes the HDMI transmitter 81.
- The transmitter 81 converts, for example, the pixel data of an uncompressed image into corresponding differential signals and serially transmits them in one direction to the HDMI receiving unit 12b connected via the HDMI cable 13, over the three TMDS (Transition Minimized Differential Signaling) channels #0, #1, and #2.
- The transmitter 81 also converts the audio data accompanying the uncompressed image, as well as necessary control data and other auxiliary data, into corresponding differential signals and serially transmits them in one direction to the HDMI receiving unit 12b connected via the HDMI cable 13, over the three TMDS channels #0, #1, and #2.
- Furthermore, the transmitter 81 transmits the pixel clock, synchronized with the pixel data sent over the three TMDS channels #0, #1, and #2, to the HDMI receiving unit 12b connected via the HDMI cable 13 over the TMDS clock channel.
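The relation between the pixel clock and the per-channel serial rate on the three TMDS channels can be sketched as follows. This is an illustrative back-of-envelope calculation, not part of the patent text; TMDS encodes each 8-bit symbol into 10 bits, so each data channel runs at ten times the pixel clock.

```python
# Illustrative sketch (not from the patent): TMDS link arithmetic for the
# three data channels plus clock channel described above.

TMDS_BITS_PER_PIXEL_CLOCK = 10  # each 8-bit symbol is transition-minimized to 10 bits

def tmds_channel_bit_rate(pixel_clock_hz: float) -> float:
    """Serial bit rate of one TMDS data channel for a given pixel clock."""
    return pixel_clock_hz * TMDS_BITS_PER_PIXEL_CLOCK

def link_payload_bit_rate(pixel_clock_hz: float, channels: int = 3) -> float:
    """Aggregate 8-bit payload rate across the TMDS data channels."""
    return pixel_clock_hz * 8 * channels

# Example: 1080p60 uses a 148.5 MHz pixel clock.
pixel_clock = 148.5e6
print(tmds_channel_bit_rate(pixel_clock))  # 1.485 Gbit/s per data channel
print(link_payload_bit_rate(pixel_clock))  # 3.564 Gbit/s of pixel payload
```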
- The HDMI receiving unit 12b receives the differential signals corresponding to the pixel data transmitted in one direction from the HDMI transmitting unit 11b over a plurality of channels in the active video section, and, in the horizontal or vertical blanking interval, receives the differential signals corresponding to the audio data and control data transmitted in one direction from the HDMI transmission unit 11b over a plurality of channels.
- the HDMI receiving unit 12b includes an HDMI receiver 82.
- That is, the receiver 82 receives, over TMDS channels #0, #1, and #2, the differential signals corresponding to the pixel data, and likewise the differential signals corresponding to the audio data and control data, transmitted in one direction from the HDMI transmission unit 11b connected via the HDMI cable 13, in synchronization with the pixel clock transmitted from the HDMI transmission unit 11b over the TMDS clock channel.
- pixel data and audio data are transmitted in one direction from the HDMI transmission unit 11b to the HDMI reception unit 12b in synchronization with the pixel clock.
- TMDS channels #0 to #2: transmission channels for serial data transmission
- TMDS clock channel: transmission channel for transmitting the pixel clock
- DDC: Display Data Channel
- CEC: Consumer Electronics Control
- The DDC 83 is composed of two signal lines (not shown) included in the HDMI cable 13 and is used by the HDMI transmission unit 11b to read the E-EDID (Enhanced Extended Display Identification Data) from the HDMI reception unit 12b connected via the HDMI cable 13.
- The HDMI receiving unit 12b has an EDID ROM (Read Only Memory) 85 that stores the E-EDID, performance information describing its own configuration and capabilities.
- The HDMI transmission unit 11b reads the E-EDID of the HDMI reception unit 12b over the DDC 83 from the HDMI reception unit 12b connected via the HDMI cable 13, and based on this E-EDID recognizes, for example, the image formats (profiles) supported by the electronic device having the HDMI receiving unit 12b, such as RGB, YCbCr 4:4:4, and YCbCr 4:2:2.
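As a minimal sketch of the capability exchange above, the following shows how a source might validate a base EDID block after reading it over the DDC. This is illustrative and assumes only the standard EDID layout (fixed 8-byte header, 128-byte block, checksum byte), not anything specific to the patent.

```python
# Illustrative sketch (not part of the patent text): minimal validation of an
# E-EDID base block. The 128-byte block starts with a fixed 8-byte header and
# ends with a checksum byte that makes the whole block sum to 0 modulo 256.

EDID_HEADER = bytes([0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00])

def edid_block_is_valid(block: bytes) -> bool:
    return (
        len(block) == 128
        and block[:8] == EDID_HEADER
        and sum(block) % 256 == 0
    )

# Build a dummy block whose final byte balances the checksum.
dummy = bytearray(128)
dummy[:8] = EDID_HEADER
dummy[127] = (256 - sum(dummy) % 256) % 256
print(edid_block_is_valid(bytes(dummy)))  # True
```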
- the CEC line 84 is composed of a single signal line (not shown) included in the HDMI cable 13 and is used for bidirectional communication of control data between the HDMI transmission unit 11b and the HDMI reception unit 12b.
- the HDMI cable 13 includes a line (HPD line) 86 connected to a pin called HPD (Hot Plug Detect).
- the source device can detect the connection of the sink device using the line 86.
- the HDMI cable 13 includes a line (power supply line) 87 used for supplying power from the source device to the sink device.
- the HDMI cable 13 includes a reserved line 88.
- FIG. 44 shows an example of TMDS transmission data.
- FIG. 44 shows the sections of various transmission data when image data of 1920 pixels × 1080 lines (horizontal × vertical) is transmitted over TMDS channels #0, #1, and #2.
- In the video field in which transmission data is transmitted over the three TMDS channels #0, #1, and #2 of HDMI, there are three types of sections according to the kind of transmission data: the video data period, the data island period, and the control period.
- The video field section is the section from the rising edge (active edge) of one vertical synchronizing signal to the rising edge of the next, and is divided into a horizontal blanking period, a vertical blanking period, and the active video section, which is the section excluding the horizontal and vertical blanking periods.
- The video data section is assigned to the active video section. In this video data section, data of 1920 pixels × 1080 lines of effective (active) pixels constituting one screen of uncompressed image data is transmitted.
- The data island section and the control section are assigned to the horizontal and vertical blanking periods. In the data island section and the control section, auxiliary data is transmitted. That is, the data island section is assigned to part of the horizontal and vertical blanking periods.
- In this data island section, the auxiliary data not related to control, such as audio data packets, is transmitted.
- The control section is assigned to the other portions of the horizontal and vertical blanking periods. In this control section, the auxiliary data related to control, such as the vertical synchronization signal, the horizontal synchronization signal, and control packets, is transmitted.
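The split between the active video section and the blanking periods that carry data islands and control periods can be illustrated numerically. The total frame size used below (2200 × 1125 for 1080p) is standard CEA-861 timing assumed for illustration, not a figure from the patent itself.

```python
# Illustrative sketch: how the active video section relates to the blanking
# periods for the 1920x1080 format mentioned above.

H_ACTIVE, V_ACTIVE = 1920, 1080
H_TOTAL, V_TOTAL = 2200, 1125  # active size plus horizontal/vertical blanking

active_pixels = H_ACTIVE * V_ACTIVE
total_pixels = H_TOTAL * V_TOTAL
blanking_pixels = total_pixels - active_pixels  # carries data islands and control periods

print(active_pixels)    # 2073600 pixel clocks in the video data section
print(blanking_pixels)  # 401400 pixel clocks available for auxiliary data
```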
- FIG. 45 shows an example of the packing format when image data (24 bits) is transmitted over the three TMDS channels #0, #1, and #2 of HDMI.
- Three image data transmission systems are shown: RGB 4:4:4, YCbCr 4:4:4, and YCbCr 4:2:2.
- In the RGB 4:4:4 system, 8-bit blue (B) data, 8-bit green (G) data, and 8-bit red (R) data are arranged in the data areas of each pixel in TMDS channels #0, #1, and #2, respectively.
- In the YCbCr 4:4:4 system, 8-bit blue-difference (Cb) data, 8-bit luminance (Y) data, and 8-bit red-difference (Cr) data are arranged in the data areas of each pixel in TMDS channels #0, #1, and #2, respectively.
- In the YCbCr 4:2:2 system, bits 0 to 3 of the luminance (Y) data are arranged in the data area of each pixel of TMDS channel #0, and bits 0 to 3 of the blue-difference (Cb) data and bits 0 to 3 of the red-difference (Cr) data are arranged alternately, pixel by pixel. Bits 4 to 11 of the luminance (Y) data are arranged in the data area of each pixel of TMDS channel #1. Bits 4 to 11 of the blue-difference (Cb) data and bits 4 to 11 of the red-difference (Cr) data are arranged alternately, pixel by pixel, in the data area of each pixel of TMDS channel #2.
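The YCbCr 4:2:2 arrangement above can be sketched as a packing function. This mirrors the bit assignments as described in the text (up to 12-bit components, Cb/Cr alternating pixel by pixel); it is an illustration, not an authoritative HDMI implementation, and the exact nibble ordering on channel #0 is an assumption.

```python
# Illustrative sketch of the YCbCr 4:2:2 packing described above.

def pack_ycbcr422_pixel(y: int, c: int) -> tuple[int, int, int]:
    """Pack one pixel (12-bit Y, 12-bit Cb or Cr) onto the three TMDS channels.

    ch0: Y bits 0-3 in the low nibble, chroma bits 0-3 in the high nibble (assumed order)
    ch1: Y bits 4-11
    ch2: chroma bits 4-11
    """
    ch0 = (y & 0x00F) | ((c & 0x00F) << 4)
    ch1 = (y >> 4) & 0xFF
    ch2 = (c >> 4) & 0xFF
    return ch0, ch1, ch2

# Even pixels carry Cb, odd pixels carry Cr (the alternation in the text).
pixels = [(0xABC, 0x123), (0xDEF, 0x456)]  # (Y, Cb) then (Y, Cr)
for y, c in pixels:
    print(pack_ycbcr422_pixel(y, c))
```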
- FIG. 46 shows an example of the packing format when deep color image data (48 bits) is transmitted over the three TMDS channels #0, #1, and #2 of HDMI.
- Two image data transmission methods are shown: RGB 4:4:4 and YCbCr 4:4:4.
- In this case, the TMDS clock is 2 × the pixel clock.
- In the RGB 4:4:4 system, bits 0 to 7 and bits 8 to 15 of the 16-bit blue (B) data are arranged in the first and second halves of each pixel's data area in TMDS channel #0. Likewise, bits 0 to 7 and bits 8 to 15 of the 16-bit green (G) data are arranged in the first and second halves of each pixel in TMDS channel #1, and bits 0 to 7 and bits 8 to 15 of the 16-bit red (R) data are arranged in the first and second halves of each pixel in TMDS channel #2.
- In the YCbCr 4:4:4 system, bits 0 to 7 and bits 8 to 15 of the 16-bit blue-difference (Cb) data are arranged in the first and second halves of each pixel in TMDS channel #0. Bits 0 to 7 and bits 8 to 15 of the 16-bit luminance (Y) data are arranged in the first and second halves of each pixel in TMDS channel #1. Bits 0 to 7 and bits 8 to 15 of the 16-bit red-difference (Cr) data are arranged in the first and second halves of each pixel in TMDS channel #2.
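The deep color packing above can be sketched as a simple byte split. This is illustrative, not from the patent: each 16-bit component is split into a low byte carried in the first half of the pixel's data area and a high byte carried in the second half, which is why the TMDS clock runs at twice the pixel clock.

```python
# Illustrative sketch of the 48-bit deep color packing described above.

def split_component(value16: int) -> tuple[int, int]:
    """Return (first-half byte = bits 0-7, second-half byte = bits 8-15)."""
    return value16 & 0xFF, (value16 >> 8) & 0xFF

# A 16-bit blue sample 0x1234 occupies two TMDS clock periods on channel #0.
print(split_component(0x1234))  # (0x34, 0x12), printed as (52, 18)
```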
- Patent Document 1 proposes a transmission method for stereoscopic image data and its determination, but does not propose transmission using a digital interface such as HDMI.
- Patent Document 2 proposes a method of transmitting stereoscopic image data using television broadcast radio waves, but does not propose transmission using a digital interface.
- JP 2003-111101 A; Japanese Patent Laid-Open No. 2005-6114
- An object of the present invention is to enable good transmission of stereoscopic image data between devices.
- A concept of this invention is a transmission apparatus comprising: a data transmission unit for transmitting stereoscopic image data for displaying a stereoscopic image to an external device via a transmission path; a transmission method information receiving unit that receives, via the transmission path, information on the stereoscopic image data transmission methods that the external device can support; a transmission method selection unit that, based on the transmission method information received by the transmission method information receiving unit, selects a transmission method that the external device can support as the transmission method of the stereoscopic image data transmitted by the data transmission unit; and a transmission method information transmission unit that transmits information on the transmission method of the stereoscopic image data transmitted by the data transmission unit to the external device via the transmission path.
- Another concept of the present invention is a reception apparatus comprising: a data receiving unit for receiving stereoscopic image data for displaying a stereoscopic image from an external device via a transmission path; a transmission method information receiving unit for receiving, from the external device, information on the transmission method of the stereoscopic image data received by the data receiving unit; a data processing unit that processes the stereoscopic image data received by the data receiving unit, based on the transmission method information received by the transmission method information receiving unit, to generate left-eye image data and right-eye image data; a transmission method information storage unit for storing information on the stereoscopic image data transmission methods that the reception apparatus itself can support; and a transmission method information transmission unit for transmitting the transmission method information stored in the transmission method information storage unit to the external device via the transmission path.
- in this invention, the transmission device receives, from the external device (reception device) via the transmission path, information on the stereoscopic image data transmission methods that the external device supports.
- the receiving device stores information on the stereoscopic image data transmission methods it supports, and transmits this transmission method information to the external device (transmitting device) via the transmission path.
- the transmission device selects a predetermined transmission method from among the stereoscopic image data transmission methods supported by the external device, based on the transmission method information received from the external device (reception device). In this case, for example, when the external device supports a plurality of stereoscopic image data transmission methods, the transmission device selects the transmission method with the least image quality degradation.
- for example, the transmission device receives transmission rate information for the transmission path from the external device (reception device).
- in this case, the receiving apparatus determines the transmission rate of the transmission path from the data reception state, such as the error rate, and transmits this transmission rate information to the external device (transmitting apparatus) via the transmission path.
- when the transmission apparatus receives transmission rate information for the transmission path from the external device, it selects a predetermined transmission method based on both that transmission rate information and the information on the stereoscopic image data transmission methods supported by the external apparatus.
- that is, the transmission apparatus selects, as the predetermined transmission method, a stereoscopic image data transmission method that the external device supports and whose required transmission rate is within the transmission rate of the transmission path.
- thereby, the transmission device can always transmit stereoscopic image data satisfactorily to the reception device regardless of changes in the state of the transmission path.
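The selection logic described above (prefer the least-degrading method the sink supports, subject to the measured link rate) can be sketched as follows. The method names, quality ordering, and required bit rates are illustrative assumptions for this sketch, not values fixed by the HDMI specification.

```python
# Candidate 3D transmission methods, ordered best-quality-first, with an
# assumed required transmission rate in Gbps (hypothetical 1080p figures).
METHODS = [
    ("full-resolution L/R (method 1)", 8.9),
    ("line-alternative (method 2)",    4.5),
    ("field-sequential (method 3)",    4.5),
    ("side-by-side half (method 6)",   2.2),
]

def select_method(sink_supported, link_rate_gbps):
    """Pick the least-degrading method the sink supports whose required
    rate fits within the measured transmission rate of the path."""
    for name, required_rate in METHODS:
        if name in sink_supported and required_rate <= link_rate_gbps:
            return name
    return None  # no common method fits the link

chosen = select_method(
    sink_supported={"line-alternative (method 2)", "side-by-side half (method 6)"},
    link_rate_gbps=3.0,
)
print(chosen)  # side-by-side half (method 6): method 2 needs more than 3.0 Gbps
```

On a degraded link the source falls back to a lower-rate method rather than failing, which matches the behavior described for changing transmission path states.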
- the transmitting device transmits stereoscopic image data of the selected transmission method to an external device (receiving device) via the transmission path.
- the transmission device transmits stereoscopic image data to an external device via a transmission path using a differential signal with a plurality of channels.
- for example, the transmission device places the pixel data constituting the two-dimensional image data, together with the depth data corresponding to that pixel data, in the data area of each pixel, and transmits them.
- for example, the stereoscopic image data includes first data and second data;
- the transmission device transmits the first data to the external device via a first transmission path,
- and transmits the second data to the external device via a second transmission path.
- for example, the second transmission path is a bidirectional communication path configured using predetermined lines of the first transmission path; the transmission device transmits the first data to the external device via the first transmission path
- over a plurality of channels using differential signals, and transmits the second data to the external device via the bidirectional communication path.
- the first data is left eye image data or right eye image data
- the second data is right eye image data or left eye image data.
- the first data is two-dimensional image data
- the second data is depth data corresponding to each pixel.
- the transmission device transmits the transmission method information of the stereoscopic image data to be transmitted to the external device (reception device) via the transmission path.
- the transmission device transmits the transmission method information to the external device by inserting the information in the blanking period of the stereoscopic image data.
- for example, the transmission device transmits information on the transmission method to the external device via a control data line constituting the transmission path.
- the transmission device transmits information on the transmission method to an external device via a bidirectional communication path configured using a predetermined line of the transmission path.
- for example, the bidirectional communication path is a pair of differential transmission paths, at least one of which has a function of notifying the connection state of the external device by a DC bias potential (for example, the HPD line of an HDMI cable).
- the receiving device receives stereoscopic image data sent from an external device (transmitting device).
- the receiving device receives transmission method information of stereoscopic image data sent from an external device. Then, the receiving device processes the received stereoscopic image data based on the transmission method information, and generates left eye image data and right eye image data.
- in this invention, when transmitting stereoscopic image data from the transmission device to the reception device, the transmission device determines the transmission method of the stereoscopic image data to be transmitted based on information, received from the reception device, on the stereoscopic image data transmission methods the reception device supports. At this time, the transmission device transmits the transmission method information of the stereoscopic image data to be transmitted to the reception device. Therefore, transmission of stereoscopic image data between the transmission device and the reception device (between devices) can be performed satisfactorily.
- in other words, when the transmission device transmits stereoscopic image data to the reception device (external device), it determines the transmission method of the stereoscopic image data to be transmitted by receiving, from the external device, information on the stereoscopic image data transmission methods the external device supports; the transmission device also transmits the transmission method information of the stereoscopic image data to be transmitted to the external device, so that good transmission of stereoscopic image data between the devices can be performed.
- FIG. 1 is a block diagram showing a configuration example of an AV system as an embodiment of the present invention. Further figures show: the "field sequential method" and the "phase difference plate method", which are examples of stereoscopic image display methods; a configuration example of the disc player (source device) constituting the AV system; a configuration example of the television receiver (sink device) constituting the AV system; a configuration example of the HDMI transmission unit (HDMI source) and the HDMI receiving unit (HDMI sink); and a configuration example of the HDMI transmitter constituting the HDMI transmission unit and the HDMI receiver constituting the HDMI receiving unit.
- a connection diagram illustrates a configuration example of the high-speed data line interface, which is the interface of a bidirectional communication path configured using the reserved line and HPD line of the HDMI cable, in the source device and the sink device; another figure shows left eye (L) and right eye (R) image data (image data in the 1920 x 1080p pixel format).
- figures for explaining 3D (stereoscopic) image data transmission methods show: (a) a method of sequentially switching and transmitting pixel data of the left eye image data and pixel data of the right eye image data for each TMDS clock; (b) a method of alternately transmitting one line of left eye image data and one line of right eye image data; and (c) a method of sequentially switching and transmitting left eye image data and right eye image data for each field.
- further figures for explaining 3D (stereoscopic) image data transmission methods show: (a) a method of alternately transmitting one line of left eye image data and one line of right eye image data; (b) a method of transmitting the data of each line of the left eye image data in the first half of the vertical direction and the data of each line of the right eye image data in the second half of the vertical direction; and (c) a method of transmitting the pixel data of the left eye image data in the first half of the horizontal direction and the pixel data of the right eye image data in the second half of the horizontal direction.
- other figures show: a TMDS transmission data example in the method of sequentially switching and transmitting pixel data of the left eye image data and pixel data of the right eye image data for each TMDS clock (method (1)); a packing format example when 3D image data of method (1) is transmitted over the three HDMI TMDS channels #0, #1, and #2; a TMDS transmission data example in the method of alternately transmitting one line of left eye image data and one line of right eye image data (method (2)); and a packing format example when 3D image data of method (2) is transmitted over the three HDMI TMDS channels #0, #1, and #2.
- a figure shows a packing format example when 3D image data of method (4) is transmitted over the three HDMI TMDS channels #0, #1, and #2.
- a diagram explains a packing format example when MPEG-C 3D image data is transmitted over the three HDMI TMDS channels #0, #1, and #2.
- a diagram explains the decoding processing in the sink device (television receiver) that receives MPEG-C 3D image data, and a figure shows a data structure example of the E-EDID stored by the television receiver.
- FIG. 1 shows a configuration example of an AV (Audio Visual) system 200 as an embodiment.
- the AV system 200 includes a disc player 210 as a source device and a television receiver 250 as a sink device.
- the disc player 210 and the television receiver 250 are connected via an HDMI cable 350.
- the disc player 210 is provided with an HDMI terminal 211 to which an HDMI transmission unit (HDMITX) 212 and a high-speed data line interface (I / F) 213 are connected.
- the television receiver 250 is provided with an HDMI terminal 251 to which an HDMI receiving unit (HDMIRX) 252 and a high-speed data line interface (I / F) 253 are connected.
- One end of the HDMI cable 350 is connected to the HDMI terminal 211 of the disc player 210, and the other end of the HDMI cable 350 is connected to the HDMI terminal 251 of the television receiver 250.
- uncompressed (baseband) image data reproduced by the disc player 210 is transmitted to the television receiver 250 via the HDMI cable 350,
- and the television receiver 250 displays an image based on the image data transmitted from the disc player 210.
- likewise, uncompressed audio data reproduced by the disc player 210 is transmitted to the television receiver 250 via the HDMI cable 350, and the television receiver 250 outputs the audio based on the audio data transmitted from the disc player 210.
- when the image data transmitted from the disc player 210 is 3D image data (stereoscopic image data) for displaying a stereoscopic image, the television receiver 250 displays a stereoscopic image, providing the user with stereoscopic vision.
- as a stereoscopic image display method there is, for example, the so-called "field sequential method", in which a left eye (L) image and a right eye (R) image are alternately displayed for each field, as shown in FIG. 2A. In this display method, the television receiver side must be driven at twice the normal frame rate. Also, while no optical film needs to be attached to the display unit, the glasses worn by the user must open and close the shutters of the left and right lens units in synchronization with the fields of the display unit.
- as another stereoscopic image display method there is, for example, the so-called "phase difference plate method", in which the left eye (L) image and the right eye (R) image are switched for each line, as shown in FIG. 2B.
- in this display method, a polarizing plate whose polarization direction differs by 90 degrees for each line is attached to the display unit on the television receiver side, and stereoscopic vision is realized by blocking the light of the opposite eye's image with polarized glasses worn by the user.
- FIG. 3 shows a configuration example of the disc player 210.
- the disc player 210 has an HDMI terminal 211, an HDMI transmission unit 212, a high-speed data line interface 213, and a DTCP (Digital Transmission Content Protection) circuit 230.
- the disc player 210 includes a CPU (Central Processing Unit) 214, a CPU bus 215, a flash ROM (Read Only Memory) 216, an SDRAM (Synchronous DRAM) 217, a remote control receiving unit 218, and a remote control transmitter 219. have.
- the disc player 210 has an IDE interface 220, a BD (Blu-ray Disc) drive 221, an internal bus 222, an Ethernet interface (Ethernet I/F) 223, and a network terminal 224. Further, the disc player 210 includes an MPEG (Moving Picture Experts Group) decoder 225, a graphic generation circuit 226, a video output terminal 227, an audio output terminal 228, and a 3D signal processing unit 229. "Ethernet" is a registered trademark.
- the CPU 214, the flash ROM 216, the SDRAM 217, and the remote control receiving unit 218 are connected to the CPU bus 215. Further, the CPU 214, the IDE interface 220, the Ethernet interface 223, the DTCP circuit 230, and the MPEG decoder 225 are connected to the internal bus 222.
- the CPU 214 controls the operation of each part of the disc player 210.
- the flash ROM 216 stores control software and data.
- the SDRAM 217 constitutes a work area for the CPU 214.
- the CPU 214 develops software and data read from the flash ROM 216 on the SDRAM 217 to activate the software, and controls each unit of the disc player 210.
- the remote control receiving unit 218 receives the remote control signal (remote control code) transmitted from the remote control transmitter 219 and supplies it to the CPU 214.
- the CPU 214 controls each part of the disc player 210 according to the remote control code.
- the BD drive 221 records content data on a BD (not shown) as a disc-shaped recording medium, or reproduces content data from the BD.
- the BD drive 221 is connected to the internal bus 222 via the IDE interface 220.
- the MPEG decoder 225 performs decoding processing on the MPEG2 stream reproduced by the BD drive 221 to obtain image and audio data.
- the DTCP circuit 230 encrypts, as necessary, content data reproduced by the BD drive 221 when it is sent to the network via the network terminal 224 or to the bidirectional communication path from the high-speed data line interface 213 via the HDMI terminal 211.
- the graphic generation circuit 226 performs graphics data superimposition processing on the image data obtained by the MPEG decoder 225 as necessary.
- the video output terminal 227 outputs image data output from the graphic generation circuit 226.
- the audio output terminal 228 outputs the audio data obtained by the MPEG decoder 225.
- the HDMI transmission unit (HDMI source) 212 transmits baseband image (video) and audio data from the HDMI terminal 211 by communication conforming to HDMI. Details of the HDMI transmission unit 212 will be described later.
- the high-speed data line interface 213 is an interface of a bidirectional communication path configured using predetermined lines (in this embodiment, a reserved line and an HPD line) constituting the HDMI cable 350.
- the high-speed data line interface 213 is inserted between the Ethernet interface 223 and the HDMI terminal 211.
- the high-speed data line interface 213 transmits the transmission data supplied from the CPU 214 to the counterpart device from the HDMI terminal 211 via the HDMI cable 350.
- the high-speed data line interface 213 supplies received data received from the counterpart device from the HDMI cable 350 via the HDMI terminal 211 to the CPU 214. Details of the high-speed data line interface 213 will be described later.
- when 3D image data for displaying a stereoscopic image, out of the image data obtained by the MPEG decoder 225, is transmitted through the HDMI TMDS channels, the 3D signal processing unit 229 processes that 3D image data into the state corresponding to the selected transmission method.
- the 3D image data is composed of left-eye image data and right-eye image data, or is composed of two-dimensional image data and depth data corresponding to each pixel (MPEG-C system). Details of the type of 3D image data transmission method, selection of the transmission method, and the packing format of each method will be described later.
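As a simple illustration of one packing arrangement of left-eye and right-eye data (the line-alternating arrangement described later among the transmission methods), the sketch below models frames as lists of rows rather than real TMDS words; it is a conceptual aid, not the HDMI packing format itself.

```python
def pack_line_alternative(left_frame, right_frame):
    """Interleave left-eye and right-eye rows into one double-height frame."""
    assert len(left_frame) == len(right_frame)
    packed = []
    for l_row, r_row in zip(left_frame, right_frame):
        packed.append(l_row)   # odd line: left-eye data
        packed.append(r_row)   # even line: right-eye data
    return packed

def unpack_line_alternative(packed):
    """Inverse operation, as performed by the sink's 3D signal processing."""
    return packed[0::2], packed[1::2]

left = [["L0"], ["L1"]]
right = [["R0"], ["R1"]]
packed = pack_line_alternative(left, right)
assert packed == [["L0"], ["R0"], ["L1"], ["R1"]]
assert unpack_line_alternative(packed) == (left, right)
```

The source-side 3D signal processing unit performs the packing and the sink-side unit performs the exact inverse, which is why both sides must agree on the transmission method beforehand.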
- at the time of recording, an MPEG stream as content data to be recorded is acquired from a digital tuner (not shown), from the network terminal 224 via the Ethernet interface 223, or from the HDMI terminal 211 via the high-speed data line interface 213 and the Ethernet interface 223.
- This content data is input to the IDE interface 220 and recorded on the BD by the BD drive 221.
- recording may be performed on an HDD (hard disk drive) (not shown) connected to the IDE interface 220.
- content data (MPEG stream) reproduced from the BD by the BD drive 221 is supplied to the MPEG decoder 225 via the IDE interface 220.
- the reproduced content data is decoded, and baseband image and audio data is obtained.
- the image data is output to the video output terminal 227 through the graphic generation circuit 226.
- the audio data is output to the audio output terminal 228.
- when the image and audio data obtained by the MPEG decoder 225 are transmitted through the HDMI TMDS channels at the time of reproduction, they are supplied to the HDMI transmission unit 212 and packed.
- the packed data is output from the HDMI transmission unit 212 to the HDMI terminal 211.
- when the image data is 3D image data, the 3D image data is processed by the 3D signal processing unit 229 into the state corresponding to the selected transmission method and then supplied to the HDMI transmission unit 212.
- when the content data reproduced by the BD drive 221 is sent to the network during reproduction, the content data is encrypted by the DTCP circuit 230 and then output to the network terminal 224 via the Ethernet interface 223. Similarly, when the content data reproduced by the BD drive 221 is sent to the bidirectional communication path of the HDMI cable 350 during reproduction, the content data is encrypted by the DTCP circuit 230 and then output to the HDMI terminal 211 via the Ethernet interface 223 and the high-speed data line interface 213.
- FIG. 4 shows a configuration example of the television receiver 250.
- the television receiver 250 includes an HDMI terminal 251, an HDMI receiving unit 252, a high-speed data line interface 253, and a 3D signal processing unit 254.
- the television receiver 250 includes an antenna terminal 255, a digital tuner 256, a demultiplexer 257, an MPEG decoder 258, a video signal processing circuit 259, a graphic generation circuit 260, a panel drive circuit 261, and a display panel 262. And have.
- the television receiver 250 includes an audio signal processing circuit 263, an audio amplification circuit 264, a speaker 265, an internal bus 270, a CPU 271, a flash ROM 272, and a DRAM (Dynamic Random Access Memory) 273. Yes.
- the television receiver 250 includes an Ethernet interface (Ethernet I / F) 274, a network terminal 275, a remote control receiver 276, a remote control transmitter 277, and a DTCP circuit 278.
- the antenna terminal 255 is a terminal for inputting a television broadcast signal received by a receiving antenna (not shown).
- the digital tuner 256 processes the television broadcast signal input to the antenna terminal 255 and outputs a predetermined transport stream corresponding to the user's selected channel.
- the demultiplexer 257 extracts a partial TS (Transport Stream) (a TS packet of video data and a TS packet of audio data) corresponding to the user's selected channel from the transport stream obtained by the digital tuner 256.
- the demultiplexer 257 also extracts PSI/SI (Program Specific Information / Service Information) from the transport stream obtained by the digital tuner 256 and outputs it to the CPU 271.
- a plurality of channels are multiplexed in the transport stream obtained by the digital tuner 256.
- the process of extracting the partial TS of an arbitrary channel from the transport stream in the demultiplexer 257 is made possible by obtaining the packet ID (PID) information of that channel from the PSI/SI (PAT/PMT).
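The PID-based extraction just described can be sketched as a simple filter. Packet layout is simplified here to a `(pid, payload)` tuple instead of real 188-byte TS packets, and the PID values are assumed for illustration; in the real system they come from the PAT/PMT.

```python
def extract_partial_ts(transport_stream, selected_pids):
    """Return the partial TS for one service: packets on the selected PIDs."""
    return [pkt for pkt in transport_stream if pkt[0] in selected_pids]

ts = [(0x100, b"video A"), (0x200, b"video B"),
      (0x101, b"audio A"), (0x201, b"audio B")]
# A PAT/PMT lookup for channel A would yield these PIDs (assumed values):
partial = extract_partial_ts(ts, selected_pids={0x100, 0x101})
print(partial)  # only channel A's video and audio packets remain
```

Everything not on the selected channel's PIDs is discarded, leaving just the video and audio TS packets that the MPEG decoder needs.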
- the MPEG decoder 258 performs a decoding process on a video PES (Packetized Elementary Stream) packet configured by a TS packet of the video data obtained by the demultiplexer 257 to obtain image data. Also, the MPEG decoder 258 performs a decoding process on the audio PES packet configured by the TS packet of the audio data obtained by the demultiplexer 257 to obtain audio data.
- the video signal processing circuit 259 and the graphic generation circuit 260 perform scaling processing (resolution conversion processing) and graphics data superimposition processing, as necessary, on the image data obtained by the MPEG decoder 258 or on the image data received by the HDMI receiving unit 252.
- further, when left eye image data and right eye image data are supplied, the video signal processing circuit 259 processes them into image data for displaying a stereoscopic image (see FIG. 2).
- the panel drive circuit 261 drives the display panel 262 based on the video (image) data output from the graphic generation circuit 260.
- the display panel 262 includes, for example, an LCD (Liquid Crystal Display), a PDP (Plasma Display Panel), and the like.
- the audio signal processing circuit 263 performs necessary processing such as D / A conversion on the audio data obtained by the MPEG decoder 258.
- the audio amplifier circuit 264 amplifies the audio signal output from the audio signal processing circuit 263 and supplies the amplified audio signal to the speaker 265.
- the CPU 271 controls the operation of each unit of the television receiver 250.
- the flash ROM 272 stores control software and data.
- the DRAM 273 constitutes a work area for the CPU 271.
- the CPU 271 develops software and data read from the flash ROM 272 on the DRAM 273 to activate the software, and controls each unit of the television receiver 250.
- the remote control receiving unit 276 receives the remote control signal (remote control code) transmitted from the remote control transmitter 277 and supplies it to the CPU 271.
- the CPU 271 controls each unit of the television receiver 250 based on the remote control code.
- the network terminal 275 is a terminal connected to the network, and is connected to the Ethernet interface 274.
- the CPU 271, flash ROM 272, DRAM 273, and Ethernet interface 274 are connected to the internal bus 270.
- the DTCP circuit 278 decrypts the encrypted data supplied from the network terminal 275 or the high-speed data line interface 253 to the Ethernet interface 274.
- the HDMI receiving unit (HDMI sink) 252 receives baseband image (video) and audio data supplied to the HDMI terminal 251 via the HDMI cable 350 by communication conforming to HDMI. Details of the HDMI receiving unit 252 will be described later. Similar to the high-speed data line interface 213 of the disc player 210 described above, the high-speed data line interface 253 is configured using predetermined lines (reserved line and HPD line in this embodiment) constituting the HDMI cable 350. It is an interface for bidirectional communication.
- the high-speed data line interface 253 is inserted between the Ethernet interface 274 and the HDMI terminal 251.
- the high-speed data line interface 253 transmits the transmission data supplied from the CPU 271 to the counterpart device from the HDMI terminal 251 via the HDMI cable 350.
- the high-speed data line interface 253 supplies reception data received from the counterpart device from the HDMI cable 350 via the HDMI terminal 251 to the CPU 271. Details of the high-speed data line interface 253 will be described later.
- the 3D signal processing unit 254 performs processing (decoding processing) corresponding to the transmission method on the 3D image data received by the HDMI receiving unit 252 to generate left eye image data and right eye image data. That is, the 3D signal processing unit 254 performs processing opposite to that of the 3D signal processing unit 229 of the disc player 210 described above to obtain the left eye image data and right eye image data, or the two-dimensional image data and depth data, constituting the 3D image data. When two-dimensional image data and depth data are acquired (MPEG-C method), the 3D signal processing unit 254 further performs an operation for generating left eye image data and right eye image data using the two-dimensional image data and depth data.
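The MPEG-C style generation of left-eye and right-eye views from 2D image data plus per-pixel depth can be sketched for a single pixel row as below. The disparity model (a fixed gain applied to depth) is an illustrative assumption; a real renderer would also fill the holes (`None` cells) left by shifted pixels.

```python
def render_views(image_row, depth_row, gain=0.05):
    """Shift each pixel left/right by a disparity derived from its depth."""
    width = len(image_row)
    left = [None] * width
    right = [None] * width
    for x, (pixel, depth) in enumerate(zip(image_row, depth_row)):
        d = int(round(gain * depth))       # disparity in pixels
        if 0 <= x + d < width:
            left[x + d] = pixel            # left-eye view: shift right
        if 0 <= x - d < width:
            right[x - d] = pixel           # right-eye view: shift left
    return left, right

row = ["a", "b", "c", "d"]
depth = [0, 0, 20, 20]                     # far, far, near, near
left, right = render_views(row, depth)
print(left, right)
```

Nearer pixels (larger depth values here) receive larger disparity, so the two generated views differ most where the scene is closest to the viewer.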
- the operation of the television receiver 250 shown in FIG. 4 will be briefly described.
- the television broadcast signal input to the antenna terminal 255 is supplied to the digital tuner 256.
- the digital tuner 256 processes the television broadcast signal, outputs a predetermined transport stream corresponding to the user's selected channel, and supplies the predetermined transport stream to the demultiplexer 257.
- in the demultiplexer 257, a partial TS (video data TS packets and audio data TS packets) corresponding to the user's selected channel is extracted from the transport stream, and the partial TS is supplied to the MPEG decoder 258.
- the video PES packet constituted by the TS packet of the video data is decoded to obtain the video data.
- the video data is subjected to scaling processing (resolution conversion processing), graphics data superimposition processing, and the like in the video signal processing circuit 259 and the graphic generation circuit 260 as necessary, and is then supplied to the panel drive circuit 261. Therefore, the display panel 262 displays an image corresponding to the user's selected channel.
- audio data is obtained by performing a decoding process on the audio PES packet configured by the TS packet of audio data.
- the audio data is subjected to necessary processing such as D / A conversion by the audio signal processing circuit 263, further amplified by the audio amplification circuit 264, and then supplied to the speaker 265. Therefore, sound corresponding to the user's selected channel is output from the speaker 265.
- encrypted content data (image data and audio data) supplied from the network terminal 275 to the Ethernet interface 274, or supplied from the HDMI terminal 251 to the Ethernet interface 274 via the high-speed data line interface 253, is decrypted by the DTCP circuit 278 and then supplied to the MPEG decoder 258. Thereafter, the operation is the same as when receiving a television broadcast signal as described above: an image is displayed on the display panel 262 and sound is output from the speaker 265.
- the HDMI receiving unit 252 acquires image data and audio data transmitted from the disc player 210 connected to the HDMI terminal 251 via the HDMI cable 350.
- the image data is supplied to the video signal processing circuit 259 via the 3D signal processing unit 254.
- the audio data is directly supplied to the audio signal processing circuit 263. Thereafter, the operation is the same as when the above-described television broadcast signal is received, an image is displayed on the display panel 262, and sound is output from the speaker 265.
- when the image data received by the HDMI receiving unit 252 is 3D image data, the 3D signal processing unit 254 performs processing (decoding processing) corresponding to the transmission method on the 3D image data, and left eye image data and right eye image data are generated. The left eye image data and right eye image data are then supplied from the 3D signal processing unit 254 to the video signal processing circuit 259. When left eye image data and right eye image data constituting 3D image data are supplied, the video signal processing circuit 259 generates image data for displaying a stereoscopic image (see FIG. 2). Therefore, a stereoscopic image is displayed on the display panel 262.
- FIG. 5 shows a configuration example of the HDMI transmitting unit (HDMI source) 212 of the disc player 210 and the HDMI receiving unit (HDMI sink) 252 of the television receiver 250 in the AV system 200 of FIG.
- the HDMI transmission unit 212 transmits, in the effective image section (hereinafter also referred to as the active video section as appropriate), which is the section from one vertical synchronization signal to the next vertical synchronization signal minus the horizontal blanking section and the vertical blanking section,
- differential signals corresponding to the pixel data of an uncompressed image for one screen, in one direction to the HDMI receiving unit 252 over a plurality of channels; and, in the horizontal blanking section or the vertical blanking section,
- transmits differential signals corresponding to at least the audio data, control data, and other auxiliary data accompanying the image, in one direction to the HDMI receiving unit 252 over a plurality of channels.
- the HDMI transmission unit 212 includes the HDMI transmitter 81.
- that is, the HDMI transmitter 81 converts, for example, the pixel data of an uncompressed image into corresponding differential signals, and serially transmits them in one direction to the HDMI receiving unit 252 connected via the HDMI cable 350 over the three TMDS channels #0, #1, and #2, which are a plurality of channels.
- the transmitter 81 also converts the audio data accompanying the uncompressed image, as well as necessary control data and other auxiliary data, into corresponding differential signals, and serially transmits them in one direction to the HDMI receiving unit 252 connected via the HDMI cable 350 over the three TMDS channels #0, #1, and #2.
- further, the transmitter 81 transmits the pixel clock, synchronized with the pixel data transmitted over the three TMDS channels #0, #1, and #2, to the HDMI receiving unit 252 connected via the HDMI cable 350, using the TMDS clock channel.
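The relationship between the pixel clock and the TMDS data channels lends itself to a short worked calculation: each of the three channels carries 10 bits (one 8b/10b-coded byte) per pixel clock, so the aggregate bit rate is pixel clock x 3 x 10. The 148.5 MHz figure is the standard 1080p/60 pixel clock; the doubled-rate line illustrates the stereoscopic case where both full-resolution eye images must be carried.

```python
def tmds_bit_rate(pixel_clock_hz, channels=3, bits_per_channel=10):
    """Aggregate serial bit rate across the TMDS data channels."""
    return pixel_clock_hz * channels * bits_per_channel

pclk_1080p60 = 148.5e6                        # 1080p/60 pixel clock in Hz
print(tmds_bit_rate(pclk_1080p60) / 1e9)      # 4.455 (Gbps)
# Sending full-resolution left and right eye images doubles the pixel rate:
print(tmds_bit_rate(2 * pclk_1080p60) / 1e9)  # 8.91 (Gbps)
```

This doubling is why the source must weigh the transmission rate of the path when choosing between full-resolution and reduced-resolution 3D transmission methods.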
- the HDMI receiving unit 252 receives the differential signals corresponding to the pixel data transmitted in one direction from the HDMI transmitting unit 212 through a plurality of channels in the active video section, and also receives, in the horizontal blanking section or the vertical blanking section, the differential signals corresponding to the audio data and control data transmitted in one direction from the HDMI transmission unit 212 through a plurality of channels.
- the HDMI receiving unit 252 has an HDMI receiver 82.
- the receiver 82 receives the differential signals corresponding to the pixel data, audio data, and control data transmitted in one direction through the TMDS channels # 0, # 1, and # 2 from the HDMI transmission unit 212 connected via the HDMI cable 350, in synchronization with the pixel clock transmitted from the HDMI transmission unit 212 through the TMDS clock channel.
- pixel data and audio data are transmitted in one direction from the HDMI transmission unit 212 to the HDMI reception unit 252 in synchronization with the pixel clock.
- the transmission channels of the HDMI system composed of the HDMI transmission unit 212 and the HDMI receiving unit 252 include the three TMDS channels # 0 to # 2 as transmission channels for serially transmitting pixel data and audio data in one direction in synchronization with the pixel clock, the TMDS clock channel as a transmission channel for transmitting the pixel clock, the DDC (Display Data Channel) 83, and the CEC line 84.
- the DDC 83 consists of two signal lines (not shown) included in the HDMI cable 350, and is used by the HDMI transmission unit 212 to read the E-EDID (Enhanced Extended Display Identification Data) from the HDMI receiving unit 252 connected via the HDMI cable 350.
- the HDMI receiving unit 252 has, in addition to the HDMI receiver 82, an EDID ROM (Read Only Memory) 85 that stores the E-EDID, which is performance information relating to its own configuration and capability.
- the HDMI transmission unit 212 reads the E-EDID of the HDMI reception unit 252 from the HDMI reception unit 252 connected via the HDMI cable 350 via the DDC 83.
- the HDMI transmission unit 212 sends the read E-EDID to the CPU 214. The CPU 214 stores this E-EDID in the flash ROM 272 or the DRAM 273.
- the CPU 214 can recognize the performance settings of the HDMI receiving unit 252 based on the E-EDID. For example, the CPU 214 recognizes the image formats (profiles) supported by the electronic device having the HDMI receiving unit 252, for example, RGB, YCbCr 4:4:4, YCbCr 4:2:2, and the like. In this embodiment, the CPU 214 recognizes the 3D image/audio data transmission methods that the electronic device having the HDMI receiving unit 252 can support, based on the 3D image data transmission method information included in the E-EDID.
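The capability check described above can be illustrated with a small sketch. The single-byte bitmask layout assumed below for the 3D transmission method information is hypothetical, not the actual E-EDID data structure; it only shows the kind of decision the CPU 214 makes.

```python
# Hypothetical sketch of the CPU-side capability check described above.
# The single-byte bitmask layout assumed here for the 3D transmission
# method information is an illustration, not the actual E-EDID layout.

TRANSMISSION_METHODS = {
    0: "pixel-alternating (1)",
    1: "line-alternating full resolution (2)",
    2: "field-alternating (3)",
    3: "line-interleaved half resolution (4)",
    4: "top-and-bottom (5)",
    5: "side-by-side (6)",
}

def supported_3d_methods(edid_3d_byte: int):
    """Decode an (assumed) E-EDID byte whose bits flag the 3D image
    data transmission methods the sink device can support."""
    return [name for bit, name in TRANSMISSION_METHODS.items()
            if edid_3d_byte & (1 << bit)]
```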
- the CEC line 84 is composed of one signal line (not shown) included in the HDMI cable 350, and is used for bidirectional communication of control data between the HDMI transmission unit 212 and the HDMI reception unit 252.
- the HDMI cable 350 includes a line (HPD line) 86 connected to a pin called HPD (Hot Plug Detect).
- the source device can detect the connection of the sink device using the line 86.
- the HDMI cable 350 includes a line 87 used for supplying power from the source device to the sink device.
- the HDMI cable 350 includes a reserved line 88.
- FIG. 6 shows a configuration example of the HDMI transmitter 81 and the HDMI receiver 82 of FIG.
- the HDMI transmitter 81 includes three encoders / serializers 81A, 81B, and 81C corresponding to the three TMDS channels # 0, # 1, and # 2, respectively.
- Each of the encoders / serializers 81A, 81B, and 81C encodes the image data, auxiliary data, and control data supplied thereto, converts the parallel data into serial data, and transmits the data by a differential signal.
- when the image data has the three components R, G, and B, the B component is supplied to the encoder / serializer 81A, the G component is supplied to the encoder / serializer 81B, and the R component is supplied to the encoder / serializer 81C.
- the auxiliary data includes, for example, audio data and control packets.
- the control packets are supplied to, for example, the encoder / serializer 81A, and the audio data is supplied to the encoders / serializers 81B and 81C.
- control data includes a 1-bit vertical synchronization signal (VSYNC), a 1-bit horizontal synchronization signal (HSYNC), and 1-bit control bits CTL0, CTL1, CTL2, and CTL3.
- the vertical synchronization signal and the horizontal synchronization signal are supplied to the encoder / serializer 81A.
- the control bits CTL0 and CTL1 are supplied to the encoder / serializer 81B, and the control bits CTL2 and CTL3 are supplied to the encoder / serializer 81C.
- the encoder / serializer 81A transmits the B component of the image data, the vertical synchronization signal and the horizontal synchronization signal, and auxiliary data supplied thereto in a time division manner. That is, the encoder / serializer 81A converts the B component of the image data supplied thereto into 8-bit parallel data that is a fixed number of bits. Further, the encoder / serializer 81A encodes the parallel data, converts it into serial data, and transmits it through the TMDS channel # 0.
- the encoder / serializer 81A encodes 2-bit parallel data of the vertical synchronization signal and horizontal synchronization signal supplied thereto, converts the data into serial data, and transmits the serial data through the TMDS channel # 0. Furthermore, the encoder / serializer 81A converts the auxiliary data supplied thereto into parallel data in units of 4 bits. Then, the encoder / serializer 81A encodes the parallel data, converts it into serial data, and transmits it through the TMDS channel # 0.
- the encoder / serializer 81B transmits the G component of the image data, the control bits CTL0 and CTL1, and the auxiliary data supplied thereto in a time-division manner. That is, the encoder / serializer 81B converts the G component of the image data supplied thereto into parallel data in units of 8 bits, which is a fixed number of bits. Further, the encoder / serializer 81B encodes the parallel data, converts it into serial data, and transmits it through the TMDS channel # 1.
- the encoder / serializer 81B encodes the 2-bit parallel data of the control bits CTL0 and CTL1 supplied thereto, converts the data into serial data, and transmits it through the TMDS channel # 1. Furthermore, the encoder / serializer 81B converts the auxiliary data supplied thereto into parallel data in units of 4 bits. Then, the encoder / serializer 81B encodes the parallel data, converts it into serial data, and transmits it through the TMDS channel # 1.
- the encoder / serializer 81C transmits the R component of the image data, control bits CTL2 and CTL3, and auxiliary data supplied thereto in a time division manner. That is, the encoder / serializer 81C sets the R component of the image data supplied thereto as parallel data in units of 8 bits, which is a fixed number of bits. Further, the encoder / serializer 81C encodes the parallel data, converts it into serial data, and transmits it through the TMDS channel # 2.
- the encoder / serializer 81C encodes the 2-bit parallel data of the control bits CTL2 and CTL3 supplied thereto, converts it into serial data, and transmits it through the TMDS channel # 2. Furthermore, the encoder / serializer 81C converts the auxiliary data supplied thereto into parallel data in units of 4 bits. Then, the encoder / serializer 81C encodes the parallel data, converts it into serial data, and transmits it through the TMDS channel # 2.
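The time-division behavior of one encoder/serializer described above can be sketched as follows. This is a simplified model: the real transition-minimized TMDS coding is omitted, and only the parallel-to-serial conversion and the per-period data selection are shown.

```python
# Simplified sketch of one encoder/serializer. The actual TMDS coding
# (transition-minimized 10-bit symbols) is omitted; only the time-division
# selection and parallel-to-serial conversion described above are modeled.

def serialize_byte(byte_val: int):
    """Convert one 8-bit parallel word into a list of serial bits, LSB first."""
    return [(byte_val >> i) & 1 for i in range(8)]

def encode_channel(video_byte, control_bits, aux_nibble, period):
    """Select what one encoder/serializer sends in the given period:
    an 8-bit pixel component, 2 control bits, or a 4-bit aux word."""
    if period == "video":
        return serialize_byte(video_byte)        # 8-bit pixel component
    if period == "control":
        return list(control_bits)                # 2 control bits
    return [(aux_nibble >> i) & 1 for i in range(4)]  # 4-bit auxiliary data
```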
- the HDMI receiver 82 has three recovery / decoders 82A, 82B, and 82C corresponding to the three TMDS channels # 0, # 1, and # 2, respectively. Then, each of the recovery / decoders 82A, 82B, and 82C receives image data, auxiliary data, and control data transmitted as differential signals through the TMDS channels # 0, # 1, and # 2. Further, each of the recovery / decoders 82A, 82B, and 82C converts the image data, auxiliary data, and control data from serial data to parallel data, and further decodes and outputs them.
- the recovery / decoder 82A receives the B component of image data, the vertical synchronization signal, the horizontal synchronization signal, and the auxiliary data that are transmitted as differential signals through the TMDS channel # 0. Then, the recovery / decoder 82A converts the B component of the image data, the vertical synchronization signal, the horizontal synchronization signal, and the auxiliary data from serial data to parallel data, and decodes and outputs them.
- the recovery / decoder 82B receives the G component of the image data, the control bits CTL0 and CTL1, and the auxiliary data transmitted by the differential signal through the TMDS channel # 1. Then, the recovery / decoder 82B converts the G component of the image data, the control bits CTL0 and CTL1, and the auxiliary data from serial data to parallel data, and decodes and outputs them.
- the recovery / decoder 82C receives the R component of the image data, the control bits CTL2 and CTL3, and the auxiliary data transmitted as a differential signal through the TMDS channel # 2. Then, the recovery / decoder 82C converts the R component of the image data, the control bits CTL2 and CTL3, and the auxiliary data from serial data to parallel data, and decodes and outputs them.
- FIG. 7 shows an example of the structure of TMDS transmission data.
- FIG. 7 shows the sections of various transmission data when image data of 1920 pixels × 1080 lines (horizontal × vertical) is transmitted on the TMDS channels # 0, # 1, and # 2.
- in the video field in which transmission data is transmitted through the three TMDS channels # 0, # 1, and # 2 of HDMI, there are three types of sections, namely the video data period (Video Data period), the data island period (Data Island period), and the control period (Control period), depending on the type of transmission data.
- the video field section is the section from the rising edge (active edge) of one vertical synchronizing signal to the rising edge of the next vertical synchronizing signal, and is divided into a horizontal blanking period (horizontal blanking), a vertical blanking period (vertical blanking), and an active video section (Active Video), which is the section obtained by excluding the horizontal blanking period and the vertical blanking period from the video field section.
- the video data section is assigned to the active video section. In this video data section, data of 1920 pixels × 1080 lines of effective pixels (active pixels) constituting one screen of uncompressed image data is transmitted.
- the data island section and the control section are assigned to the horizontal blanking period and the vertical blanking period. In the data island section and the control section, auxiliary data (Auxiliary data) is transmitted. That is, the data island section is assigned to part of the horizontal blanking period and the vertical blanking period. In this data island section, audio data packets and the like, which are the auxiliary data not related to control, are transmitted. The control section is assigned to the other parts of the horizontal blanking period and the vertical blanking period. In this control section, the vertical synchronization signal, the horizontal synchronization signal, control packets, and the like, which are the auxiliary data related to control, are transmitted.
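The division of the video field into active video and blanking can be sketched numerically. The 2200 × 1125 total raster used below is the common total for a 1080p signal and is an illustrative assumption, not a value taken from this description.

```python
# Sketch of the raster structure described above. The 2200 x 1125 total
# raster commonly used for 1080p is an illustrative assumption.
H_ACTIVE, V_ACTIVE = 1920, 1080
H_TOTAL, V_TOTAL = 2200, 1125

def classify(line: int, pixel: int) -> str:
    """Map a (line, pixel) raster position to the section it falls in.
    Blanking positions carry data island or control periods; the exact
    split depends on the packet schedule and is not modeled here."""
    if line < V_ACTIVE and pixel < H_ACTIVE:
        return "active video"
    return "blanking (data island / control)"

active = H_ACTIVE * V_ACTIVE   # effective pixels per frame
total = H_TOTAL * V_TOTAL      # total TMDS clock positions per frame
```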
- FIG. 8 shows an example of the pin arrangement of the HDMI terminals 211 and 251.
- the pin arrangement shown in FIG. 8 is called type A (type-A).
- the two lines carrying TMDS Data # i + and TMDS Data # i -, which are the differential signals of TMDS channel # i, are connected to the pins to which TMDS Data # i + is assigned (pin numbers 1, 4, and 7) and the pins to which TMDS Data # i - is assigned (pin numbers 3, 6, and 9).
- the CEC line 84, through which the CEC signal that is control data is transmitted, is connected to the pin with pin number 13, and the pin with pin number 14 is a reserved pin. Further, the line through which an SDA (Serial Data) signal such as the E-EDID is transmitted is connected to the pin with pin number 16, and the line through which an SCL (Serial Clock) signal, which is a clock signal used for synchronization when the SDA signal is transmitted and received, is transmitted is connected to the pin with pin number 15.
- the above-described DDC 83 includes a line for transmitting the SDA signal and a line for transmitting the SCL signal.
- the HPD line 86 for detecting the connection of the sink device by the source device is connected to the pin having the pin number 19. Further, as described above, the line 87 for supplying power is connected to a pin having a pin number of 18.
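The pin assignments listed above can be collected into a small lookup table. The descriptive strings are informal labels for this sketch; pins not mentioned in the text fall back to a generic label.

```python
# Lookup table summarizing the type-A pin assignments given above.
PIN_ASSIGNMENT = {
    13: "CEC line 84",
    14: "reserved",
    15: "SCL (DDC clock)",
    16: "SDA (DDC data)",
    18: "+5 V power (line 87)",
    19: "HPD (line 86)",
}
TMDS_DATA_PLUS_PINS = (1, 4, 7)    # TMDS Data #i+
TMDS_DATA_MINUS_PINS = (3, 6, 9)   # TMDS Data #i-

def pin_function(pin: int) -> str:
    """Return the informal label for a type-A connector pin."""
    if pin in TMDS_DATA_PLUS_PINS:
        return "TMDS Data+"
    if pin in TMDS_DATA_MINUS_PINS:
        return "TMDS Data-"
    return PIN_ASSIGNMENT.get(pin, "other/shield")
```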
- the high-speed data line interface 213 of the disc player 210 and the high-speed data line interface 253 of the television receiver 250 will be described.
- in the following, the disc player 210 will be described as the source device, and the television receiver 250 as the sink device.
- FIG. 9 shows a configuration example of the high-speed data line interface of the source device and the sink device.
- This high-speed data line interface constitutes a communication unit that performs LAN (Local Area Network) communication.
- this communication unit performs communication using a bidirectional communication path constituted by a pair of differential transmission lines among the plurality of lines constituting the HDMI cable, namely the reserve line (Ether+ line) corresponding to the vacant (Reserve) pin (pin 14) and the HPD line (Ether- line) corresponding to the HPD pin (pin 19).
- the source device includes a LAN signal transmission circuit 411, a termination resistor 412, AC coupling capacitors 413 and 414, a LAN signal reception circuit 415, a subtraction circuit 416, a pull-up resistor 421, a resistor 422 and a capacitor 423 constituting a low-pass filter, a comparator 424, a pull-down resistor 431, a resistor 432 and a capacitor 433 constituting a low-pass filter, and a comparator 434.
- the high-speed data line interface (high-speed data line I / F) includes a LAN signal transmission circuit 411, a terminating resistor 412, AC coupling capacitors 413 and 414, a LAN signal reception circuit 415, and a subtraction circuit 416.
- a series circuit of a pull-up resistor 421, an AC coupling capacitor 413, a termination resistor 412, an AC coupling capacitor 414, and a pull-down resistor 431 is connected between the power supply line (+ 5.0V) and the ground line.
- a connection point P1 between the AC coupling capacitor 413 and the termination resistor 412 is connected to the positive output side of the LAN signal transmission circuit 411 and to the positive input side of the LAN signal reception circuit 415.
- the connection point P2 between the AC coupling capacitor 414 and the termination resistor 412 is connected to the negative output side of the LAN signal transmission circuit 411 and to the negative input side of the LAN signal reception circuit 415.
- a transmission signal (transmission data) SG411 is supplied to the input side of the LAN signal transmission circuit 411.
- the output signal SG412 of the LAN signal receiving circuit 415 is supplied to the positive side terminal of the subtraction circuit 416, and the transmission signal (transmission data) SG411 is supplied to the negative side terminal of the subtraction circuit 416.
- the transmission signal SG411 is subtracted from the output signal SG412 of the LAN signal receiving circuit 415 to obtain a reception signal (reception data) SG413.
- connection point Q1 between the pull-up resistor 421 and the AC coupling capacitor 413 is connected to the ground line via a series circuit of the resistor 422 and the capacitor 423.
- the output signal of the low-pass filter obtained at the connection point between the resistor 422 and the capacitor 423 is supplied to one input terminal of the comparator 424.
- the output signal of the low-pass filter is compared with a reference voltage Vref1 (+ 3.75V) supplied to the other input terminal.
- the output signal SG414 of the comparator 424 is supplied to the control unit (CPU) of the source device.
- connection point Q2 between the AC coupling capacitor 414 and the pull-down resistor 431 is connected to the ground line via a series circuit of the resistor 432 and the capacitor 433.
- the output signal of the low-pass filter obtained at the connection point between the resistor 432 and the capacitor 433 is supplied to one input terminal of the comparator 434.
- the output signal of the low-pass filter is compared with a reference voltage Vref2 (+1.4 V) supplied to the other input terminal.
- the output signal SG415 of the comparator 434 is supplied to the control unit (CPU) of the source device.
- the sink device includes a LAN signal transmission circuit 441, a termination resistor 442, AC coupling capacitors 443 and 444, a LAN signal reception circuit 445, a subtraction circuit 446, a pull-down resistor 451, a resistor 452 and a capacitor 453 constituting a low-pass filter, a comparator 454, a choke coil 461, a resistor 462, and a resistor 463.
- the high-speed data line interface includes a LAN signal transmission circuit 441, a terminating resistor 442, AC coupling capacitors 443 and 444, a LAN signal reception circuit 445, and a subtraction circuit 446.
- a series circuit of a resistor 462 and a resistor 463 is connected between the power supply line (+5.0 V) and the ground line.
- a series circuit of a choke coil 461, an AC coupling capacitor 444, a termination resistor 442, an AC coupling capacitor 443, and a pull-down resistor 451 is connected between the connection point of the resistors 462 and 463 and the ground line.
- connection point P3 between the AC coupling capacitor 443 and the termination resistor 442 is connected to the positive output side of the LAN signal transmission circuit 441 and to the positive input side of the LAN signal reception circuit 445.
- a connection point P4 between the AC coupling capacitor 444 and the termination resistor 442 is connected to the negative output side of the LAN signal transmission circuit 441 and to the negative input side of the LAN signal reception circuit 445.
- a transmission signal (transmission data) SG417 is supplied to the input side of the LAN signal transmission circuit 441.
- the output signal SG418 of the LAN signal receiving circuit 445 is supplied to the positive side terminal of the subtraction circuit 446, and the transmission signal SG417 is supplied to the negative side terminal of the subtraction circuit 446.
- the transmission signal SG417 is subtracted from the output signal SG418 of the LAN signal receiving circuit 445 to obtain a reception signal (reception data) SG419.
- connection point Q3 between the pull-down resistor 451 and the AC coupling capacitor 443 is connected to the ground line via a series circuit of the resistor 452 and the capacitor 453.
- the output signal of the low-pass filter obtained at the connection point between the resistor 452 and the capacitor 453 is supplied to one input terminal of the comparator 454.
- the output signal of the low-pass filter is compared with a reference voltage Vref3 (+1.25 V) supplied to the other input terminal.
- the output signal SG416 of the comparator 454 is supplied to the control unit (CPU) of the sink device.
- the reserved line 501 and the HPD line 502 included in the HDMI cable constitute a differential twisted pair.
- the source side end 511 of the reserved line 501 is connected to pin 14 of the HDMI terminal of the source device, and the sink side end 521 of the reserved line 501 is connected to pin 14 of the HDMI terminal of the sink device.
- the source side end 512 of the HPD line 502 is connected to the 19th pin of the HDMI terminal of the source device, and the sink side end 522 of the HPD line 502 is connected to the 19th pin of the HDMI terminal of the sink device.
- in the source device, the connection point Q1 between the pull-up resistor 421 and the AC coupling capacitor 413 described above is connected to the 14th pin of the HDMI terminal, and the connection point Q2 between the pull-down resistor 431 and the AC coupling capacitor 414 described above is connected to the 19th pin of the HDMI terminal.
- in the sink device, the connection point Q3 between the pull-down resistor 451 and the AC coupling capacitor 443 described above is connected to the 14th pin of the HDMI terminal, and the connection point Q4 between the choke coil 461 and the AC coupling capacitor 444 described above is connected to the 19th pin of the HDMI terminal.
- in the source device, the transmission signal (transmission data) SG411 is supplied to the input side of the LAN signal transmission circuit 411, and a differential signal (positive output signal, negative output signal) corresponding to the transmission signal SG411 is output from the LAN signal transmission circuit 411.
- the differential signal output from the LAN signal transmission circuit 411 is supplied to the connection points P1 and P2, and is transmitted to the sink device through the pair of differential transmission lines (the reserved line 501 and the HPD line 502) of the HDMI cable.
- similarly, in the sink device, the transmission signal (transmission data) SG417 is supplied to the input side of the LAN signal transmission circuit 441, and a differential signal (positive output signal, negative output signal) corresponding to the transmission signal SG417 is output from the LAN signal transmission circuit 441.
- the differential signal output from the LAN signal transmission circuit 441 is supplied to the connection points P3 and P4, and transmitted to the source device through a pair of lines (reserved line 501 and HPD line 502) of the HDMI cable.
- in the source device, the output signal SG412 of the LAN signal reception circuit 415 is an addition signal of the transmission signal corresponding to the differential signal (current signal) output from the LAN signal transmission circuit 411 and the reception signal corresponding to the differential signal transmitted from the sink device as described above.
- in the subtraction circuit 416, the transmission signal SG411 is subtracted from the output signal SG412 of the LAN signal reception circuit 415. Therefore, the output signal SG413 of the subtraction circuit 416 corresponds to the transmission signal (transmission data) SG417 of the sink device.
- likewise, in the sink device, the output signal SG418 of the LAN signal reception circuit 445 is an addition signal of the transmission signal corresponding to the differential signal (current signal) output from the LAN signal transmission circuit 441 and the reception signal corresponding to the differential signal transmitted from the source device as described above. In the subtraction circuit 446, the transmission signal SG417 is subtracted from the output signal SG418 of the LAN signal reception circuit 445. For this reason, the output signal SG419 of the subtraction circuit 446 corresponds to the transmission signal (transmission data) SG411 of the source device.
- in this way, bidirectional LAN communication can be performed between the high-speed data line interface of the source device and the high-speed data line interface of the sink device.
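The full-duplex operation described above, in which each receiver sees the sum of both transmitted signals and the subtraction circuit removes the local contribution, can be sketched numerically. The signal values below are arbitrary illustrative levels, not actual line voltages.

```python
# Numeric sketch of subtraction-based full-duplex operation on the shared
# differential pair. Values are arbitrary illustrative levels.

def recover_far_end(line_signal, own_tx):
    """Model of the subtraction circuits: SG413 = SG412 - SG411 on the
    source side, SG419 = SG418 - SG417 on the sink side."""
    return [line - tx for line, tx in zip(line_signal, own_tx)]

# Both ends drive the pair at the same time, so each LAN signal
# reception circuit sees the superposition of both transmit signals.
source_tx = [1, 0, 1, 1]                                # SG411
sink_tx = [0, 1, 1, 0]                                  # SG417
line = [s + k for s, k in zip(source_tx, sink_tx)]      # SG412 / SG418
received_at_source = recover_far_end(line, source_tx)   # SG413
received_at_sink = recover_far_end(line, sink_tx)       # SG419
```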
- in addition to the LAN communication described above, the HPD line 502 notifies the source device, by a DC bias level, that the HDMI cable is connected to the sink device. That is, when the HDMI cable is connected to the sink device, the resistors 462 and 463 and the choke coil 461 in the sink device bias the HPD line 502 to about 4 V via the 19th pin of the HDMI terminal.
- the source device extracts the DC bias of the HPD line 502 with a low-pass filter including a resistor 432 and a capacitor 433, and compares it with a reference voltage Vref2 (for example, 1.4V) by a comparator 434.
- if the HDMI cable is not connected to the sink device, the voltage at the 19th pin of the HDMI terminal of the source device is lower than the reference voltage Vref2 because of the pull-down resistor 431; conversely, if the HDMI cable is connected to the sink device, the voltage is higher than the reference voltage Vref2. Therefore, the output signal SG415 of the comparator 434 is at a high level when the HDMI cable is connected to the sink device, and at a low level otherwise. Thereby, the control unit (CPU) of the source device can recognize whether or not the HDMI cable is connected to the sink device based on the output signal SG415 of the comparator 434.
- the devices connected to the two ends of the HDMI cable have a function of mutually recognizing, by the DC bias potential of the reserved line 501, whether the other device is a device capable of LAN communication (hereinafter, an "eHDMI compatible device") or a device not capable of LAN communication (hereinafter, an "eHDMI non-compliant device").
- the source device pulls up the reserve line 501 with the resistor 421 (+ 5V), and the sink device pulls down the reserve line 501 with the resistor 451.
- the resistors 421 and 451 do not exist in devices that do not support eHDMI.
- the source device uses the comparator 424 to compare the DC potential of the reserved line 501 that has passed through the low-pass filter including the resistor 422 and the capacitor 423 with the reference voltage Vref1.
- if the sink device is an eHDMI compatible device and has the pull-down resistor 451, the voltage of the reserved line 501 is 2.5 V; if the sink device is an eHDMI non-compliant device and does not have the pull-down resistor 451, the voltage of the reserved line 501 becomes 5 V due to the presence of the pull-up resistor 421.
- the control unit (CPU) of the source device can recognize whether or not the sink device is an eHDMI-compatible device based on the output signal SG414 of the comparator 424.
- the sink device uses the comparator 454 to compare the DC potential of the reserved line 501 that has passed through the low-pass filter including the resistor 452 and the capacitor 453 with the reference voltage Vref3.
- if the source device is an eHDMI compatible device and has the pull-up resistor 421, the voltage of the reserved line 501 is 2.5 V; if the source device is an eHDMI non-compliant device and does not have the pull-up resistor 421, the voltage of the reserved line 501 becomes 0 V due to the presence of the pull-down resistor 451.
- the control unit (CPU) of the sink device can recognize whether or not the source device is an eHDMI-compatible device based on the output signal SG416 of the comparator 454.
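The three comparator decisions described above can be summarized in a sketch. The threshold and bias voltages are the ones given in the text (Vref1 = 3.75 V, Vref2 = 1.4 V, Vref3 = 1.25 V; about 4 V HPD bias; 2.5 V divided reserve line).

```python
# Sketch of the comparator decisions described above. Thresholds are
# taken from the text; real hardware does this with analog comparators.

def sink_connected(hpd_dc_volts: float, vref2: float = 1.4) -> bool:
    """Source side (SG415): the sink biases the HPD line to ~4 V when the
    cable is connected; the pull-down holds it low otherwise."""
    return hpd_dc_volts > vref2

def sink_is_ehdmi(reserve_dc_volts: float, vref1: float = 3.75) -> bool:
    """Source side (SG414): the sink's pull-down divides the pulled-up
    reserve line to ~2.5 V; without it the line stays at ~5 V."""
    return reserve_dc_volts < vref1

def source_is_ehdmi(reserve_dc_volts: float, vref3: float = 1.25) -> bool:
    """Sink side (SG416): the source's pull-up lifts the reserve line to
    ~2.5 V; without it the pull-down holds it at 0 V."""
    return reserve_dc_volts > vref3
```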
- as described above, in an interface that performs image (video) and audio data transmission, connected-device information exchange and authentication, device control data communication, and LAN communication over a single HDMI cable, the LAN communication is performed through bidirectional communication via a pair of differential transmission lines, and the connection state of the interface is notified by the DC bias potential of at least one of those transmission lines, so that the SCL line and the SDA line can be physically and spatially separated from the LAN communication. As a result, a circuit for LAN communication can be formed irrespective of the electrical specifications defined for the DDC, and stable and reliable LAN communication can be realized at low cost.
- the pull-up resistor 421 shown in FIG. 9 may be provided not in the source device but in the HDMI cable.
- each of the terminals of the pull-up resistor 421 is connected to each of the reserved line 501 and the line (signal line) connected to the power supply (power supply potential) among the lines provided in the HDMI cable.
- each of the terminals of the pull-down resistor 451 is connected to each of the reserved line 501 and the line (ground line) connected to the ground (reference potential) among the lines provided in the HDMI cable.
- Each of the terminals of the resistor 463 is connected to each of the HPD line 502 and a line (ground line) connected to the ground (reference potential) among lines provided in the HDMI cable.
- the 3D image data of the original signal is composed of left eye (L) image data and right eye (R) image data
- in FIG. 10, an example will be described in which the left eye (L) and right eye (R) image data are each image data of the 1920 × 1080p pixel format.
- methods (1) to (3) are the most desirable because the original signal can be transmitted without quality degradation; however, since they require twice the current transmission band, they are possible only when there is a margin in the transmission band.
- the methods (4) to (6) are methods for transmitting 3D image data in the current 1920 ⁇ 1080p transmission band.
- method (1) is a method in which pixel data of left eye image data and pixel data of right eye image data are sequentially switched and transmitted for each TMDS clock.
- the frequency of the pixel clock may be the same as the conventional one, but a switching circuit for each pixel is required.
- the number of pixels in the horizontal direction is 3840 pixels, but it may be composed of 2 lines of 1920 pixels.
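Method (1) can be sketched as a per-pixel switching operation. Following the packing example given for FIG. 14 later in this description, the right-eye pixel is placed in the first half of each pixel period, though the text notes the order may be reversed.

```python
# Sketch of method (1): left- and right-eye pixel data are switched
# every TMDS clock. The right-eye-first order follows the FIG. 14
# packing example; the halves may also be swapped.

def interleave_per_clock(left_line, right_line):
    """Alternate R and L pixel data pixel by pixel, as the per-pixel
    switching circuit described above would."""
    out = []
    for l_px, r_px in zip(left_line, right_line):
        out.extend([r_px, l_px])
    return out
```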
- System (2) is a system in which one line of left-eye image data and one line of right-eye image data are alternately transmitted as shown in FIG. 11B, and the lines are switched by a line memory.
- a new video format definition of 1920 ⁇ 2160 is required as the video format.
- the method (3) is a method in which left-eye image data and right-eye image data are sequentially switched for each field and transmitted as shown in FIG. 11 (c).
- a field memory is required for the switching process, but signal processing at the source device is the simplest.
- Method (4) is a method of alternately transmitting one line of left eye image data and one line of right eye image data, as shown in FIG. In this case, the left eye image data and the right eye image data are each thinned by half.
- this method corresponds to the video signal itself of the stereoscopic image display method referred to as the "phase difference plate method" described above, and is the simplest method in terms of signal processing on the display unit of the sink device; however, the vertical resolution is half that of the original signal.
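Method (4) can be sketched as follows, with frames modeled as lists of lines. Which eye supplies the even lines is a choice made for this sketch; the text only specifies that the two eyes' lines alternate, each thinned to half.

```python
# Sketch of method (4): one line of left-eye data and one line of
# right-eye data alternate; each eye keeps only every other line,
# halving the vertical resolution. The even/odd assignment of the two
# eyes is an illustrative choice.

def line_interleave(left_frame, right_frame):
    """Alternate left-eye lines (even indices) and right-eye lines
    (odd indices) into one frame of the same line count."""
    return [left_frame[i] if i % 2 == 0 else right_frame[i]
            for i in range(len(left_frame))]
```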
- method (5) is a method in which the data of each line of the left eye image data is transmitted in the first half in the vertical direction, and the data of each line of the right eye image data is transmitted in the second half in the vertical direction. In this case, as in method (4) described above, the lines of the left eye image data and the right eye image data are thinned out to 1/2, so the vertical resolution is halved with respect to the original signal, but line-by-line switching is not necessary.
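Method (5) can be sketched the same way. Keeping the even lines of each eye is a choice made for this sketch; the text only specifies that each eye's lines are thinned to half, with the left eye in the first (upper) half and the right eye in the second half.

```python
# Sketch of method (5): each eye is thinned to every other line; the
# surviving left-eye lines fill the top half of the frame vertically
# and the right-eye lines fill the bottom half.

def top_and_bottom(left_frame, right_frame):
    top = left_frame[::2]       # half of the left-eye lines
    bottom = right_frame[::2]   # half of the right-eye lines
    return top + bottom
```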
- Method (6) is a “Side By Side” method that is currently used in experimental broadcasting.
- in this method, the pixel data of the left eye image data is transmitted in the first half in the horizontal direction, and the pixel data of the right eye image data is transmitted in the latter half in the horizontal direction.
- in this case, the pixel data of the left eye image data and the right eye image data are each decimated to 1/2 in the horizontal direction, so that, unlike methods (4) and (5) above, the horizontal resolution is halved.
- the content can be determined even by a sink device that does not support 3D image data, and is highly compatible with the display unit of a conventional sink device.
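Method (6) can be sketched as horizontal decimation followed by concatenation. Keeping the even-indexed pixels of each line is a choice made for this sketch; the text only specifies that each eye is decimated to half its horizontal pixels.

```python
# Sketch of method (6) "Side By Side": each eye's line is decimated to
# half its horizontal pixels; the left-eye half fills the first half of
# the transmitted line and the right-eye half fills the latter half.

def side_by_side(left_line, right_line):
    return left_line[::2] + right_line[::2]
```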
- when any of the methods (1) to (6) is selected, the 3D signal processing unit 229 of the above-described disc player 210 performs processing for generating, from the 3D image data of the original signal (the left eye (L) image data and the right eye (R) image data), composite data corresponding to the selected transmission method (see FIGS. 11A to 11C and FIGS. 12A to 12C). In that case, the 3D signal processing unit 254 of the above-described television receiver 250 performs processing for separating and extracting the left eye (L) image data and the right eye (R) image data from the composite data.
- FIG. 13 shows an example of TMDS transmission data of method (1).
- FIG. 14 shows an example of the packing format when transmitting 3D image data of the method (1) through the three TMDS channels # 0, # 1, and # 2 of HDMI.
- here, two types of image data transmission formats, RGB 4:4:4 and YCbCr 4:4:4, are shown. In this method (1), the TMDS clock is twice the pixel clock.
- In the RGB 4:4:4 system, 8-bit blue (B) data, 8-bit green (G) data, and 8-bit red (R) data constituting the pixel data of the right eye (R) image data are arranged in the first-half data area of each pixel in TMDS channels #0, #1, and #2, respectively.
- Also, 8-bit blue (B) data, 8-bit green (G) data, and 8-bit red (R) data constituting the pixel data of the left eye (L) image data are arranged in the second-half data area of each pixel in TMDS channels #0, #1, and #2, respectively.
- In the YCbCr 4:4:4 system, 8-bit blue color difference (Cb) data, 8-bit luminance (Y) data, and 8-bit red color difference (Cr) data constituting the pixel data of the right eye (R) image data are arranged in the first-half data area of each pixel in TMDS channels #0, #1, and #2, respectively. Further, in this YCbCr 4:4:4 system, 8-bit blue color difference (Cb) data, 8-bit luminance (Y) data, and 8-bit red color difference (Cr) data constituting the pixel data of the left eye (L) image data are arranged in the second-half data area of each pixel in TMDS channels #0, #1, and #2, respectively.
- the left eye image data may be arranged in the first half data area of each pixel and the right eye image data may be arranged in the second half data area of each pixel.
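The pixel ordering of method (1) described above can be sketched as follows. This is an illustrative model, not from the specification: a pixel is a (ch0, ch1, ch2) byte tuple, e.g. (B, G, R) for RGB 4:4:4, and the function name is hypothetical.

```python
# Illustrative sketch of the method (1) ("Pixel ALT") ordering: within
# each pixel period the right-eye (R) pixel occupies the first-half data
# area and the left-eye (L) pixel the second half, so the TMDS clock is
# twice the pixel clock.

def pack_method1(right_pixels, left_pixels):
    """Interleave R and L pixels clock by clock (R first, then L)."""
    assert len(right_pixels) == len(left_pixels)
    stream = []
    for r_px, l_px in zip(right_pixels, left_pixels):
        stream.append(r_px)  # first-half data area of the pixel
        stream.append(l_px)  # second-half data area of the pixel
    return stream

# Two pixels per eye produce four TMDS pixel slots: R0, L0, R1, L1.
print(pack_method1([(1, 2, 3), (4, 5, 6)], [(7, 8, 9), (10, 11, 12)]))
```

As noted above, swapping the two arguments models the variant in which the left eye data occupies the first half of each pixel.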
- FIG. 15 shows an example of TMDS transmission data of method (2).
- In an active video section of 1920 pixels × 2160 lines, effective pixel (active pixel) data for 1920 pixels × 2160 lines (composite data of the left eye (L) image data and the right eye (R) image data) is arranged.
- In the figure, L denotes left eye image data and R denotes right eye image data.
- FIG. 16 shows an example of the packing format when transmitting 3D image data of the method (2) using the three TMDS channels # 0, # 1, and # 2 of HDMI.
- As image data transmission methods, two types, RGB 4:4:4 and YCbCr 4:4:4, are shown.
- In the RGB 4:4:4 system, 8-bit blue (B) data, 8-bit green (G) data, and 8-bit red (R) data constituting the pixel data of the left eye (L) image data are arranged in the data area of each pixel of the odd lines in TMDS channels #0, #1, and #2, respectively.
- Also, 8-bit blue (B) data, 8-bit green (G) data, and 8-bit red (R) data constituting the pixel data of the right eye (R) image data are arranged in the data area of each pixel of the even lines in TMDS channels #0, #1, and #2, respectively.
- In the YCbCr 4:4:4 system, 8-bit blue color difference (Cb) data, 8-bit luminance (Y) data, and 8-bit red color difference (Cr) data constituting the pixel data of the left eye (L) image data are arranged in the data area of each pixel of the odd lines in TMDS channels #0, #1, and #2, respectively.
- Also, 8-bit blue color difference (Cb) data, 8-bit luminance (Y) data, and 8-bit red color difference (Cr) data constituting the pixel data of the right eye (R) image data are arranged in the data area of each pixel of the even lines in TMDS channels #0, #1, and #2, respectively.
- the right eye image data may be arranged on odd lines and the left eye image data may be arranged on even lines.
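The line arrangement of method (2) can be sketched as follows; the function and the line objects are illustrative assumptions, not from the specification.

```python
# Illustrative sketch of method (2): full-resolution left- and right-eye
# frames (e.g. 1080 lines each) are combined into one double-height
# frame (e.g. 2160 lines), left-eye data on odd lines and right-eye
# data on even lines, without any decimation.

def pack_method2(left_lines, right_lines):
    """Alternate L (odd lines) and R (even lines) of the composite frame."""
    assert len(left_lines) == len(right_lines)
    composite = []
    for l_line, r_line in zip(left_lines, right_lines):
        composite.append(l_line)   # odd line of the composite frame
        composite.append(r_line)   # even line of the composite frame
    return composite

# Two lines per eye yield a four-line composite; vertical resolution is kept.
print(pack_method2(["L0", "L1"], ["R0", "R1"]))
```

Swapping the arguments models the variant mentioned above in which the right eye data occupies the odd lines.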
- FIG. 17 shows an example of TMDS transmission data of method (3).
- left-eye (L) image data of effective pixels (active pixels) for 1920 pixels ⁇ 1080 lines is arranged in an active video section of 1920 pixels ⁇ 1080 lines in an odd field.
- right-eye (R) image data of effective pixels (active pixels) for 1920 pixels ⁇ 1080 lines is arranged in an active video section of 1920 pixels ⁇ 1080 lines in the even field.
- FIGS. 18 and 19 show examples of packing formats when 3D image data of the method (3) is transmitted through the three TMDS channels # 0, # 1, and # 2 of HDMI.
- As image data transmission methods, two types, RGB 4:4:4 and YCbCr 4:4:4, are shown.
- In the RGB 4:4:4 system, 8-bit blue (B) data, 8-bit green (G) data, and 8-bit red (R) data constituting the pixel data of the left eye (L) image data are arranged in the data area of each pixel in the odd field in TMDS channels #0, #1, and #2, respectively.
- Also, 8-bit blue (B) data, 8-bit green (G) data, and 8-bit red (R) data constituting the pixel data of the right eye (R) image data are arranged in the data area of each pixel in the even field in TMDS channels #0, #1, and #2, respectively.
- In the YCbCr 4:4:4 system, 8-bit blue color difference (Cb) data, 8-bit luminance (Y) data, and 8-bit red color difference (Cr) data constituting the pixel data of the left eye (L) image data are arranged in the data area of each pixel in the odd field in TMDS channels #0, #1, and #2, respectively. Also, in this YCbCr 4:4:4 system, 8-bit blue color difference (Cb) data, 8-bit luminance (Y) data, and 8-bit red color difference (Cr) data constituting the pixel data of the right eye (R) image data are arranged in the data area of each pixel in the even field in TMDS channels #0, #1, and #2, respectively.
- In method (3), the right eye image data may be arranged in the data area of each pixel in the odd field, and the left eye image data in the data area of each pixel in the even field.
- FIG. 20 shows an example of TMDS transmission data of method (4).
- In an active video section of 1920 pixels × 1080 lines, effective pixel (active pixel) data for 1920 pixels × 1080 lines (composite data of the left eye (L) image data and the right eye (R) image data) is arranged.
- In the figure, L denotes left eye image data and R denotes right eye image data.
- As combinations of the lines extracted from the left eye image data and the right eye image data, there are four types: both from odd lines; both from even lines; the left eye image data from odd lines and the right eye image data from even lines; and the left eye image data from even lines and the right eye image data from odd lines.
- FIG. 20 shows a case where the left eye image data is an odd line and the right eye image data is an even line.
- FIG. 21 shows an example of packing format when transmitting 3D image data of the method (4) using the three TMDS channels # 0, # 1, and # 2 of HDMI.
- As image data transmission methods, three types, RGB 4:4:4, YCbCr 4:4:4, and YCbCr 4:2:2, are shown.
- In the RGB 4:4:4 system, 8-bit blue (B) data, 8-bit green (G) data, and 8-bit red (R) data constituting the pixel data of the left eye (L) image data are arranged in the data area of each pixel of the odd lines in TMDS channels #0, #1, and #2, respectively.
- Also, 8-bit blue (B) data, 8-bit green (G) data, and 8-bit red (R) data constituting the pixel data of the right eye (R) image data are arranged in the data area of each pixel of the even lines in TMDS channels #0, #1, and #2, respectively.
- In the YCbCr 4:4:4 system, 8-bit blue color difference (Cb) data, 8-bit luminance (Y) data, and 8-bit red color difference (Cr) data constituting the pixel data of the left eye (L) image data are arranged in the data area of each pixel of the odd lines in TMDS channels #0, #1, and #2, respectively.
- Also, 8-bit blue color difference (Cb) data, 8-bit luminance (Y) data, and 8-bit red color difference (Cr) data constituting the pixel data of the right eye (R) image data are arranged in the data area of each pixel of the even lines in TMDS channels #0, #1, and #2, respectively.
- In the YCbCr 4:2:2 system, the data of bit 0 to bit 3 of the luminance (Y) data constituting the pixel data of the left eye (L) image data is arranged in the data area of each pixel of the odd lines in TMDS channel #0, and the data of bit 0 to bit 3 of the blue color difference (Cb) data and the data of bit 0 to bit 3 of the red color difference (Cr) data are arranged alternately for each pixel.
- Also, the data of bit 4 to bit 11 of the luminance (Y) data of the left eye (L) image data is arranged in the data area of each pixel of the odd lines in TMDS channel #1.
- Further, in the data area of each pixel of the odd lines in TMDS channel #2, the data of bit 4 to bit 11 of the blue color difference (Cb) data of the left eye (L) image data and the data of bit 4 to bit 11 of the red color difference (Cr) data are arranged alternately for each pixel.
- Also, the data of bit 4 to bit 11 of the luminance (Y) data of the right eye (R) image data is arranged in the data area of each pixel of the even lines in TMDS channel #1.
- Further, in the data area of each pixel of the even lines in TMDS channel #2, the data of bit 4 to bit 11 of the blue color difference (Cb) data of the right eye (R) image data and the data of bit 4 to bit 11 of the red color difference (Cr) data are arranged alternately for each pixel.
- the right eye image data may be arranged on odd lines and the left eye image data may be arranged on even lines.
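The line decimation and interleaving of method (4) can be sketched as follows. The parity chosen here (left eye keeps odd lines, right eye keeps even lines, as in FIG. 20) is one of the four combinations described above; all names are illustrative.

```python
# Illustrative sketch of method (4) ("Line Seq."): each eye is thinned
# to 1/2 vertically and the remaining lines are interleaved, so the
# composite has the same line count as one original frame, at half the
# vertical resolution per eye.

def pack_method4(left_lines, right_lines):
    """Interleave odd L lines with even R lines (half vertical resolution)."""
    assert len(left_lines) == len(right_lines)
    composite = []
    for i in range(0, len(left_lines) - 1, 2):
        composite.append(left_lines[i])       # odd line kept from the left eye
        composite.append(right_lines[i + 1])  # even line kept from the right eye
    return composite

# Four lines per eye yield a four-line composite: L0, R1, L2, R3.
print(pack_method4(["L0", "L1", "L2", "L3"], ["R0", "R1", "R2", "R3"]))
```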
- FIG. 22 shows an example of TMDS transmission data of method (5).
- In an active video section of 1920 pixels × 1080 lines, effective pixel (active pixel) data for 1920 pixels × 1080 lines (composite data of the left eye (L) image data and the right eye (R) image data) is arranged.
- In the figure, L denotes left eye image data and R denotes right eye image data.
- As combinations of the lines extracted from the left eye image data and the right eye image data, there are four types: both from odd lines; both from even lines; the left eye image data from odd lines and the right eye image data from even lines; and the left eye image data from even lines and the right eye image data from odd lines.
- FIG. 22 shows a case where the left eye image data is an odd line and the right eye image data is an even line.
- FIG. 23 and FIG. 24 show an example of packing format when transmitting 3D image data of the method (5) through the three TMDS channels # 0, # 1, and # 2 of HDMI.
- As image data transmission methods, three types, RGB 4:4:4, YCbCr 4:4:4, and YCbCr 4:2:2, are shown.
- In the RGB 4:4:4 system, 8-bit blue (B) data, 8-bit green (G) data, and 8-bit red (R) data constituting the pixel data of the left eye (L) image data are arranged in the data area of each pixel in the first half in the vertical direction in TMDS channels #0, #1, and #2, respectively.
- Also, 8-bit blue (B) data, 8-bit green (G) data, and 8-bit red (R) data constituting the pixel data of the right eye (R) image data are arranged in the data area of each pixel in the second half in the vertical direction in TMDS channels #0, #1, and #2, respectively.
- In the YCbCr 4:4:4 system, 8-bit blue color difference (Cb) data, 8-bit luminance (Y) data, and 8-bit red color difference (Cr) data constituting the pixel data of the left eye (L) image data are arranged in the data area of each pixel in the first half in the vertical direction in TMDS channels #0, #1, and #2, respectively.
- Also, 8-bit blue color difference (Cb) data, 8-bit luminance (Y) data, and 8-bit red color difference (Cr) data constituting the pixel data of the right eye (R) image data are arranged in the data area of each pixel in the second half in the vertical direction in TMDS channels #0, #1, and #2, respectively.
- In the YCbCr 4:2:2 system, the data of bit 0 to bit 3 of the luminance (Y) data constituting the pixel data of the left eye (L) image data is arranged in the data area of each pixel in the first half in the vertical direction in TMDS channel #0, and the data of bit 0 to bit 3 of the blue color difference (Cb) data and the data of bit 0 to bit 3 of the red color difference (Cr) data are arranged alternately for each pixel.
- Also, the data of bit 4 to bit 11 of the luminance (Y) data of the left eye (L) image data is arranged in the data area of each pixel in the first half in the vertical direction in TMDS channel #1.
- Further, in the data area of each pixel in the first half in the vertical direction in TMDS channel #2, the data of bit 4 to bit 11 of the blue color difference (Cb) data of the left eye (L) image data and the data of bit 4 to bit 11 of the red color difference (Cr) data are arranged alternately for each pixel.
- Similarly, the data of bit 0 to bit 3 of the luminance (Y) data constituting the pixel data of the right eye (R) image data is arranged in the data area of each pixel in the second half in the vertical direction in TMDS channel #0, and the data of bit 0 to bit 3 of the blue color difference (Cb) data and the data of bit 0 to bit 3 of the red color difference (Cr) data are arranged alternately for each pixel.
- Also, the data of bit 4 to bit 11 of the luminance (Y) data of the right eye (R) image data is arranged in the data area of each pixel in the second half in the vertical direction in TMDS channel #1.
- Further, in the data area of each pixel in the second half in the vertical direction in TMDS channel #2, the data of bit 4 to bit 11 of the blue color difference (Cb) data of the right eye (R) image data and the data of bit 4 to bit 11 of the red color difference (Cr) data are arranged alternately for each pixel.
- the right eye image data may be arranged in the data area of each pixel in the vertical first half and the left eye image data may be arranged in the data area of each pixel in the vertical second half.
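The composition of method (5) can be sketched as follows. The line parity kept for each eye is one of the four combinations described above, and all names are illustrative assumptions.

```python
# Illustrative sketch of method (5) ("Top & Bottom"): each eye is
# thinned to 1/2 vertically; the surviving left-eye lines fill the
# first (top) half of the composite frame and the surviving right-eye
# lines fill the second (bottom) half.

def pack_method5(left_lines, right_lines):
    """Top half = decimated L lines, bottom half = decimated R lines."""
    assert len(left_lines) == len(right_lines)
    return left_lines[::2] + right_lines[1::2]

# Four lines per eye yield a four-line composite: L0, L2 | R1, R3.
print(pack_method5(["L0", "L1", "L2", "L3"], ["R0", "R1", "R2", "R3"]))
```

Reversing the concatenation models the variant in which the right eye data occupies the first half in the vertical direction.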
- FIG. 25 shows an example of TMDS transmission data of method (6).
- In an active video section of 1920 pixels × 1080 lines, effective pixel (active pixel) data for 1920 pixels × 1080 lines (composite data of the left eye (L) image data and the right eye (R) image data) is arranged.
- In the figure, L denotes left eye image data and R denotes right eye image data.
- As combinations of the pixels extracted from the left eye image data and the right eye image data, there are four types: both from odd pixels; both from even pixels; the left eye image data from odd pixels and the right eye image data from even pixels; and the left eye image data from even pixels and the right eye image data from odd pixels.
- FIG. 26 shows an example of packing format when transmitting 3D image data of the method (6) through the three TMDS channels # 0, # 1, and # 2 of HDMI.
- As image data transmission methods, three types, RGB 4:4:4, YCbCr 4:4:4, and YCbCr 4:2:2, are shown.
- In the RGB 4:4:4 system, 8-bit blue (B) data, 8-bit green (G) data, and 8-bit red (R) data constituting the pixel data of the left eye (L) image data are arranged in the data area of each pixel in the first half in the horizontal direction in TMDS channels #0, #1, and #2, respectively.
- Also, 8-bit blue (B) data, 8-bit green (G) data, and 8-bit red (R) data constituting the pixel data of the right eye (R) image data are arranged in the data area of each pixel in the second half in the horizontal direction in TMDS channels #0, #1, and #2, respectively.
- In the YCbCr 4:4:4 system, 8-bit blue color difference (Cb) data, 8-bit luminance (Y) data, and 8-bit red color difference (Cr) data constituting the pixel data of the left eye (L) image data are arranged in the data area of each pixel in the first half in the horizontal direction in TMDS channels #0, #1, and #2, respectively. Further, in this YCbCr 4:4:4 system, 8-bit blue color difference (Cb) data, 8-bit luminance (Y) data, and 8-bit red color difference (Cr) data constituting the pixel data of the right eye (R) image data are arranged in the data area of each pixel in the second half in the horizontal direction in TMDS channels #0, #1, and #2, respectively.
- In the YCbCr 4:2:2 system, the data of bit 0 to bit 3 of the luminance (Y) data constituting the pixel data of the left eye (L) image data is arranged in the data area of each pixel in the first half in the horizontal direction in TMDS channel #0, and the data of bit 0 to bit 3 of the blue color difference (Cb) data and the data of bit 0 to bit 3 of the red color difference (Cr) data are arranged alternately for each pixel.
- Also, the data of bit 4 to bit 11 of the luminance (Y) data of the left eye (L) image data is arranged in the data area of each pixel in the first half in the horizontal direction in TMDS channel #1.
- Further, in the data area of each pixel in the first half in the horizontal direction in TMDS channel #2, the data of bit 4 to bit 11 of the blue color difference (Cb) data of the left eye (L) image data and the data of bit 4 to bit 11 of the red color difference (Cr) data are arranged alternately for each pixel.
- Also, the data of bit 4 to bit 11 of the luminance (Y) data of the right eye (R) image data is arranged in the data area of each pixel in the second half in the horizontal direction in TMDS channel #1.
- Further, in the data area of each pixel in the second half in the horizontal direction in TMDS channel #2, the data of bit 4 to bit 11 of the blue color difference (Cb) data of the right eye (R) image data and the data of bit 4 to bit 11 of the red color difference (Cr) data are arranged alternately for each pixel.
- In method (6), the right eye image data may be arranged in the data area of each pixel in the first half in the horizontal direction, and the left eye image data in the data area of each pixel in the second half in the horizontal direction.
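The per-line composition of method (6) can be sketched as follows; the decimation parity and all names are illustrative assumptions.

```python
# Illustrative sketch of method (6) ("Side By Side"): each eye is
# thinned to 1/2 horizontally; on every line the surviving left-eye
# pixels fill the first (left) half and the surviving right-eye pixels
# fill the second (right) half, so each composite line keeps the
# original width.

def pack_method6(left_lines, right_lines):
    """Per line: first half = decimated L pixels, second half = R pixels."""
    assert len(left_lines) == len(right_lines)
    return [l[::2] + r[::2] for l, r in zip(left_lines, right_lines)]

# One four-pixel line per eye yields one four-pixel composite line.
print(pack_method6([["L0", "L1", "L2", "L3"]], [["R0", "R1", "R2", "R3"]]))
```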
- the 3D image data of the original signal is composed of two-dimensional (2D) image data (see FIG. 27A) and depth data corresponding to each pixel (see FIG. 27B).
- In the figure, 2D denotes the two-dimensional image data.
- The two-dimensional image data of the 4:4:4 system is converted into the 4:2:2 system, depth data is arranged in the vacated area, and the composite data of the two-dimensional image data and the depth data is transmitted through the TMDS channels of HDMI. That is, in this case, the pixel data constituting the two-dimensional image data and the depth data corresponding to that pixel data are arranged in the data area of each pixel.
- FIG. 28 shows an example of TMDS transmission data in the MPEG-C system.
- data of active pixels (active pixel) corresponding to 1920 pixels ⁇ 1080 lines is arranged in an active video section of 1920 pixels ⁇ 1080 lines.
- FIG. 29 shows an example of a packing format when transmitting 3D image data of the MPEG-C system through the three TMDS channels # 0, # 1, and # 2 of HDMI.
- FIG. 29(a) shows the packing format of YCbCr 4:4:4 two-dimensional image data for comparison.
- 8-bit blue color difference (Cb) data, 8-bit luminance (Y) data, and 8-bit red color difference (Cr) data constituting the pixel data of the two-dimensional image data are arranged in the data area of each pixel in TMDS channels #0, #1, and #2, respectively.
- FIG. 29B shows the packing format of the combined data of the two-dimensional image data and the depth data.
- In the data area of each pixel in TMDS channel #0, 8-bit blue color difference (Cb) data and 8-bit red color difference (Cr) data are arranged alternately for each pixel.
- 8-bit luminance (Y) data is arranged in the data area of each pixel in the TMDS channel # 1.
- 8-bit depth data (D) is arranged in the data area of each pixel (pixel) in TMDS channel # 2.
- Hereinafter, the method shown in FIG. 29B is referred to as the “YCbCrD4:2:2:4” method.
- In this method, the pixel data of the color difference signals Cb and Cr is thinned out to 1/2, whereas the depth data is not thinned out. This is because the depth data is 8-bit data associated with the luminance (Y) data, and it must maintain the same quality as the luminance (Y) data without being thinned out.
- The 3D signal processing unit (encoding unit) 229 of the above-described disc player 210 generates, from the 3D image data (two-dimensional image data and depth data) of the original signal, composite data corresponding to the “YCbCrD4:2:2:4” method described above.
- The 3D signal processing unit (decoding unit) 254 of the television receiver 250 described above separates and extracts the two-dimensional image data and the depth data from the composite data of the “YCbCrD4:2:2:4” method. Then, as shown in FIG. 30B, the 3D signal processing unit 254 performs interpolation processing on the color difference data Cb and Cr to convert the two-dimensional image data into YCbCr 4:4:4 data. Further, the 3D signal processing unit 254 performs calculations using the two-dimensional image data and the depth data, and generates left eye (L) image data and right eye (R) image data.
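The receiver-side separation described above can be sketched as follows. Chroma is restored here by simple duplication rather than the true interpolation performed by the 3D signal processing unit 254, and all function and variable names are illustrative.

```python
# Illustrative sketch of separating "YCbCrD4:2:2:4" composite data:
# channel #0 carries Cb on even pixels and Cr on odd pixels, channel #1
# carries luminance Y, and channel #2 carries depth data D, which is
# never decimated.

def split_ycbcrd4224(pixels):
    """pixels: list of (ch0, y, d) tuples for an even number of pixels.
    Returns (ycbcr444, depth): YCbCr 4:4:4 pixels and the depth plane."""
    assert len(pixels) % 2 == 0
    ycbcr, depth = [], []
    for i in range(0, len(pixels), 2):
        cb = pixels[i][0]      # even pixel of ch #0: blue color difference
        cr = pixels[i + 1][0]  # odd pixel of ch #0: red color difference
        for j in (i, i + 1):
            ycbcr.append((pixels[j][1], cb, cr))  # (Y, Cb, Cr) per pixel
            depth.append(pixels[j][2])            # D keeps full resolution
    return ycbcr, depth

print(split_ycbcrd4224([(100, 16, 5), (150, 17, 6)]))
```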
- The CPU 214 of the disc player 210 recognizes the 3D image data transmission methods that the television receiver 250 can support, based on the E-EDID read from the HDMI receiving unit 252 of the television receiver 250.
- FIG. 31 shows an example of the data structure of E-EDID.
- This E-EDID is composed of a basic block and an extended block.
- At the head of the basic block, data defined by the E-EDID1.3 standard, represented by “E-EDID1.3 Basic Structure”, is placed, followed by timing information for maintaining compatibility with the conventional EDID, represented by “Preferred timing”, and timing information, different from “Preferred timing”, for maintaining compatibility with the conventional EDID, represented by “2nd timing”.
- In the basic block, following “2nd timing”, information indicating the name of the display device, represented by “Monitor NAME”, and information indicating the number of displayable pixels for the aspect ratios 4:3 and 16:9, represented by “Monitor Range Limits”, are arranged in order.
- “Short Video Descriptor” is arranged at the head of the extension block. This is information indicating the displayable image size (resolution), the frame rate, and whether it is interlaced or progressive. Subsequently, “Short Audio Descriptor” is arranged. This is information such as a reproducible audio codec system, a sampling frequency, a cutoff band, and the number of codec bits. Subsequently, information on left and right speakers represented by “Speaker Allocation” is arranged.
- In the extension block, following “Speaker Allocation”, there are arranged data uniquely defined for each manufacturer, represented by “Vender Specific”, timing information for maintaining compatibility with the conventional EDID, represented by “3rd timing”, and timing information for maintaining compatibility with the conventional EDID, represented by “4th timing”.
- FIG. 32 shows an example of the data structure of the Vender Specific area.
- a 0th block to an Nth block which are 1-byte blocks are provided.
- Following the already defined 0th to 7th bytes, the 8th to 11th bytes form a data area for the 3D image/audio information to be stored by the sink device.
- In the first to third bytes, information indicating the number “0x000C03” registered for HDMI (R), represented by “24bit IEEE Registration Identifier (0x000C03) LSB first”, is arranged. Further, in the fourth and fifth bytes, information indicating the 24-bit physical address of the sink device, represented by “A”, “B”, “C”, and “D”, is arranged.
- In the sixth byte, flags indicating functions supported by the sink device, represented by “Supports-AI”, “DC-48bit”, “DC-36bit”, and “DC-30bit”, are arranged.
- The eighth byte indicates support in the RGB 4:4:4 system, the ninth byte indicates support in the YCbCr 4:4:4 system, and the tenth byte indicates support in the YCbCr 4:2:2 system.
- In the 7th to 1st bits of each of the 8th to 10th bytes, the sink device's support for the video formats of the 3D image (the above-described methods (1) to (6) and the MPEG-C method) is written, for the RGB 4:4:4 system, the YCbCr 4:4:4 system, and the YCbCr 4:2:2 system, respectively.
- The seventh bit indicates whether or not it supports a method in which the pixel data of the left eye image data and the pixel data of the right eye image data are transmitted while being switched sequentially for each TMDS clock (method (1): “Pixel ALT”).
- the sixth bit indicates whether or not it is compatible with a method of transmitting one line of left eye image data and one line of right eye image data alternately (method (2): “Simul”).
- The fifth bit indicates whether or not it supports a method in which the left eye image data and the right eye image data are transmitted while being switched sequentially for each field (method (3): “Field Seq.”).
- The fourth bit indicates whether or not it supports a method in which the left eye image data and the right eye image data are each thinned by 1/2 in the vertical direction and one line of the left eye image data and one line of the right eye image data are transmitted alternately (method (4): “Line Seq.”).
- The third bit indicates whether or not it supports a method in which the left eye image data and the right eye image data are each thinned by 1/2 in the vertical direction, and the data of each line of the left eye image data is transmitted in the first half and the data of each line of the right eye image data in the second half (method (5): “Top & Bottom”).
- The second bit indicates whether or not it supports a method in which the left eye image data and the right eye image data are each thinned by 1/2 in the horizontal direction, and the pixel data of the left eye image data is transmitted in the first half and the pixel data of the right eye image data in the second half (method (6): “Side by Side”).
- the first bit indicates whether or not it supports a transmission method (MPEG-C method) using a two-dimensional image (main image) and depth data defined by MPEG-C. Subsequent bits can be assigned when other methods are proposed.
- the 7th to 5th bits indicate the 3D audio transmission format supported by the sink device.
- the seventh bit indicates the correspondence to the method A
- the sixth bit indicates the correspondence to the method B
- the fifth bit indicates the correspondence to the method C. Subsequent bits can be assigned when other methods are proposed. A description of the systems A to C is omitted.
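The 3D-support flags described above can be decoded with a sketch like the following. The bit assignments (bit 7 = method (1) “Pixel ALT” through bit 1 = MPEG-C) and the byte offsets (8th to 10th bytes of the Vender Specific area) follow the text; the dictionary and function names are hypothetical.

```python
# Illustrative decoder for the 3D video support flags of the Vender
# Specific area: one byte each for RGB 4:4:4, YCbCr 4:4:4 and
# YCbCr 4:2:2, with one bit per transmission method.

METHOD_BITS = {7: "Pixel ALT", 6: "Simul", 5: "Field Seq.",
               4: "Line Seq.", 3: "Top & Bottom", 2: "Side by Side",
               1: "MPEG-C"}
FORMATS = ("RGB 4:4:4", "YCbCr 4:4:4", "YCbCr 4:2:2")

def supported_3d_methods(vsdb):
    """vsdb: the Vender Specific area as bytes (at least 11 bytes).
    Returns, per pixel format, the 3D methods the sink declares."""
    return {fmt: [name for bit, name in METHOD_BITS.items()
                  if byte & (1 << bit)]
            for fmt, byte in zip(FORMATS, vsdb[8:11])}

# Example: sink supports method (1) in RGB 4:4:4 and MPEG-C in YCbCr 4:2:2.
vsdb = bytes(8) + bytes([0x80, 0x00, 0x02])
print(supported_3d_methods(vsdb))
```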
- The CPU 214 of the disc player 210 confirms the connection of the television receiver (sink device) 250 through the HPD line, and then reads the E-EDID, and hence the 3D image/audio information, from the television receiver 250 using the DDC, thereby recognizing the 3D image/audio data transmission methods that the television receiver (sink device) can support.
- When transmitting 3D image/audio data (3D image data and 3D audio data) to the television receiver (sink device) 250, the disc player (source device) 210 selects and transmits one of the 3D image/audio data transmission methods that the television receiver 250 can support, recognized as described above.
- the disc player (source device) 210 transmits information regarding the currently transmitted image / audio format to the television receiver (sink device) 250.
- the disc player 210 transmits the information to the television receiver 250 by inserting the information in the blanking period of the 3D image data (video signal) to be transmitted to the television receiver 250.
- The disc player 210 inserts the information about the currently transmitted image/audio format into the blanking period of the 3D image data using, for example, an HDMI AVI (Auxiliary Video Information) InfoFrame packet, an Audio InfoFrame packet, and the like.
- FIG. 33 shows an example of the data structure of an AVI InfoFrame packet.
- Auxiliary information about an image can be transmitted from the source device to the sink device by the AVI InfoFrame packet.
- “Packet Type” indicating the type of data packet is defined in the 0th byte.
- the “Packet Type” of the AVI InfoFrame packet is “0x82”.
- the version information of the packet data definition is described in the first byte.
- the AVI InfoFrame packet is currently “0x02”, but when the 3D image data transmission method is defined in the present invention, it becomes “0x03” as shown.
- In the second byte, information indicating the packet length is described. The AVI InfoFrame is currently “0x0D”, but when the 3D image output format information is defined in the 17th byte as in the present invention, it becomes “0x0E” as shown in the figure.
- The contents of each byte of the AVI InfoFrame are defined in CEA-861-D Section 6-4, and a description thereof is omitted here.
- The 17th byte specifies one of the 3D image data transmission methods selected by the source device (in this embodiment, the disc player 210).
- the seventh bit indicates a method of transmitting the pixel data of the left eye image data and the pixel data of the right eye image data by sequentially switching every TMDS clock (method (1): “Pixel ALT”).
- the sixth bit indicates a method of transmitting one line of left eye image data and one line of right eye image data alternately (method (2): “Simul”).
- The fifth bit indicates a method of transmitting the left eye image data and the right eye image data while switching them sequentially for each field (method (3): “Field Seq.”).
- The fourth bit indicates a method in which the left eye image data and the right eye image data are each thinned by 1/2 in the vertical direction and one line of the left eye image data and one line of the right eye image data are transmitted alternately (method (4): “Line Seq.”).
- The third bit indicates a method in which the left eye image data and the right eye image data are each thinned by 1/2 in the vertical direction, and the data of each line of the left eye image data is transmitted in the first half and the data of each line of the right eye image data in the second half (method (5): “Top & Bottom”).
- The second bit indicates a method in which the left eye image data and the right eye image data are each thinned by 1/2 in the horizontal direction, and the pixel data of the left eye image data is transmitted in the first half and the pixel data of the right eye image data in the second half (method (6): “Side by Side”).
- the first bit indicates selection of a transmission method (MPEG-C method) using a two-dimensional image and depth data defined by MPEG-C.
- The sink device (in this embodiment, the television receiver 250) can determine that 3D image data is being transmitted when any of the seventh to first bits is set.
- In some of the above methods a video format of 3840 × 1080 is used, and in others a video format of 1920 × 2160 is used. Therefore, the video format specified by the VIC6 to VIC0 bits of the seventh byte of the AVI InfoFrame is selected from the video formats shown in the figure.
- The RGB 4:4:4, YCbCr 4:4:4, and YCbCr 4:2:2 systems are designated by the sixth and fifth bits of the fourth byte of the AVI InfoFrame.
- Deep Color information must be transmitted in a packet separate from the AVI InfoFrame. Therefore, as shown in FIG. 35, in the case of methods (1) to (3), 48 bits (0x7) is specified by the CD3 to CD0 bits of the General Control Protocol packet.
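Composing the 17th byte described above can be sketched as follows: exactly one of bits 7 to 1 is set for the selected method, and 0x00 signals that no 3D image data is transmitted. The mapping follows the bit descriptions above; the names are hypothetical.

```python
# Illustrative helper for the 3D output format information in the 17th
# byte of the AVI InfoFrame packet.

TX_METHOD_BIT = {"(1) Pixel ALT": 7, "(2) Simul": 6, "(3) Field Seq.": 5,
                 "(4) Line Seq.": 4, "(5) Top & Bottom": 3,
                 "(6) Side by Side": 2, "MPEG-C": 1}

def avi_infoframe_byte17(selected=None):
    """Return the 17th-byte value for the chosen method (None = no 3D)."""
    if selected is None:
        return 0x00  # non-transmission of 3D image data: all bits cleared
    return 1 << TX_METHOD_BIT[selected]

print(hex(avi_infoframe_byte17("(1) Pixel ALT")))  # bit 7 set
print(hex(avi_infoframe_byte17()))                 # no 3D transmission
```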
- AudioInfoFrame packet is placed in the data island section described above.
- FIG. 36 shows the data structure of an AudioInfoFrame packet.
- Auxiliary information about audio can be transmitted from the source device to the sink device by the AudioInfoFrame packet.
- “Packet Type” indicating the type of data packet is defined in the 0th byte; for the AudioInfoFrame packet used in the present invention, it is “0x84”.
- the version information of the packet data definition is described in the first byte.
- The AudioInfoFrame packet is currently “0x01”, but when the 3D audio data transmission method is defined as in the present invention, it becomes “0x02” as shown in the figure.
- In the second byte, information indicating the packet length is described. The AudioInfoFrame is currently “0x0A”.
- The 3D audio output format information in the present invention is defined in the ninth byte. In the 7th to 5th bits, one transmission method selected from among the 3D audio data transmission methods supported by the sink device is designated. As an example, the seventh bit indicates transmission in method A, the sixth bit indicates transmission in method B, and the fifth bit indicates transmission in method C.
- the disc player 210 starts processing in step ST1, and then proceeds to processing in step ST2.
- step ST2 the disc player 210 determines whether or not the HPD signal is at the high level “H”.
- When the HPD signal is not at the high level “H”, the television receiver (sink device) 250 is not connected to the disc player 210, and the disc player 210 immediately proceeds to step ST8 and ends the process.
- the disc player 210 When the HPD signal is at the high level “H”, the disc player 210 reads the E-EDID (see FIGS. 31 and 32) of the television receiver (sink device) 250 in step ST3. In step ST4, the disc player 210 determines whether there is 3D image / audio information.
- When there is no 3D image/audio information, the disc player 210 sets data indicating non-transmission of 3D image/audio in the AVI InfoFrame packet and the Audio InfoFrame packet in step ST9, and then proceeds to step ST8 and ends the process.
- Here, setting the data indicating non-transmission of 3D image/audio means setting all of the 7th to 1st bits of the 17th byte of the AVI InfoFrame packet (see FIG. 33) to “0”, and setting all of the 7th to 5th bits of the ninth byte of the AudioInfoFrame packet (see FIG. 36) to “0”.
- When there is 3D image/audio information, the disc player 210 determines the transmission method of the 3D image/audio data in step ST5. In step ST6, the disc player 210 determines whether or not transmission of the 3D image/audio data is started. When the transmission of 3D image/audio is not started, the disc player 210 sets data indicating non-transmission of 3D image/audio in the AVI InfoFrame packet and the Audio InfoFrame packet in step ST9, and then proceeds to step ST8 and ends the process.
- step ST6 When transmission of 3D image / audio data is started in step ST6, the disc player 210 sets data indicating the transmission method of 3D image / audio data in the AVI InfoFrame packet and Audio InfoFrame packet in step ST7, and thereafter Then, the process proceeds to step ST8, and the process ends.
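The flow of steps ST1 to ST9 above can be sketched as follows; a minimal outline with assumed names (`hpd_high`, `choose_method`, and the returned strings are illustrative, not from the specification):

```python
def source_device_flow(hpd_high, edid_has_3d_info, start_3d, choose_method):
    """Return the InfoFrame setting the FIG. 37 flow would make,
    or None when the flow exits immediately at ST8."""
    if not hpd_high:                 # ST2: sink device not connected
        return None                  # -> ST8, end
    if not edid_has_3d_info:         # ST4: no 3D image/audio information
        return "non-transmission"    # ST9 -> ST8, end
    method = choose_method()         # ST5: determine the transmission method
    if not start_3d:                 # ST6: 3D transmission not started
        return "non-transmission"    # ST9 -> ST8, end
    return method                    # ST7: signal the chosen method -> ST8
```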
- Next, the 3D image data transmission method determination processing (the processing in step ST5 of FIG. 37) performed by the disc player (source device) 210 will be described with reference to the flowchart in FIG. 38.
- the disc player 210 starts processing in step ST11, and then proceeds to processing in step ST12.
- In step ST12, the disc player 210 determines whether any of the seventh to fifth bits of the eighth to tenth bytes of the Vendor Specific area is set.
- The transmission methods corresponding to these bits transmit the left eye image data and right eye image data at the highest image quality, without degradation, and are the methods most easily processed by the sink device. Therefore, when any of the seventh to fifth bits is set, the disc player 210 selects, in step ST13, one of the transmission methods (1) to (3) set by these bits, and then ends the process in step ST14.
- In step ST15, the disc player 210 determines whether the fourth or third bit of the eighth to tenth bytes of the Vendor Specific area is set.
- The transmission methods corresponding to these bits sequentially transmit, line by line, independent left eye image data and right eye image data of the next highest image quality; the sink device must process the data in units of two frames, so a memory is required.
- When the fourth or third bit is set, the disc player 210 selects, in step ST16, the transmission method of either method (4) or (5) set by these bits, and then ends the process in step ST14.
- In step ST17, the disc player 210 determines whether or not the second bit of the eighth to tenth bytes of the Vendor Specific area is set.
- The transmission method corresponding to this bit transmits independent left eye image data and right eye image data of the next highest image quality within the same frame, with the horizontal resolution halved, by the method called "Side By Side"; the sink device must expand the horizontal resolution by a factor of two.
- When the second bit is set, the disc player 210 selects the transmission method of method (6) set by this bit in step ST18, and then ends the process in step ST14.
- In step ST19, the disc player 210 determines whether or not the first bit of the eighth to tenth bytes of the Vendor Specific area is set.
- The transmission method corresponding to this bit is the MPEG-C method, in which two-dimensional image data (image data common to the left eye and right eye) and depth data for the left eye and right eye are transmitted separately. In this method, the sink device must generate the left eye image data and right eye image data from the two-dimensional image data and the depth data, so its processing becomes complicated.
- When the first bit is set, the disc player 210 selects the MPEG-C transmission method set by this bit in step ST20, and then ends the process in step ST14.
- When the first bit is not set either, the disc player 210 determines in step ST21 that there is no method capable of transmitting 3D image data, sets 3D non-selection, and then ends the process in step ST14.
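The priority order of steps ST12 to ST21 can be summarized in code; a sketch under assumed names (`vs_bits` models which bits of the eighth to tenth bytes of the Vendor Specific area are set):

```python
def select_3d_method(vs_bits):
    """vs_bits: set of bit positions (1..7) that are set in the sink's
    Vendor Specific area. Returns the method group FIG. 38 would pick."""
    if vs_bits & {7, 6, 5}:    # ST12/ST13: methods (1)-(3), no degradation
        return "methods (1)-(3)"
    if vs_bits & {4, 3}:       # ST15/ST16: methods (4)-(5), line sequential
        return "methods (4)-(5)"
    if 2 in vs_bits:           # ST17/ST18: method (6), "Side By Side"
        return "method (6)"
    if 1 in vs_bits:           # ST19/ST20: MPEG-C (2D image + depth)
        return "MPEG-C"
    return None                # ST21: 3D non-selection
```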
- As described above, the disc player 210 receives the transmission method information of the 3D image/audio data that the television receiver 250 can support, and transmits 3D image/audio data using a transmission method selected on the basis of that information.
- the disc player 210 transmits the transmission method information of the 3D image / audio data to be transmitted to the television receiver 250 using the AVI InfoFrame packet and the Audio InfoFrame packet. Therefore, it is possible to satisfactorily transmit 3D image / audio data between the disc player 210 and the television receiver 250.
- In the above-described embodiment, the disc player (source device) 210 transmits the transmission method information of the 3D image/audio data to the television receiver 250 by inserting it, with the AVI InfoFrame packet and the Audio InfoFrame packet, into the blanking period of the video signal.
- However, the disc player (source device) 210 may instead transmit the transmission method information of the 3D image/audio data to be transmitted to the television receiver 250 via the CEC line 84, which is a control data line of the HDMI cable 350. Also, for example, the disc player 210 may transmit the transmission method information of the 3D image/audio data to be transmitted to the television receiver 250 via a bidirectional communication path composed of the reserved line and the HPD line of the HDMI cable 350.
- In the above-described embodiment, the E-EDID of the television receiver 250 includes the transmission method information of the 3D image/audio data that the television receiver 250 can support, and the disc player 210 acquires that information by reading the E-EDID via the DDC 83 of the HDMI cable 350.
- However, the disc player 210 may instead receive the transmission method information of the 3D image/audio data supported by the television receiver 250 from the television receiver 250 via the CEC line 84, which is a control data line of the HDMI cable 350, or via a bidirectional communication path composed of the reserved line and the HPD line of the HDMI cable 350.
- the above-described embodiment shows an example using an HDMI transmission path.
- Baseband digital interfaces include, in addition to HDMI, DVI (Digital Visual Interface), the DP (DisplayPort) interface, wireless interfaces using the 60 GHz millimeter wave band, and so on.
- the present invention can be similarly applied to transmission of 3D image / audio data using these digital interfaces.
- In DVI, as in HDMI described above, the 3D image/audio data transmission methods supported by the receiving device are stored in an area called the E-EDID held by the receiving device. Therefore, in the case of DVI, as in the case of HDMI, the transmitting device can read the above-mentioned 3D image/audio information from the E-EDID of the receiving device using the DDC (Display Data Channel) and determine the transmission method.
- FIG. 39 shows a configuration example of a DP system using a DP interface.
- a display port transmitting device and a display port receiving device are connected by a DP interface.
- The display port transmitting device includes a display port transmitter, and the display port receiving device includes a display port receiver.
- The main link consists of one, two, or four double-terminated differential signal pairs (pair lanes); it has no dedicated clock signal, and the clock is instead embedded in the 8B/10B-encoded data stream.
- In the DP interface, two transmission rates are defined: one has a bandwidth of 2.16 Gbps per pair lane, and the other 1.296 Gbps per pair lane. Accordingly, the logical upper-limit transmission bit rate of the DP interface transmission path is 2.16 Gbps per port, or 8.64 Gbps for the maximum of four ports.
- In the DP interface, the transmission speed and the pixel frequency are independent, so the pixel depth and resolution, the frame frequency, and the presence and amount of additional data in the transfer stream, such as audio data and DRM information, can be adjusted freely.
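The bandwidth figures above can be cross-checked: with 8B/10B coding, 8 payload bits are carried per 10 line bits, so the 2.16 Gbps and 1.296 Gbps payload rates correspond to 2.7 Gbaud and 1.62 Gbaud line rates (the line-rate values are an assumption drawn from the DP link definition, not stated in the text):

```python
def payload_gbps(line_rate_gbaud, lanes=1):
    """Payload bandwidth of an 8B/10B-coded link: 8 of every 10 line bits."""
    return line_rate_gbaud * 8 / 10 * lanes

assert abs(payload_gbps(2.7) - 2.16) < 1e-9           # high rate, one lane
assert abs(payload_gbps(2.7, lanes=4) - 8.64) < 1e-9  # maximum of four lanes
assert abs(payload_gbps(1.62) - 1.296) < 1e-9         # low rate, one lane
```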
- The DP interface also has a half-duplex bidirectional external (auxiliary) channel with a bandwidth of 1 Mbps and a maximum delay of 500 ms; this channel is used to exchange information on the functions of the transmitting device and the receiving device. In the present invention, the information relating to 3D image/audio is transmitted using this external (auxiliary) channel.
- In the case of DP, the transmission method information of 3D image/audio data supported by the receiving device is recorded in an EDID, as in HDMI, and hot plug detection is provided to detect that the connection destination has been changed.
- FIG. 40 shows a configuration example of a wireless system using a wireless interface.
- the transmission apparatus includes an image / audio data reproduction unit, a wireless transmission / reception unit, a storage unit, and a control unit that controls these units.
- the receiving device includes a video / audio output unit, a wireless transmission / reception unit, a storage unit, and a control unit that controls these units.
- the transmission device and the reception device are connected by a wireless transmission path.
- information on the transmission method of 3D image / audio data that can be handled by the receiving device is stored in the storage unit of the receiving device, and is transmitted to the transmitting device via a wireless transmission path. Also, 3D image / audio data transmission method information from the transmission device is multiplexed with video / audio / control signals and sent to the reception device via a wireless transmission path.
- Each of the transmission paths described above has a logical upper-limit transmission rate (10.2 Gbps for HDMI, 3.96 Gbps for DVI, 2.16 Gbps per port for DP and 8.64 Gbps with a maximum of four ports, and 1 Gbps or 10 Gbps for Gigabit Ethernet optical fiber).
- However, the upper-limit transmission rate may not be reached because of the transmission path length, the electrical characteristics of the transmission path, and so on, and the transmission rate required for the 3D image data that the transmission apparatus is to transmit may not be obtained. In that case, it is necessary to select the 3D image data transmission method appropriately.
- FIG. 41 shows a configuration example of a transmission system 600 that confirms the transmission rate of the transmission path and determines the 3D image data transmission method.
- the transmission system 600 has a configuration in which a transmission device 610 and a reception device 650 are connected via a transmission path 660.
- the transmission device 610 includes a control unit 611, a storage unit 612, a reproduction unit 613, a 3D signal processing unit 614, and a transmission unit 615.
- the control unit 611 controls the operation of each unit of the transmission device 610.
- the reproduction unit 613 reproduces 3D image data to be transmitted from a recording medium such as an optical disc, an HDD, or a semiconductor memory.
- The 3D signal processing unit 614 processes the 3D image data (for example, left eye image data and right eye image data) reproduced by the reproduction unit 613 into a state conforming to the transmission method specified by the control unit 611 (see FIGS. 11, 12, and 28).
- the transmission unit 615 transmits the 3D image data obtained by the 3D signal processing unit 614 to the reception device 650.
- the transmission unit 615 transmits transmission method information of 3D image data to be transmitted to the reception device 650 using, for example, an AVI InfoFrame packet or the like.
- the transmission unit 615 receives transmission method information and transmission rate information of 3D image data corresponding to the reception device 650 sent from the reception device 650, and supplies the received information to the control unit 611.
- the receiving device 650 includes a control unit 651, a storage unit 652, a transmission unit 653, a 3D signal processing unit 654, an output unit 655, and a detection unit 656.
- The control unit 651 controls the operation of each unit of the receiving device 650.
- the storage unit 652 stores 3D image data transmission method information supported by the receiving device 650.
- The transmission unit 653 receives the 3D image data sent from the transmission device 610. In addition, the transmission unit 653 receives the transmission method information of the 3D image data sent from the transmission device 610 and supplies it to the control unit 651. Further, the transmission unit 653 transmits the transmission method information of the 3D image data that the receiving device 650 can support, stored in the storage unit 652, to the transmission device 610.
- The transmission unit 653 also transmits the transmission rate information obtained by the control unit 651 to the transmission device 610. That is, the detection unit 656 determines the state of the transmission path 660 based on, for example, bit error information supplied from the transmission unit 653. The control unit 651 evaluates the quality of the transmission path 660 based on the determination result of the detection unit 656, and when the transmission rate of the transmission path 660 is lower than the rate required for the 3D image data transmission method notified from the transmission device 610, transmission rate information indicating that fact is sent to the transmission device 610 through the transmission unit 653.
- the 3D signal processing unit 654 processes the 3D image data received by the transmission unit 653 to generate left eye image data and right eye image data.
- the control unit 651 controls the operation of the 3D signal processing unit 654 based on 3D image data transmission method information sent from the transmission device 610.
- The display unit 655 displays a stereoscopic image based on the left eye image data and the right eye image data generated by the 3D signal processing unit 654.
- In the transmission device 610, the 3D image data reproduced by the reproduction unit 613 (left eye image data and right eye image data, or two-dimensional image data and depth data) is supplied to the 3D signal processing unit 614.
- The control unit 611 selects a predetermined transmission method from among the transmission methods supported by the receiving device 650, based on the transmission method information of the 3D image data received from the receiving device 650.
- In the 3D signal processing unit 614, the 3D image data reproduced by the reproduction unit 613 is processed so as to conform to the transmission method selected by the control unit 611.
- the 3D image data processed by the 3D signal processing unit 614 is transmitted to the reception device 650 by the transmission unit 615 via the transmission path 660.
- information on the transmission method selected by the control unit 611 is transmitted from the transmission unit 615 to the reception device 650.
- the transmission unit 653 receives the 3D image data transmitted from the transmission device 610, and the 3D image data is supplied to the 3D signal processing unit 654. In addition, the transmission unit 653 receives transmission method information of 3D image data transmitted from the transmission device 610, and the transmission method information is supplied to the control unit 651. In the 3D signal processing unit 654, under the control of the control unit 651, the 3D image data received by the transmission unit 653 is processed according to the transmission method, and the left eye image data and the right eye image Data is generated.
- the left eye image data and the right eye image data are supplied to the display unit 655.
- The display unit 655 displays a stereoscopic image using the left eye image data and the right eye image data generated by the 3D signal processing unit 654 (see FIG. 2).
- the detection unit 656 determines the state of the transmission path 660 based on, for example, bit error information supplied from the transmission unit 653, and the determination result is supplied to the control unit 651.
- the control unit 651 determines the quality of the transmission path 660 based on the determination result of the detection unit 656.
- When the transmission rate of the transmission path 660 is lower than the transmission rate required for the 3D image data transmission method notified from the transmission device 610, the control unit 651 generates transmission rate information indicating that fact, and this transmission rate information is transmitted from the transmission unit 653 to the transmission device 610.
- In the transmission device 610, the transmission unit 615 receives the transmission rate information transmitted from the reception device 650, and this transmission rate information is supplied to the control unit 611. Based on the transmission rate information, the control unit 611 changes the selection of the 3D image data transmission method so that the required rate falls within the transmission rate of the transmission path 660. In the 3D signal processing unit 614, the 3D image data reproduced by the reproduction unit 613 is processed so as to conform to the changed transmission method, and the processed 3D image data is transmitted by the transmission unit 615 to the reception device 650 via the transmission path 660. Information on the transmission method changed by the control unit 611 is also transmitted from the transmission unit 615 to the reception device 650.
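The fallback behavior described above can be sketched as follows (the method names and rate figures are illustrative assumptions, not values from the specification):

```python
def reselect_method(methods, path_rate_gbps):
    """methods: list of (name, required_gbps), ordered best quality first.
    Return the first method whose required rate fits the path, as the
    control unit 611 does after receiving transmission rate information."""
    for name, required in methods:
        if required <= path_rate_gbps:
            return name
    return None  # no 3D method fits the degraded transmission path

# Hypothetical method list: full-resolution L/R needs more rate than the path.
methods = [("full-resolution L/R", 4.4), ("side by side", 2.2)]
assert reselect_method(methods, 3.0) == "side by side"
```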
- As described above, in the transmission system 600 shown in FIG. 41, based on the transmission rate information sent from the reception device 650, the transmission device 610 can select, as the transmission method of the 3D image data to be transmitted, a method whose required transmission rate falls within the transmission rate of the transmission path 660. Therefore, stereoscopic image data can always be transmitted satisfactorily regardless of changes in the state of the transmission path.
- In the above description, the transmission rate information transmitted from the receiving device 650 to the transmitting device 610 indicates that the transmission rate of the transmission path 660 is lower than the transmission rate required for the 3D image data transmission method notified from the transmitting device 610; however, this transmission rate information may instead indicate the transmission rate of the transmission path 660 itself.
- Alternatively, instead of transmitting such transmission rate information from the receiving device 650 to the transmitting device 610, the following may be done: the E-EDID stored in the storage unit 652 is rewritten so that, of the transmission method information of the 3D image data that the receiving device 650 can support, only the transmission methods within the transmission rate of the transmission path 660 are enabled.
- In this case, the receiving device 650 needs to notify the transmitting device 610 of the change of the E-EDID. For example, when the transmission path 660 is an HDMI interface, the HPD signal is temporarily controlled to "L" so that the transmitting device 610 is made to read the E-EDID again.
- In the above-described embodiment, when the 3D image data is composed of left eye image data and right eye image data, either one may be transmitted through the TMDS channel, and the other through the bidirectional communication path composed of predetermined lines of the HDMI cable 350 (in this embodiment, the reserved line and the HPD line).
- Likewise, when the 3D image data is composed of two-dimensional image data and depth data, the two-dimensional image data may be transmitted through the TMDS channel, and the depth data through the bidirectional communication path composed of predetermined lines of the HDMI cable 350 (in this embodiment, the reserved line and the HPD line).
- In the above-described embodiment, the disc player 210 is used as the transmitting device (source device) and the television receiver 250 as the receiving device (sink device); however, the present invention can be similarly applied to configurations using other transmitting devices and receiving devices.
- The present invention makes it possible to transmit 3D image data satisfactorily from a transmitting device to a receiving device using a transmission method selected based on the transmission method correspondence information of the receiving device, and can be applied to a 3D image data transmission system composed of a transmitting device and a receiving device.
- TV receiver, 251: HDMI terminal, 252: HDMI receiving unit, 253: high-speed data line interface, 254: 3D signal processing unit, 255: antenna terminal, 256: digital tuner, 257: demultiplexer, 258: MPEG decoder, 259: video signal processing circuit, 260: graphics generation circuit, 261: panel drive circuit, 262: display panel, 263: audio signal processing circuit, 264: audio amplifier circuit, 265: speaker, 170: internal bus, 271: CPU, 272: flash ROM, 273: DRAM, 274: Ethernet interface, 275: network terminal, 276: remote control receiving unit, 277: remote control transmitter, 278: DTC circuit, 350: HDMI cable, 600: transmission system, 610: transmission device, 611: control unit, 612: storage unit, 613: reproduction unit, 614: 3D signal processing unit, 615: transmission unit, 650: receiving device, 651: control unit, 652: storage unit, 653: transmission unit, 654: 3D signal processing unit, 655: display unit, 656: detection unit
Abstract
Description
The concept of the present invention resides in a transmitting apparatus comprising: a data transmission unit that transmits stereoscopic image data for displaying a stereoscopic image to an external device via a transmission path; a transmission method information receiving unit that receives transmission method information of stereoscopic image data that the external device can support, sent from the external device via the transmission path; a transmission method selecting unit that selects, based on the transmission method information received by the transmission method information receiving unit, a predetermined transmission method from among the stereoscopic image data transmission methods that the external device can support, as the transmission method of the stereoscopic image data transmitted by the data transmission unit; and a transmission method information transmitting unit that transmits information on the transmission method of the stereoscopic image data transmitted by the data transmission unit to the external device via the transmission path.
Another concept of the present invention resides in a receiving apparatus comprising: a data receiving unit that receives stereoscopic image data for displaying a stereoscopic image from an external device via a transmission path; a transmission method information receiving unit that receives, from the external device, transmission method information of the stereoscopic image data received by the data receiving unit; a data processing unit that processes the stereoscopic image data received by the data receiving unit based on the transmission method information received by the transmission method information receiving unit to generate left eye image data and right eye image data; a transmission method information storage unit that stores transmission method information of stereoscopic image data that the receiving apparatus itself can support; and a transmission method information transmitting unit that transmits the transmission method information stored in the transmission method information storage unit to the external device via the transmission path.
Claims (29)
- 1. A transmitting apparatus comprising: a data transmission unit that transmits stereoscopic image data for displaying a stereoscopic image to an external device via a transmission path; a transmission method information receiving unit that receives transmission method information of stereoscopic image data that the external device can support, sent from the external device via the transmission path; a transmission method selecting unit that selects, based on the transmission method information received by the transmission method information receiving unit, a predetermined transmission method from among the stereoscopic image data transmission methods that the external device can support, as the transmission method of the stereoscopic image data transmitted by the data transmission unit; and a transmission method information transmitting unit that transmits information on the transmission method of the stereoscopic image data transmitted by the data transmission unit to the external device via the transmission path.
- 2. The transmitting apparatus according to claim 1, wherein the data transmission unit transmits the stereoscopic image data to the external device via the transmission path by differential signals on a plurality of channels.
- 3. The transmitting apparatus according to claim 2, wherein the transmission method information transmitting unit transmits the transmission method information of the stereoscopic image data transmitted by the data transmission unit to the external device by inserting the information into a blanking period of the stereoscopic image data.
- 4. The transmitting apparatus according to claim 2, wherein the transmission method information transmitting unit transmits the transmission method information of the stereoscopic image data transmitted by the data transmission unit to the external device via a control data line constituting the transmission path.
- 5. The transmitting apparatus according to claim 2, wherein the transmission method information transmitting unit transmits the transmission method information of the stereoscopic image data transmitted by the data transmission unit to the external device via a bidirectional communication path configured using predetermined lines of the transmission path.
- 6. The transmitting apparatus according to claim 5, wherein the bidirectional communication path is a pair of differential transmission paths, and at least one of the pair of differential transmission paths has a function of notifying the connection state of the external device by a DC bias potential.
- 7. The transmitting apparatus according to claim 3, further comprising a transmission rate information receiving unit that receives transmission rate information of the transmission path sent from the external device via the transmission path, wherein the transmission method selecting unit selects the predetermined transmission method based on the transmission rate information received by the transmission rate information receiving unit together with the transmission method information received by the transmission method information receiving unit.
- 8. The transmitting apparatus according to claim 1, wherein the stereoscopic image data includes first data and second data, and the data transmission unit transmits the first data to the external device via a first transmission path and transmits the second data to the external device via a second transmission path.
- 9. The transmitting apparatus according to claim 8, wherein the second transmission path is a bidirectional communication path configured using predetermined lines of the first transmission path, and the data transmission unit transmits the first data to the external device via the first transmission path by differential signals on a plurality of channels and transmits the second data to the external device via the bidirectional communication path.
- 10. The transmitting apparatus according to claim 9, wherein the first data is left eye image data or right eye image data, and the second data is the right eye image data or the left eye image data.
- 11. The transmitting apparatus according to claim 9, wherein the first data is two-dimensional image data, and the second data is depth data corresponding to each pixel.
- 12. The transmitting apparatus according to claim 1, wherein the stereoscopic image data includes two-dimensional image data and depth data corresponding to each pixel, and the data transmission unit arranges, in the data area of each pixel, the pixel data constituting the two-dimensional data and the depth data corresponding to the pixel data, and transmits them.
- 13. The transmitting apparatus according to claim 1, wherein the transmission method information transmitting unit transmits the transmission method information of the stereoscopic image data transmitted by the data transmission unit to the external device by inserting the information into a blanking period of the stereoscopic image data.
- 14. The transmitting apparatus according to claim 1, further comprising a transmission rate information receiving unit that receives transmission rate information of the transmission path sent from the external device via the transmission path, wherein the transmission method selecting unit selects the predetermined transmission method based on the transmission rate information received by the transmission rate information receiving unit together with the transmission method information received by the transmission method information receiving unit.
- 15. A stereoscopic image data transmitting method comprising: a transmission method information receiving step of receiving transmission method information of stereoscopic image data that an external device can support, sent from the external device via a transmission path; a transmission method selecting step of selecting, based on the transmission method information received in the transmission method information receiving step, a predetermined transmission method from among the stereoscopic image data transmission methods that the external device can support; a data transmitting step of transmitting stereoscopic image data of the transmission method selected in the transmission method selecting step to the external device via the transmission path; and a transmission method information transmitting step of transmitting the transmission method information of the stereoscopic image data transmitted in the data transmitting step to the external device via the transmission path.
- 16. A receiving apparatus comprising: a data receiving unit that receives stereoscopic image data for displaying a stereoscopic image from an external device via a transmission path; a transmission method information receiving unit that receives, from the external device, transmission method information of the stereoscopic image data received by the data receiving unit; a data processing unit that processes the stereoscopic image data received by the data receiving unit based on the transmission method information received by the transmission method information receiving unit to generate left eye image data and right eye image data; a transmission method information storage unit that stores transmission method information of stereoscopic image data that the receiving apparatus itself can support; and a transmission method information transmitting unit that transmits the transmission method information stored in the transmission method information storage unit to the external device via the transmission path.
- 17. The receiving apparatus according to claim 16, wherein the data receiving unit receives the stereoscopic image data from the external device via the transmission path by differential signals on a plurality of channels.
- 18. The receiving apparatus according to claim 17, wherein the transmission method information receiving unit extracts the transmission method information of the stereoscopic image data from a blanking period of the stereoscopic image data received by the data receiving unit.
- 19. The receiving apparatus according to claim 17, wherein the transmission method information receiving unit receives the transmission method information of the stereoscopic image data received by the data receiving unit from the external device via a control data line constituting the transmission path.
- 20. The receiving apparatus according to claim 17, wherein the transmission method information receiving unit receives the transmission method information of the stereoscopic image data received by the data receiving unit from the external device via a bidirectional communication path configured using predetermined lines of the transmission path.
- 21. The receiving apparatus according to claim 20, wherein the bidirectional communication path is a pair of differential transmission paths, and at least one of the pair of differential transmission paths has a function of notifying the connection state of the external device by a DC bias potential.
- 22. The receiving apparatus according to claim 18, further comprising: a transmission rate information acquiring unit that acquires transmission rate information of the transmission path based on the data reception state of the data receiving unit; and a transmission rate information transmitting unit that transmits the transmission rate information acquired by the transmission rate information acquiring unit to the external device via the transmission path.
- 23. The receiving apparatus according to claim 16, wherein the stereoscopic image data includes first data and second data, and the data receiving unit receives the first data from the external device via a first transmission path and receives the second data from the external device via a second transmission path.
- 24. The receiving apparatus according to claim 23, wherein the second transmission path is a bidirectional communication path configured using predetermined lines of the first transmission path, and the data receiving unit receives the first data from the external device via the first transmission path by differential signals on a plurality of channels and receives the second data from the external device via the bidirectional communication path.
- 25. The receiving apparatus according to claim 24, wherein the first data is left eye image data or right eye image data, and the second data is the right eye image data or the left eye image data.
- 26. The receiving apparatus according to claim 24, wherein the first data is two-dimensional image data, and the second data is depth data corresponding to each pixel.
- 27. The receiving apparatus according to claim 16, wherein the transmission method information receiving unit extracts the transmission method information of the stereoscopic image data from a blanking period of the stereoscopic image data received by the data receiving unit.
- 28. The receiving apparatus according to claim 16, further comprising: a transmission rate information acquiring unit that acquires transmission rate information of the transmission path based on the data reception state of the data receiving unit; and a transmission rate information transmitting unit that transmits the transmission rate information acquired by the transmission rate information acquiring unit to the external device via the transmission path.
- 29. A stereoscopic image data receiving method comprising: a transmission method information transmitting step of transmitting transmission method information of stereoscopic image data that the receiving apparatus itself can support to an external device via a transmission path; a data receiving step of receiving stereoscopic image data from the external device via the transmission path; a transmission method information receiving step of receiving, from the external device, transmission method information of the stereoscopic image data received in the data receiving step; and a data processing step of processing the stereoscopic image data received in the data receiving step based on the transmission method information received in the transmission method information receiving step to generate left eye image data and right eye image data.
Priority Applications (15)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP09797937.1A EP2302925B1 (en) | 2008-07-16 | 2009-07-15 | Transmitter, three-dimensional image data transmitting method, receiver, and three-dimensional image data receiving method |
EP19196516.9A EP3598743B1 (en) | 2008-07-16 | 2009-07-15 | Method transmitting and receiving three-dimensional image data |
BRPI0904809-0A BRPI0904809A2 (pt) | 2008-07-16 | 2009-07-15 | Aparelho de transmissão, método de transmissão de dados de imagem estéreo, aparelho de recepção, e, método de recepção de dados de imagem estéreo |
ES09797937.1T ES2626302T3 (es) | 2008-07-16 | 2009-07-15 | Transmisor, método de transmisión de datos de imagen tridimensional, receptor y método de recepción de datos de imagen tridimensional |
US12/733,580 US9185385B2 (en) | 2008-07-16 | 2009-07-15 | Transmitting apparatus, stereo image data transmitting method, receiving apparatus, and stereo image data receiving method |
EP18167244.5A EP3389262B1 (en) | 2008-07-16 | 2009-07-15 | System transmitting and receiving three-dimensional image data |
CN2009801003377A CN101803382B (zh) | 2008-07-16 | 2009-07-15 | 发送器、三维图像数据发送方法、接收器和三维图像数据接收方法 |
KR1020097026595A KR101386816B1 (ko) | 2008-07-16 | 2009-07-15 | 송신 장치, 입체 화상 데이터 송신 방법, 수신 장치 및 입체 화상 데이터 수신 방법 |
RU2010108478/07A RU2522424C2 (ru) | 2008-07-16 | 2009-07-15 | Передающее устройство, способ передачи данных стереоскопического изображения, приемное устройство и способ приема данных стереоскопического изображения |
EP16207378.7A EP3174288B1 (en) | 2008-07-16 | 2009-07-15 | Transmitter, three-dimensional image data transmitting method |
US13/968,677 US20140092211A1 (en) | 2008-07-16 | 2013-08-16 | Transmitting apparatus, stereo image data transmitting method, receiving apparatus, and stereo image data receiving method |
US14/882,828 US9451235B2 (en) | 2008-07-16 | 2015-10-14 | Transmitting apparatus, stereo image data transmitting method, receiving apparatus, and stereo image data receiving method |
US15/217,028 US9762887B2 (en) | 2008-07-16 | 2016-07-22 | Transmitting apparatus, stereo image data transmitting method, receiving apparatus, and stereo image data receiving method |
US15/491,439 US9807363B2 (en) | 2008-07-16 | 2017-04-19 | Transmitting apparatus, stereo image data transmitting method, receiving apparatus, and stereo image data receiving method |
US15/685,451 US10015468B2 (en) | 2008-07-16 | 2017-08-24 | Transmitting apparatus, stereo image data transmitting method, receiving apparatus, and stereo image data receiving method |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2008-184520 | 2008-07-16 | ||
JP2008184520A JP5338166B2 (ja) | 2008-07-16 | 2008-07-16 | 送信装置、立体画像データ送信方法、受信装置および立体画像データ受信方法 |
Related Child Applications (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/733,580 A-371-Of-International US9185385B2 (en) | 2008-07-16 | 2009-07-15 | Transmitting apparatus, stereo image data transmitting method, receiving apparatus, and stereo image data receiving method |
US13/968,677 Continuation US20140092211A1 (en) | 2008-07-16 | 2013-08-16 | Transmitting apparatus, stereo image data transmitting method, receiving apparatus, and stereo image data receiving method |
US14/882,828 Continuation US9451235B2 (en) | 2008-07-16 | 2015-10-14 | Transmitting apparatus, stereo image data transmitting method, receiving apparatus, and stereo image data receiving method |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2010008012A1 true WO2010008012A1 (ja) | 2010-01-21 |
Family
ID=41550414
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2009/062788 WO2010008012A1 (ja) | 2008-07-16 | 2009-07-15 | 送信装置、立体画像データ送信方法、受信装置および立体画像データ受信方法 |
Country Status (10)
Country | Link |
---|---|
US (6) | US9185385B2 (ja) |
EP (4) | EP3174288B1 (ja) |
JP (1) | JP5338166B2 (ja) |
KR (1) | KR101386816B1 (ja) |
CN (2) | CN102883215B (ja) |
BR (1) | BRPI0904809A2 (ja) |
ES (1) | ES2626302T3 (ja) |
RU (1) | RU2522424C2 (ja) |
TW (3) | TWI425822B (ja) |
WO (1) | WO2010008012A1 (ja) |
Cited By (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102169707A (zh) * | 2010-02-18 | 2011-08-31 | 三星电子株式会社 | 图像显示系统及其显示方法 |
US20110285818A1 (en) * | 2010-05-20 | 2011-11-24 | Samsung Electronics Co., Ltd. | Source device and sink device and method of transmitting and receiving multimedia service and related data |
WO2012077208A1 (ja) * | 2010-12-09 | 2012-06-14 | Necディスプレイソリューションズ株式会社 | 信号処理回路およびその制御方法 |
CN102918853A (zh) * | 2010-06-01 | 2013-02-06 | 英特尔公司 | 用于立体三维系统中的自适应稳定图像时序的方法和装置 |
CN102934390A (zh) * | 2010-03-19 | 2013-02-13 | 矽晶程式库股份有限公司 | 无线传输系统以及其中所用的无线发射机、无线接收机、无线发射方法、无线接收方法及无线通信方法 |
CN105791804A (zh) * | 2010-02-18 | 2016-07-20 | 三星电子株式会社 | 图像显示系统及其显示方法 |
US9491432B2 (en) | 2010-01-27 | 2016-11-08 | Mediatek Inc. | Video processing apparatus for generating video output satisfying display capability of display device according to video input and related method thereof |
EP3139620A4 (en) * | 2014-05-01 | 2017-12-13 | Sony Corporation | Communication apparatus or communication method, and computer program |
WO2024075743A1 (ja) * | 2022-10-04 | 2024-04-11 | ザインエレクトロニクス株式会社 | 送信装置、受信装置および送受信システム |
Families Citing this family (74)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102239696B (zh) * | 2008-12-04 | 2014-02-19 | 日本电气株式会社 | 图像传输系统、图像传输设备和图像传输方法 |
WO2010084437A2 (en) * | 2009-01-20 | 2010-07-29 | Koninklijke Philips Electronics N.V. | Transferring of 3d image data |
CN102292996B (zh) * | 2009-01-20 | 2014-09-17 | 皇家飞利浦电子股份有限公司 | 3d图像数据的传输 |
US20110256927A1 (en) | 2009-03-25 | 2011-10-20 | MEP Games Inc. | Projection of interactive game environment |
US20110165923A1 (en) * | 2010-01-04 | 2011-07-07 | Davis Mark L | Electronic circle game system |
US9971458B2 (en) | 2009-03-25 | 2018-05-15 | Mep Tech, Inc. | Projection of interactive environment |
JP2010258583A (ja) * | 2009-04-22 | 2010-11-11 | Panasonic Corp | 立体画像表示装置、立体画像再生装置および立体画像視認システム |
US9131215B2 (en) * | 2009-09-29 | 2015-09-08 | Samsung Electronics Co., Ltd. | Method and apparatus for transmitting and receiving uncompressed three-dimensional video data via digital data interface |
JP5588144B2 (ja) * | 2009-10-14 | 2014-09-10 | パナソニック株式会社 | 映像信号処理装置及び映像信号処理方法 |
KR20110064722A (ko) * | 2009-12-08 | 2011-06-15 | 한국전자통신연구원 | 영상 처리 정보와 컬러 정보의 동시 전송을 위한 코딩 장치 및 방법 |
KR101787133B1 (ko) | 2010-02-15 | 2017-10-18 | 톰슨 라이센싱 | 비디오 콘텐츠 처리 장치 및 방법 |
JP4861495B2 (ja) | 2010-05-31 | 2012-01-25 | 株式会社東芝 | 映像変換装置及び映像変換方法 |
JP4861493B2 (ja) * | 2010-05-31 | 2012-01-25 | 株式会社東芝 | 情報出力制御装置及び情報出力制御方法 |
JP4861494B2 (ja) * | 2010-05-31 | 2012-01-25 | 株式会社東芝 | 映像出力制御装置及び映像出力制御方法 |
JP4861496B2 (ja) | 2010-05-31 | 2012-01-25 | 株式会社東芝 | 映像変換装置及び映像変換方法 |
TWI423197B (zh) * | 2010-07-20 | 2014-01-11 | Innolux Corp | 驅動方法與顯示裝置 |
KR20120017228A (ko) * | 2010-08-18 | 2012-02-28 | 엘지전자 주식회사 | 이동 단말기 및 상기 이동 단말기의 영상 표시 방법 |
US20120050462A1 (en) * | 2010-08-25 | 2012-03-01 | Zhibing Liu | 3d display control through aux channel in video display devices |
JP2012053165A (ja) * | 2010-08-31 | 2012-03-15 | Sony Corp | 情報処理装置、プログラムおよび情報処理方法 |
TWI406559B (zh) * | 2010-09-28 | 2013-08-21 | Innolux Corp | 顯示方法及執行其之電腦可讀取媒體 |
JP2012120142A (ja) * | 2010-11-08 | 2012-06-21 | Sony Corp | 立体画像データ送信装置、立体画像データ送信方法および立体画像データ受信装置 |
WO2012071063A1 (en) | 2010-11-23 | 2012-05-31 | Circa3D, Llc | Blanking inter-frame transitions of a 3d signal |
KR20120058702A (ko) | 2010-11-27 | 2012-06-08 | 전자부품연구원 | 디지털 방송에서 서비스 호환 방식 전송 방법 |
KR20120058700A (ko) | 2010-11-27 | 2012-06-08 | 전자부품연구원 | 디지털 방송의 전송 모드 제공 및 인지 방법 |
KR20120066433A (ko) * | 2010-12-14 | 2012-06-22 | 삼성전자주식회사 | 영상송신장치 및 그 제어방법과, 영상수신장치 및 그 제어방법 |
JP5811602B2 (ja) * | 2010-12-16 | 2015-11-11 | ソニー株式会社 | 画像生成装置、プログラム、画像表示システム、および画像表示装置 |
US8526714B2 (en) * | 2011-01-13 | 2013-09-03 | Himax Media Solutions, Inc. | Method and system for reconstructing a stereoscopic image stream from quincunx sampled frames |
JP5673172B2 (ja) * | 2011-02-09 | 2015-02-18 | ソニー株式会社 | 電子機器、電子機器における立体画像情報送信方法、および電子機器における立体画像情報受信方法 |
WO2012112142A1 (en) | 2011-02-15 | 2012-08-23 | Thomson Licensing | Apparatus and method for generating a disparity map in a receiving device |
US9412330B2 (en) * | 2011-03-15 | 2016-08-09 | Lattice Semiconductor Corporation | Conversion of multimedia data streams for use by connected devices |
KR101817939B1 (ko) | 2011-03-28 | 2018-01-15 | 삼성디스플레이 주식회사 | 3차원 영상 데이터 처리 방법 및 이를 수행하는 표시 장치 |
JP5790132B2 (ja) * | 2011-05-06 | 2015-10-07 | 富士通株式会社 | 情報処理装置、情報処理方法、情報処理プログラム |
KR101852349B1 (ko) * | 2011-06-23 | 2018-04-27 | 삼성디스플레이 주식회사 | 입체 영상 표시 방법 및 입체 영상 표시 장치 |
US9351028B2 (en) | 2011-07-14 | 2016-05-24 | Qualcomm Incorporated | Wireless 3D streaming server |
JP5328852B2 (ja) * | 2011-07-25 | 2013-10-30 | 株式会社ソニー・コンピュータエンタテインメント | 画像処理装置、画像処理方法、プログラム及び情報記憶媒体 |
US9247231B2 (en) | 2011-08-29 | 2016-01-26 | Nec Display Solutions, Ltd. | 3D image signal processing apparatus |
US8891894B2 (en) * | 2011-09-30 | 2014-11-18 | Apple Inc. | Psychovisual image compression |
JP5694412B2 (ja) * | 2011-10-20 | 2015-04-01 | 株式会社東芝 | 送信装置、受信装置、送信方法及び受信方法 |
JP5232319B2 (ja) | 2011-10-20 | 2013-07-10 | 株式会社東芝 | 通信装置及び通信方法 |
JP4940376B2 (ja) * | 2011-11-02 | 2012-05-30 | 株式会社東芝 | 映像出力制御装置及び映像出力制御方法 |
JP5060650B2 (ja) * | 2011-11-02 | 2012-10-31 | 株式会社東芝 | 情報出力制御装置及び情報出力制御方法 |
TWI541586B (zh) * | 2011-12-16 | 2016-07-11 | 鴻海精密工業股份有限公司 | 立體攝像裝置 |
WO2013122387A1 (en) * | 2012-02-15 | 2013-08-22 | Samsung Electronics Co., Ltd. | Data transmitting apparatus, data receiving apparatus, data transceiving system, data transmitting method, and data receiving method |
WO2013122385A1 (en) | 2012-02-15 | 2013-08-22 | Samsung Electronics Co., Ltd. | Data transmitting apparatus, data receiving apparatus, data transreceiving system, data transmitting method, data receiving method and data transreceiving method |
JP5002734B2 (ja) * | 2012-02-22 | 2012-08-15 | 株式会社東芝 | 映像出力制御装置及び映像出力制御方法 |
CN102780894B (zh) * | 2012-05-31 | 2016-12-14 | 新奥特(北京)视频技术有限公司 | 一种3d图像的编解码方法 |
JP5390667B2 (ja) | 2012-06-11 | 2014-01-15 | 株式会社東芝 | 映像送信機器及び映像受信機器 |
US9317109B2 (en) | 2012-07-12 | 2016-04-19 | Mep Tech, Inc. | Interactive image projection accessory |
US9355613B2 (en) | 2012-10-09 | 2016-05-31 | Mediatek Inc. | Data processing apparatus for transmitting/receiving compression-related indication information via display interface and related data processing method |
CN103986549B (zh) | 2013-02-07 | 2018-03-13 | 辉达公司 | 用于网络数据传送的设备、系统以及方法 |
TWI684364B (zh) | 2013-06-21 | 2020-02-01 | 日商新力股份有限公司 | 送訊裝置、高動態範圍影像資料送訊方法、收訊裝置、高動態範圍影像資料收訊方法及程式 |
US9778546B2 (en) | 2013-08-15 | 2017-10-03 | Mep Tech, Inc. | Projector for projecting visible and non-visible images |
EP3107288B1 (en) | 2014-02-10 | 2020-09-09 | LG Electronics Inc. | Method and apparatus for transmitting/receiving broadcast signal for 3-dimensional (3d) broadcast service |
WO2015129496A1 (ja) * | 2014-02-26 | 2015-09-03 | ソニー株式会社 | 送信装置、送信方法、受信装置および受信方法 |
CN103929610B (zh) * | 2014-04-23 | 2017-08-08 | 利亚德光电股份有限公司 | 用于led电视的数据处理方法、装置及led电视 |
US10637972B2 (en) * | 2014-05-06 | 2020-04-28 | Lattice Semiconductor Corporation | System for dynamic audio visual capabilities exchange |
US9554183B2 (en) * | 2014-05-08 | 2017-01-24 | Lattice Semiconductor Corporation | Caching of capabilities information of counterpart device for efficient handshaking operation |
CN106063260B (zh) * | 2014-05-08 | 2018-04-03 | 奥林巴斯株式会社 | 视频处理器、视频处理器的工作方法 |
WO2015190864A1 (ko) * | 2014-06-12 | 2015-12-17 | 엘지전자(주) | 고속 인터페이스를 이용하여 객체 기반 오디오 데이터를 처리하는 방법 및 장치 |
KR102310241B1 (ko) * | 2015-04-29 | 2021-10-08 | 삼성전자주식회사 | 소스 디바이스, 그의 제어 방법, 싱크 디바이스 및 그의 화질 개선 처리 방법 |
US10621690B2 (en) * | 2015-09-17 | 2020-04-14 | Qualcomm Incorporated | Storing bandwidth-compressed graphics data |
US10102606B2 (en) * | 2016-09-30 | 2018-10-16 | Intel Corporation | Transmission of data based on a configuration database |
JP6949860B2 (ja) * | 2016-10-20 | 2021-10-13 | エイム電子株式会社 | Hdmi光ケーブルおよびhdmi光変換装置 |
EP3373595A1 (en) * | 2017-03-07 | 2018-09-12 | Thomson Licensing | Sound rendering with home cinema system and television |
CN106973188A (zh) * | 2017-04-11 | 2017-07-21 | 北京图森未来科技有限公司 | 一种图像传输装置和方法 |
US10462417B2 (en) * | 2017-08-31 | 2019-10-29 | Apple Inc. | Methods and apparatus for reducing electromagnetic interference resultant from data transmission over a high-speed audio/visual interface |
JP2022031983A (ja) | 2018-10-02 | 2022-02-24 | ソニーセミコンダクタソリューションズ株式会社 | 送信装置、受信装置及び送受信システム |
TWI723683B (zh) * | 2019-12-17 | 2021-04-01 | 瑞昱半導體股份有限公司 | 視訊介面轉換裝置及方法 |
JP2021150791A (ja) | 2020-03-18 | 2021-09-27 | ソニーグループ株式会社 | 撮像装置および撮像装置の制御方法 |
JP2021150790A (ja) | 2020-03-18 | 2021-09-27 | ソニーグループ株式会社 | 送信装置、送信方法および受信装置 |
CN114302194B (zh) * | 2021-01-14 | 2023-05-05 | 海信视像科技股份有限公司 | 一种显示设备及多设备切换时的播放方法 |
US11363050B1 (en) | 2021-03-25 | 2022-06-14 | Bank Of America Corporation | Information security system and method for incompliance detection in data transmission |
US11412177B1 (en) * | 2021-07-12 | 2022-08-09 | Techpoint, Inc. | Method and apparatus for transmitting and receiving audio over analog video transmission over a single coaxial cable |
TWI783845B (zh) * | 2022-01-04 | 2022-11-11 | 大陸商北京集創北方科技股份有限公司 | 級聯驅動電路之資料傳輸方法、led顯示驅動電路、led顯示裝置及資訊處理裝置 |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2005051547A (ja) * | 2003-07-29 | 2005-02-24 | Matsushita Electric Ind Co Ltd | 映像音声出力装置、映像音声受信装置、映像音声出力方法、映像音声受信方法、およびコンピュータプログラム |
JP2005318490A (ja) * | 2004-03-31 | 2005-11-10 | Victor Co Of Japan Ltd | 伝送システム |
JP2007336518A (ja) * | 2006-05-16 | 2007-12-27 | Sony Corp | 伝送方法、伝送システム、送信方法、送信装置、受信方法及び受信装置 |
JP2008042645A (ja) * | 2006-08-08 | 2008-02-21 | Nikon Corp | カメラおよび画像表示装置並びに画像記憶装置 |
JP2008145679A (ja) * | 2006-12-08 | 2008-06-26 | Sharp Corp | 表示装置及びavシステム |
JP2008544679A (ja) * | 2005-06-23 | 2008-12-04 | コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ | 画像および関連データの組み合わされた交換 |
Family Cites Families (27)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
RU2010453C1 (ru) * | 1989-12-18 | 1994-03-30 | Сергей Николаевич Сидоров | Устройство для телевизионной передачи и приема стереоскопического изображения |
JP2001061164A (ja) * | 1999-08-19 | 2001-03-06 | Toshiba Corp | 立体映像信号伝送方法 |
TW580826B (en) * | 2001-01-12 | 2004-03-21 | Vrex Inc | Method and apparatus for stereoscopic display using digital light processing |
JP3789794B2 (ja) | 2001-09-26 | 2006-06-28 | 三洋電機株式会社 | 立体画像処理方法、装置、およびシステム |
US7277121B2 (en) | 2001-08-29 | 2007-10-02 | Sanyo Electric Co., Ltd. | Stereoscopic image processing and display system |
KR100397511B1 (ko) | 2001-11-21 | 2003-09-13 | 한국전자통신연구원 | 양안식/다시점 3차원 동영상 처리 시스템 및 그 방법 |
JP4154569B2 (ja) | 2002-07-10 | 2008-09-24 | 日本電気株式会社 | 画像圧縮伸長装置 |
JP4190357B2 (ja) | 2003-06-12 | 2008-12-03 | シャープ株式会社 | 放送データ送信装置、放送データ送信方法および放送データ受信装置 |
KR20050004339A (ko) * | 2003-07-02 | 2005-01-12 | 엘지전자 주식회사 | 고밀도 광디스크의 그래픽 데이터 관리방법 및 그에 따른고밀도 광디스크 |
EP1587035A1 (en) * | 2004-04-14 | 2005-10-19 | Koninklijke Philips Electronics N.V. | Ghost artifact reduction for rendering 2.5D graphics |
CN1697521A (zh) * | 2004-05-13 | 2005-11-16 | 孙道明 | 一种视讯数据传输与处理的方法 |
EP1617370B1 (en) | 2004-07-15 | 2013-01-23 | Samsung Electronics Co., Ltd. | Image format transformation |
KR100716982B1 (ko) | 2004-07-15 | 2007-05-10 | 삼성전자주식회사 | 다차원 영상 포맷의 변환장치 및 방법 |
CN1756317A (zh) | 2004-10-01 | 2006-04-05 | 三星电子株式会社 | 变换多维视频格式的设备和方法 |
JP3112392U (ja) | 2005-05-10 | 2005-08-11 | 船井電機株式会社 | Hdtv |
KR100722855B1 (ko) * | 2005-09-06 | 2007-05-30 | 삼성전자주식회사 | 미디어 수신장치와 이를 포함하는 미디어 시스템 및 그제어방법 |
JP2007166277A (ja) * | 2005-12-14 | 2007-06-28 | Nippon Telegr & Teleph Corp <Ntt> | 3次元画像情報の伝送方法、送信側装置および受信側装置 |
US20070242062A1 (en) | 2006-04-18 | 2007-10-18 | Yong Guo | EDID pass through via serial channel |
WO2007124056A2 (en) * | 2006-04-21 | 2007-11-01 | Locolabs, Inc. | Inline audio/visual conversion |
RU66644U1 (ru) * | 2006-10-18 | 2007-09-10 | Михаил Сергеевич Цветков | Модуль многоканального ввода-вывода и обработки hd/sd sdi видео dvi/hdmi графики |
KR101432846B1 (ko) * | 2006-11-07 | 2014-08-26 | 소니 주식회사 | 전자기기 및 케이블 장치 |
JP2008131282A (ja) | 2006-11-20 | 2008-06-05 | Sony Corp | 映像伝送方法、映像伝送システム及び映像処理装置 |
JP4967731B2 (ja) * | 2007-03-15 | 2012-07-04 | セイコーエプソン株式会社 | 画像表示装置及びそのための光学部材 |
US8207962B2 (en) * | 2007-06-18 | 2012-06-26 | Mediatek Inc. | Stereo graphics system based on depth-based image rendering and processing method thereof |
US7836223B2 (en) * | 2007-07-02 | 2010-11-16 | Silicon Image, Inc. | Operation of media interface to provide bidirectional communications |
JP2009135686A (ja) * | 2007-11-29 | 2009-06-18 | Mitsubishi Electric Corp | 立体映像記録方法、立体映像記録媒体、立体映像再生方法、立体映像記録装置、立体映像再生装置 |
KR101520620B1 (ko) * | 2008-08-18 | 2015-05-18 | 삼성전자주식회사 | 2차원/3차원 재생 모드 결정 방법 및 장치 |
2008
- 2008-07-16 JP JP2008184520A patent/JP5338166B2/ja active Active

2009
- 2009-07-10 TW TW098123492A patent/TWI425822B/zh not_active IP Right Cessation
- 2009-07-10 TW TW102141643A patent/TWI514846B/zh active
- 2009-07-10 TW TW102128459A patent/TWI432014B/zh active
- 2009-07-15 EP EP16207378.7A patent/EP3174288B1/en active Active
- 2009-07-15 US US12/733,580 patent/US9185385B2/en active Active
- 2009-07-15 CN CN201210352343.5A patent/CN102883215B/zh active Active
- 2009-07-15 EP EP19196516.9A patent/EP3598743B1/en active Active
- 2009-07-15 WO PCT/JP2009/062788 patent/WO2010008012A1/ja active Application Filing
- 2009-07-15 CN CN2009801003377A patent/CN101803382B/zh active Active
- 2009-07-15 ES ES09797937.1T patent/ES2626302T3/es active Active
- 2009-07-15 KR KR1020097026595A patent/KR101386816B1/ko active IP Right Grant
- 2009-07-15 EP EP09797937.1A patent/EP2302925B1/en active Active
- 2009-07-15 EP EP18167244.5A patent/EP3389262B1/en active Active
- 2009-07-15 BR BRPI0904809-0A patent/BRPI0904809A2/pt active Search and Examination
- 2009-07-15 RU RU2010108478/07A patent/RU2522424C2/ru not_active IP Right Cessation

2013
- 2013-08-16 US US13/968,677 patent/US20140092211A1/en not_active Abandoned

2015
- 2015-10-14 US US14/882,828 patent/US9451235B2/en active Active

2016
- 2016-07-22 US US15/217,028 patent/US9762887B2/en active Active

2017
- 2017-04-19 US US15/491,439 patent/US9807363B2/en active Active
- 2017-08-24 US US15/685,451 patent/US10015468B2/en active Active
Non-Patent Citations (1)
Title |
---|
See also references of EP2302925A4 * |
Cited By (24)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9491432B2 (en) | 2010-01-27 | 2016-11-08 | Mediatek Inc. | Video processing apparatus for generating video output satisfying display capability of display device according to video input and related method thereof |
US9509976B2 (en) | 2010-02-18 | 2016-11-29 | Samsung Electronics Co., Ltd. | Image display system and display method thereof |
CN102169707A (zh) * | 2010-02-18 | 2011-08-31 | 三星电子株式会社 | 图像显示系统及其显示方法 |
EP2362667A3 (en) * | 2010-02-18 | 2014-06-25 | Samsung Electronics Co., Ltd. | 2D/3D Image display system and display method thereof |
CN105791804A (zh) * | 2010-02-18 | 2016-07-20 | 三星电子株式会社 | 图像显示系统及其显示方法 |
US9014539B2 (en) | 2010-02-18 | 2015-04-21 | Samsung Electronics Co., Ltd. | Image display system and display method thereof |
US9294320B2 (en) | 2010-03-19 | 2016-03-22 | Silicon Library Inc. | Wireless transmission system and wireless transmitter, wireless receiver, wireless transmission method, wireless reception method and wireless communication method used with same |
CN102934390B (zh) * | 2010-03-19 | 2017-03-08 | 矽晶程式库股份有限公司 | 无线传输系统以及其中所用的无线发射机、无线接收机、无线发射方法、无线接收方法及无线通信方法 |
CN102934390A (zh) * | 2010-03-19 | 2013-02-13 | 矽晶程式库股份有限公司 | 无线传输系统以及其中所用的无线发射机、无线接收机、无线发射方法、无线接收方法及无线通信方法 |
US20110285818A1 (en) * | 2010-05-20 | 2011-11-24 | Samsung Electronics Co., Ltd. | Source device and sink device and method of transmitting and receiving multimedia service and related data |
US9055281B2 (en) * | 2010-05-20 | 2015-06-09 | Samsung Electronics Co., Ltd. | Source device and sink device and method of transmitting and receiving multimedia service and related data |
EP2577981A4 (en) * | 2010-06-01 | 2013-11-13 | Intel Corp | METHOD AND APPARATUS FOR INTELLIGENT USE OF ACTIVE SPACE IN A PACKAGING FORMAT OF FRAMES |
CN102918855B (zh) * | 2010-06-01 | 2015-11-25 | 英特尔公司 | 用于合理使用帧打包格式的活动空间的方法及设备 |
CN105187816A (zh) * | 2010-06-01 | 2015-12-23 | 英特尔公司 | 用于合理使用帧打包格式的活动空间的方法及设备 |
CN102918853B (zh) * | 2010-06-01 | 2016-02-10 | 英特尔公司 | 用于立体三维系统中的自适应稳定图像时序的方法和装置 |
US9143767B2 (en) | 2010-06-01 | 2015-09-22 | Intel Corporation | Method and apparatus for adaptive stable image timing in stereoscopic 3D systems |
EP2577979A4 (en) * | 2010-06-01 | 2014-08-27 | Intel Corp | METHOD AND APPARATUS FOR ADAPTIVE STABLE IMAGE SYNCHRONIZATION IN STEREOSCOPIC 3D SYSTEMS |
CN102918855A (zh) * | 2010-06-01 | 2013-02-06 | 英特尔公司 | 用于合理使用帧打包格式的活动空间的方法及设备 |
CN102918853A (zh) * | 2010-06-01 | 2013-02-06 | 英特尔公司 | 用于立体三维系统中的自适应稳定图像时序的方法和装置 |
US9641824B2 (en) | 2010-06-01 | 2017-05-02 | Intel Corporation | Method and apparatus for making intelligent use of active space in frame packing format |
CN105187816B (zh) * | 2010-06-01 | 2017-12-22 | 英特尔公司 | 用于合理使用帧打包格式的活动空间的方法及设备 |
WO2012077208A1 (ja) * | 2010-12-09 | 2012-06-14 | Necディスプレイソリューションズ株式会社 | 信号処理回路およびその制御方法 |
EP3139620A4 (en) * | 2014-05-01 | 2017-12-13 | Sony Corporation | Communication apparatus or communication method, and computer program |
WO2024075743A1 (ja) * | 2022-10-04 | 2024-04-11 | ザインエレクトロニクス株式会社 | 送信装置、受信装置および送受信システム |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5338166B2 (ja) | 送信装置、立体画像データ送信方法、受信装置および立体画像データ受信方法 | |
JP5448558B2 (ja) | 送信装置、立体画像データの送信方法、受信装置、立体画像データの受信方法、中継装置および立体画像データの中継方法 | |
US20140111613A1 (en) | Transmitting apparatus, stereoscopic image data transmitting method, receiving apparatus, and stereoscopic image data receiving method | |
JP5477499B2 (ja) | 送信装置および立体画像データ送信方法 | |
JP6304417B2 (ja) | 受信方法および受信装置 | |
JP6098678B2 (ja) | 送信方法および送信装置 | |
JP5790859B2 (ja) | 送受信システム | |
JP5621944B2 (ja) | 送信装置および送信方法 | |
JP5583866B2 (ja) | 送信装置、立体画像データ送信方法、受信装置および立体画像データ受信方法 | |
JP2014131272A (ja) | 受信装置および情報処理方法 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| WWE | Wipo information: entry into national phase | Ref document number: 200980100337.7 Country of ref document: CN |
| WWE | Wipo information: entry into national phase | Ref document number: 8174/DELNP/2009 Country of ref document: IN |
| ENP | Entry into the national phase | Ref document number: 20097026595 Country of ref document: KR Kind code of ref document: A |
| WWE | Wipo information: entry into national phase | Ref document number: 12733580 Country of ref document: US |
| WWE | Wipo information: entry into national phase | Ref document number: 2010108478 Country of ref document: RU |
| REEP | Request for entry into the european phase | Ref document number: 2009797937 Country of ref document: EP |
| WWE | Wipo information: entry into national phase | Ref document number: 2009797937 Country of ref document: EP |
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 09797937 Country of ref document: EP Kind code of ref document: A1 |
| NENP | Non-entry into the national phase | Ref country code: DE |
| ENP | Entry into the national phase | Ref document number: PI0904809 Country of ref document: BR Kind code of ref document: A2 Effective date: 20100309 |