WO2015163264A1 - Encoding device, encoding method, transmission device, transmission method, reception device, reception method, and program - Google Patents
Encoding device, encoding method, transmission device, transmission method, reception device, reception method, and program
- Publication number
- WO2015163264A1 (PCT/JP2015/061933)
- Authority
- WO
- WIPO (PCT)
Classifications
- H04N19/17—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding, the unit being an image region, e.g. an object
- H04N19/48—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using compressed domain processing techniques other than decoding, e.g. modification of transform coefficients, variable length coding [VLC] data or run-length data
- G06F3/14—Digital output to display device; Cooperation and interconnection of the display device with other functional units
- G09G5/10—Intensity circuits
- G09G5/391—Resolution modifying circuits, e.g. variable screen formats
- H04N19/136—Incoming video signal characteristics or properties
- H04N19/184—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, the unit being bits, e.g. of the compressed video stream
- H04N19/30—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using hierarchical techniques, e.g. scalability
- H04N19/70—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals characterised by syntax aspects related to video coding, e.g. related to compression standards
- H04N19/85—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using pre-processing or post-processing specially adapted for video compression
- H04N19/98—Adaptive-dynamic-range coding [ADRC]
- G09G2320/0271—Adjustment of the gradation levels within the range of the gradation scale, e.g. by redistribution or clipping
- G09G2370/04—Exchange of auxiliary data, i.e. other than image data, between monitor and graphics controller
- G09G2370/047—Exchange of auxiliary data between monitor and graphics controller using display data channel standard [DDC] communication
- G09G2370/12—Use of DVI or HDMI protocol in interfaces along the display data pipeline
Definitions
- the present invention relates to an encoding device, an encoding method, a transmission device, a transmission method, a reception device, a reception method, and a program, and more particularly to an encoding device that encodes dynamic range converted image data.
- a high dynamic range image is converted into a standard dynamic range image, and is transmitted along with the conversion information to the display device via a transmission line. A method is conceivable in which, based on the transmitted conversion information, the standard dynamic range image is converted to a dynamic range matched to the maximum luminance of the display device and displayed.
- Patent Document 1 proposes a recording method and processing for high dynamic range image data.
- the present technology is to allow a dynamic range conversion image to be converted into a desired image satisfactorily.
- the concept of this technology is an encoding apparatus including: a setting unit that sets a plurality of pieces of knee position information related to the conversion from a first dynamic range image to a second dynamic range image; an encoding unit that encodes the second dynamic range image to generate encoded data; a determination unit that determines a priority order of the plurality of pieces of knee position information; and an adding unit that adds the plurality of pieces of knee position information, with the priority given, to the encoded data of the second dynamic range image.
- the setting unit sets a plurality of knee position information related to the conversion from the first dynamic range image to the second dynamic range image.
- the encoding unit encodes the second dynamic range image to generate encoded data.
- the priority of the knee position information is determined by the determination unit. For example, the determination unit may determine the priority order of the plurality of knee position information based on the compression / decompression rate of the knee position indicated by each of the plurality of knee position information.
- the adding unit adds a plurality of pieces of knee position information to the encoded data of the second dynamic range image in a state where priority is given.
- the plurality of pieces of knee position information may be given priority by arranging them in order of priority.
- a plurality of knee position information may be given priority by adding information indicating a priority relationship of the plurality of knee position information.
- in this way, in the present technology, a plurality of pieces of knee position information regarding the conversion are added, with priority given, to the encoded image data of the second dynamic range image obtained from the first dynamic range image. Therefore, for example, even when not all knee position information can be transmitted, or when not all of it can be used, the dynamic range converted image can easily be converted into a desired image.
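As an illustrative sketch (not the patent's actual implementation), the source-side flow described above — set knee position information, determine a priority order, and attach the information to the encoded data with priority given — might look like the following; the `KneePoint` container, the priority metric, and the `attach` wrapper are all assumptions:

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class KneePoint:
    # Positions are permillage values of the respective dynamic ranges,
    # as in input_knee_point / output_knee_point of knee function info SEI.
    input_pos: int
    output_pos: int


def prioritize(points):
    """Order knee points by how strongly they compress or expand.

    The metric (distance of (input, output) from the identity mapping)
    is one plausible reading of "compression / decompression rate";
    the patent leaves the exact metric to the determination unit.
    """
    return sorted(points, key=lambda p: abs(p.output_pos - p.input_pos),
                  reverse=True)


def attach(encoded_data: bytes, points):
    """Sketch of the adding unit: pair the encoded image with the
    priority-ordered knee position list (a container format is assumed)."""
    return {"data": encoded_data, "knee_points": prioritize(points)}


stream = attach(b"...", [KneePoint(100, 600), KneePoint(250, 800),
                         KneePoint(450, 900)])
# The highest-priority point comes first, so a receiver that can use
# only one knee point simply takes the head of the list.
```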
- a storage processing unit that stores encoded data of the second dynamic range image to which a plurality of knee position information is added may be further provided.
- a data transmission unit that transmits uncompressed image data of the second dynamic range image to an external device via a transmission path;
- the transmission device includes an information transmission unit that transmits, to the external device via the transmission path, the knee position information related to the conversion from the first dynamic range image to the second dynamic range image, limited to the number of knee positions that the external device can support.
- the data transmission unit transmits the non-compressed image data of the second dynamic range image to the external device via the transmission path.
- the data transmission unit may be configured to transmit uncompressed image data to an external device via a transmission path using a differential signal.
- the information transmission unit transmits, to the external device via the transmission path, the knee position information regarding the conversion from the first dynamic range image to the second dynamic range image, limited to the number of knee positions that the external device can handle.
- for example, the information transmission unit may transmit the plurality of pieces of knee position information of the uncompressed image data transmitted by the data transmission unit to the external device by inserting them into the blanking period of the uncompressed image data.
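A hedged sketch of packing knee position information into a blanking-interval packet. HDMI carries auxiliary data in InfoFrame / data-island packets, but the byte layout below (one count byte followed by two 16-bit big-endian permillage values per knee point) is purely hypothetical and is not an HDMI-defined structure:

```python
import struct


def pack_knee_infoframe(points, max_points):
    """Pack up to max_points knee positions into a small binary payload
    suitable for insertion in the blanking period.

    points: (input, output) permillage pairs, assumed already
    priority-ordered. The layout is an illustrative assumption.
    """
    chosen = points[:max_points]
    payload = struct.pack(">B", len(chosen))  # count byte
    for inp, out in chosen:
        payload += struct.pack(">HH", inp, out)  # two 16-bit values
    return payload
```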
- an information receiving unit that receives the number information of knee positions that can be supported by the external device from an external device may be further provided.
- there may further be provided: a data acquisition unit that acquires encoded data of the second dynamic range image to which a plurality of pieces of knee position information are added with priority given; a decoding unit that decodes the encoded data of the second dynamic range image to obtain uncompressed image data of the second dynamic range image; and an information selection unit that selects, from the plurality of pieces of knee position information and based on the priority order, knee position information corresponding to the number of knee positions that the external device can support.
- in this way, in the present technology, the knee position information related to the conversion from the first dynamic range image to the second dynamic range image is transmitted to the external device limited to the number of knee positions that the external device can handle. Therefore, the external device receives only as many pieces of knee position information as it can handle, and can easily convert the dynamic range converted image into a desired image without performing processing for selecting which knee position information to use.
- An image data receiving unit that receives uncompressed image data of the second dynamic range image from an external device via a transmission path;
- An information receiving unit that receives a predetermined number of knee position information related to conversion from the first dynamic range image to the second dynamic range image from the external device via the transmission path;
- a conversion processing unit that performs conversion processing on the uncompressed image data of the second dynamic range image based on the predetermined number of knee position information to obtain uncompressed image data of the predetermined dynamic range image;
- the receiver is provided with an information transmission unit that transmits the number information of knee positions that can be supported by the device itself to the external device via the transmission path.
- the data receiving unit receives uncompressed image data of the second dynamic range image from the external device via the transmission path.
- the data receiving unit may receive uncompressed image data from an external device through a transmission path using a differential signal.
- the information receiving unit receives a predetermined number of knee position information regarding the conversion from the first dynamic range image to the second dynamic range image from the external device via the transmission path.
- the information receiving unit may extract a plurality of knee position information of the uncompressed image data received by the data receiving unit from the blanking period of the uncompressed image data.
- the information transmission unit transmits the number of knee positions that can be handled by the device itself to the external device via the transmission path.
- a storage unit that stores the number information of knee positions may be further provided, and the information transmission unit may acquire and transmit the number information of knee positions from the storage unit.
- in this way, in the present technology, a predetermined number of pieces of knee position information related to the conversion from the first dynamic range image to the second dynamic range image are received from the external device, the uncompressed image data of the second dynamic range image is subjected to conversion processing based on that knee position information, and the number information of knee positions that the device itself can handle is transmitted to the external device. Therefore, only as many pieces of knee position information as can be handled are received from the external device, and the dynamic range converted image can easily be converted into a desired image without performing processing for selecting which knee position information to use.
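The capability handshake described above — the sink reports how many knee positions it can handle, and the source limits the priority-ordered list accordingly — can be sketched as follows; the function names and the capability value of 2 are illustrative assumptions:

```python
def sink_capability():
    """Receiver side: report the number of knee positions this device
    can handle. In the embodiment this value would come from the
    storage unit (13g); 2 is an assumed value for illustration."""
    return 2


def source_select(points, capability):
    """Source side: limit the priority-ordered knee position list to
    the number of positions the sink reported."""
    return points[:capability]


points = [(250, 800), (100, 600), (450, 900)]  # already in priority order
sent = source_select(points, sink_capability())
```

Because the list is priority-ordered, truncation keeps the knee positions that matter most for the conversion.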
- FIG. 1 shows a configuration example of an AV (Audio Visual) system 10 as an embodiment.
- the AV system 10 includes a digital camera 11 as an imaging device for high dynamic range images, a BD (Blu-ray Disc) player 12 as an HDMI source device, and a television receiver 13 as an HDMI sink device.
- the digital camera 11 and the BD player 12 deliver a high dynamic range image via a storage medium 14 such as a BD disc or a memory card.
- the BD player 12 and the television receiver 13 are connected via an HDMI cable 15 as a transmission path.
- the digital camera 11 includes an imaging unit 11a that captures the first dynamic range image, a conversion unit 11b that converts the first dynamic range image into the second dynamic range image, a setting unit 11c that supplies conversion information used by the encoding unit 11d, and the encoding unit 11d.
- the setting unit 11c sets SPS (Sequence Parameter Set), PPS (Picture Parameter Set), VUI (video usability information), SEI (Supplemental Enhancement Information), and the like. Further, the setting unit 11c sets knee function information SEI (knee_function_info SEI) including conversion information as SEI according to an instruction from a producer or the like. The setting unit 11c supplies the set parameter set such as SPS, PPS, VUI, SEI to the encoding unit 11d.
- the BD player 12 reads the encoded data from the storage medium 14 and decodes it into an uncompressed image.
- the BD player 12 acquires conversion information of the high dynamic range image from the data decoded by the decoder 12d.
- the BD player 12 also includes an information transmission unit 12e that transmits the conversion information to the television receiver 13, an HDMI terminal 12a to which an HDMI transmission unit (HDMI TX) 12b and a high-speed bus interface (high-speed bus I/F) 12c are connected, and an information receiving unit 12f that receives from the television receiver 13 the number information of knee positions that the television receiver 13 can support.
- One end of the HDMI cable 15 is connected to the HDMI terminal 12a of the BD player 12, and the other end of the HDMI cable 15 is connected to the HDMI terminal 13a of the television receiver 13.
- the television receiver 13 includes an HDMI terminal 13a to which an HDMI receiving unit (HDMI RX) 13b and a high-speed bus interface (high-speed bus I/F) 13c are connected, an information receiving unit 13e that receives the conversion information of the uncompressed image transmitted from the BD player 12, a conversion unit 13d that converts the dynamic range of the uncompressed image received by the HDMI receiving unit 13b based on the received conversion information, an information transmission unit 13f that transmits to the BD player 12 the number information of knee positions that the television receiver 13 can handle, and a storage unit 13g that stores the number information of knee positions.
- “Example of syntax for knee function info SEI”
- HEVC: High Efficiency Video Coding
- JCTVC: Joint Collaborative Team on Video Coding
- Fig. 2 shows an example of syntax for knee function info SEI.
- a knee conversion ID (knee_function_id) and a knee conversion cancel flag (knee_function_cancel_flag) are set.
- the knee conversion ID is a unique ID indicating the purpose of the knee conversion, which is knee compression or knee expansion.
- the knee conversion cancel flag is a flag indicating whether or not to cancel the continuity of the previous knee function info SEI.
- the knee conversion cancel flag is set to a high level “1” when canceling the continuity of the previous knee function info SEI, and set to a low level “0” when not canceling.
- dynamic range conversion information is set in knee function info SEI.
- this dynamic range conversion information includes a persistence flag (knee_function_persistence_flag), a compression/decompression flag (mapping_flag), input image dynamic range information (input_d_range), input image display maximum luminance information (input_disp_luminance), output image dynamic range information (output_d_range), output display maximum luminance information (output_disp_luminance), and knee position number information (num_knee_point_minus1). In addition, pre-conversion position information (input_knee_point) and post-conversion position information (output_knee_point) are set for each knee position.
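For illustration only, the fields listed above can be gathered into a structure like the following; the defaults are taken from the example values used later in this description, and the real SEI message is a bitstream structure defined by HEVC, not a Python object:

```python
from dataclasses import dataclass, field
from typing import List, Tuple


@dataclass
class KneeFunctionInfoSEI:
    """Illustrative container for the knee function info SEI fields."""
    knee_function_id: int = 0
    knee_function_cancel_flag: int = 0
    knee_function_persistence_flag: int = 0
    mapping_flag: int = 1            # 1: knee compression, 0: knee expansion
    input_d_range: int = 4000        # input dynamic range, permillage
    input_disp_luminance: int = 800  # cd/m^2 (example value)
    output_d_range: int = 1000
    output_disp_luminance: int = 100
    # (input_knee_point, output_knee_point) pairs, permillage values
    knee_points: List[Tuple[int, int]] = field(default_factory=list)

    @property
    def num_knee_point_minus1(self) -> int:
        # The syntax element codes the count minus one.
        return len(self.knee_points) - 1
```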
- the persistence flag indicates whether the knee function info SEI, once sent, remains valid thereafter or applies only once.
- the persistence flag is set to low level “0” when it is valid only for the picture to which the knee function info SEI is attached, and set to high level “1” when it is valid until the stream switches or a new knee function info SEI arrives.
- the compression/decompression flag is a flag indicating whether the knee conversion is knee compression. When the number of knee positions is one, the flag can be inferred: if the pre-conversion position information is equal to or greater than the post-conversion position information, the knee conversion can be determined to be knee expansion; if the pre-conversion position information is smaller than the post-conversion position information, the knee conversion can be determined to be knee compression.
- when the number of knee positions is two or more, this determination cannot be made uniquely, so the compression/decompression flag is set. Note that the flag may be set even when the number of knee positions is one.
- the compression flag is set to high level “1” when knee conversion is knee compression, and is set to low level “0” when knee conversion is knee expansion.
- the knee position number information (num_knee_point_minus1) is a value obtained by subtracting 1 from the number of knee positions.
- the order i (i is an integer of 0 or more) in which the pre-conversion position information and post-conversion position information of the knee positions are set is ascending order of the pre-conversion position information.
- the pre-conversion position information (input_knee_point) is information indicating the knee position of the encoding target image before conversion in the dynamic range conversion, expressed in permillage of the knee position when the maximum luminance value of the encoding target image is taken as 1000.
- the knee position is a luminance other than 0 at the start point of a luminance range that is knee-converted at the same conversion rate within the luminance dynamic range of the encoding target image.
- the post-conversion position information (output_knee_point) is information representing the start point of the luminance range, in the converted image, corresponding to the luminance range knee-converted starting from the knee position. Specifically, the post-conversion position information is the permillage of the luminance of the converted image corresponding to the knee position when the maximum luminance value of the converted image is taken as 1000.
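The permillage convention used by input_knee_point and output_knee_point can be illustrated with a small helper (the function name is an assumption):

```python
def to_permille(luminance: float, max_luminance: float) -> int:
    """Express a luminance value in permillage of the maximum, as
    input_knee_point / output_knee_point do (maximum value = 1000)."""
    return round(luminance / max_luminance * 1000)


# Example from the figure described below: a knee at 40% of a 400%
# high dynamic range is 40 / 400 * 1000 = 100 in permillage terms.
```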
- FIG. 3 shows an example in which the first dynamic range image to be encoded is a high dynamic range image. The user converts the 0-40%, 40-100%, 100-180%, and 180-400% luminance ranges of the high dynamic range image into 0-60%, 60-80%, 80-90%, and 90-100%, respectively.
- the second dynamic range image obtained as a result of knee conversion is used as a desired converted image.
- in the knee function info SEI, 100 is set as the pre-conversion position information (input_knee_point[0]) of the first knee position, and 600 is set as the post-conversion position information (output_knee_point[0]).
- likewise, 250 is set as the pre-conversion position information (input_knee_point[1]) of the second knee position, and 800 is set as the post-conversion position information (output_knee_point[1]).
- further, 450 is set as the pre-conversion position information (input_knee_point[2]) of the third knee position, and 900 is set as the post-conversion position information (output_knee_point[2]).
- in the knee function info SEI, 4000 is set as the input image dynamic range information (input_d_range), and 800 (cd/m²) is set as the input image display maximum luminance information (input_disp_luminance).
- the compression/decompression flag is assumed to be 1.
- the television receiver 13 recognizes that the post-conversion luminances (output_knee_point) of the first to third knee positions are 60%, 80%, and 90%, respectively. Also, the television receiver 13 recognizes from the input image dynamic range information that the maximum luminance value of the encoding target image is 400%.
- the television receiver 13 knee-converts the 0-40%, 40-100%, 100-180%, and 180-400% luminance ranges of the high dynamic range image obtained as a result of decoding into 0-60%, 60-80%, 80-90%, and 90-100%, respectively, by connecting the knee positions in the order in which they are set. As a result, the television receiver 13 can convert the high dynamic range image obtained as a result of decoding into the desired second dynamic range image.
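The receiver-side conversion described above — connecting the knee positions in order and interpolating linearly between them — can be sketched as a piecewise-linear mapping in permillage terms (an illustrative reading, not the patent's exact implementation):

```python
def knee_convert(x: float, knee_points, in_max=1000, out_max=1000):
    """Piecewise-linear knee conversion.

    x is an input luminance in permillage; knee_points is a list of
    (input, output) permillage pairs. Segments run from (0, 0) through
    the knee points (in ascending input order) to (in_max, out_max).
    """
    pts = [(0, 0)] + sorted(knee_points) + [(in_max, out_max)]
    for (x0, y0), (x1, y1) in zip(pts, pts[1:]):
        if x <= x1:
            # Linear interpolation within this segment.
            return y0 + (y1 - y0) * (x - x0) / (x1 - x0)
    return out_max


knees = [(100, 600), (250, 800), (450, 900)]
# 40% of a 400% range is permillage 100, which maps to 600 (i.e. 60%).
```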
- when a plurality of knee positions are set, the conversion rate can be set more finely than when a single knee position is set, so more accurate knee conversion can be performed. However, when the transmission path lacks the capacity to transmit all of the knee position information at one time and the information is set in ascending order of the pre-conversion position as described above, some knee position information may not be transmitted.
- FIG. 4 shows an example of a method for determining the priority.
- the descending order of the lengths of the perpendicular lines is d[2] > d[1] > d[3].
- accordingly, the pairs of pre-conversion position information and post-conversion position information are arranged in the knee function info SEI in the order (250, 800), (100, 600), (450, 900). In this case, the priority order of the plurality of pieces of knee position information is determined based on the compression/decompression rate of the knee position indicated by each piece of knee position information.
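Reading d[i] in FIG. 4 as the perpendicular distance from each knee point to the no-conversion diagonal (an assumption consistent with the resulting order), the priority determination can be sketched as:

```python
import math


def perpendicular_distance(point):
    """Distance from an (input, output) knee point to the identity line
    output = input; a larger distance means stronger compression or
    expansion at that knee position."""
    inp, out = point
    return abs(out - inp) / math.sqrt(2)


points = [(100, 600), (250, 800), (450, 900)]
d = [perpendicular_distance(p) for p in points]
order = sorted(points, key=perpendicular_distance, reverse=True)
# d for (250, 800) is largest, then (100, 600), then (450, 900),
# giving the priority order (250, 800), (100, 600), (450, 900).
```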
- FIG. 5 shows a configuration example of the HDMI transmission unit 12b of the BD player 12 and the HDMI reception unit 13b of the television receiver 13 in the AV system 10 of FIG.
- the HDMI transmission unit 12b transmits, in the effective image section 21 (hereinafter referred to as the “active video section” as appropriate), which is the section obtained by removing the horizontal blanking section 22 and the vertical blanking section 23 from the section between one vertical synchronization signal and the next, differential signals corresponding to the pixel data of an uncompressed image for one screen, in one direction to the HDMI receiving unit 13b through a plurality of channels.
- the HDMI transmission unit 12b also transmits, through a plurality of channels, differential signals corresponding to at least the audio data, control data, and other auxiliary data associated with the image, in one direction to the HDMI receiving unit 13b, in the horizontal blanking interval 22 or the vertical blanking interval 23.
- the HDMI transmission unit 12b includes an HDMI transmitter 31.
- the transmitter 31 converts, for example, pixel data of an uncompressed image into corresponding differential signals, and serially transmits them in one direction to the HDMI receiving unit 13b on three TMDS (Transition Minimized Differential Signaling) channels #0, #1, and #2.
- the transmitter 31 also converts the audio data accompanying the uncompressed image, as well as necessary control data and other auxiliary data, into corresponding differential signals, and serially transmits them in one direction to the HDMI receiving unit 13b on the three TMDS channels #0, #1, and #2.
- the HDMI receiving unit 13b receives, in the active video section 21 (see FIG. 6), the differential signals corresponding to pixel data transmitted in one direction from the HDMI transmitting unit 12b through a plurality of channels. The HDMI receiving unit 13b also receives, in the horizontal blanking interval 22 (see FIG. 6) or the vertical blanking interval 23 (see FIG. 6), the differential signals corresponding to audio data and control data transmitted in one direction from the HDMI transmitting unit 12b through a plurality of channels.
- the transmission channels of the HDMI system comprising the HDMI transmission unit 12b and the HDMI reception unit 13b include the three TMDS channels #0 to #2 as transmission channels for transmitting pixel data and audio data, a TMDS clock channel as a transmission channel for transmitting the pixel clock, and, in addition, transmission channels called the DDC (Display Data Channel) 33 and the CEC (Consumer Electronics Control) line 34.
- the DDC 33 is composed of two signal lines included in the HDMI cable 15 and is used by the HDMI transmission unit 12b to read the E-EDID (Enhanced Extended Display Identification Data) from the HDMI receiving unit 13b connected via the HDMI cable 15.
- the HDMI receiving unit 13b has an EDID ROM (Read Only Memory) that stores the E-EDID, which is performance information describing its own configuration and capability (Configuration/Capability).
- the HDMI transmitting unit 12b reads the E-EDID of the HDMI receiving unit 13b from the HDMI receiving unit 13b connected via the HDMI cable 15 through the DDC 33.
- the HDMI transmitting unit 12b recognizes the performance of the HDMI receiving unit 13b based on the E-EDID; that is, it recognizes, for example, the image formats (profiles) supported by the electronic device having the HDMI receiving unit 13b, such as RGB, YCbCr 4:4:4, YCbCr 4:2:2, and the like.
- the CEC line 34 is composed of one signal line included in the HDMI cable 15 and is used for bidirectional communication of control data between the HDMI transmission unit 12b and the HDMI reception unit 13b. Further, the HDMI cable 15 includes an HPD line 35 connected to a pin called HPD (Hot Plug Detect).
- the source device can detect the connection of the sink device by the DC bias potential using the HPD line 35; that is, the HPD line 35 has a function of notifying the source device of the connection state of the sink device by means of a DC bias potential.
- the HDMI cable 15 includes a line (power line) 36 used for supplying power from the source device to the sink device. Further, the HDMI cable 15 includes a reserved line 37. A pair of differential transmission paths may be configured using the HPD line 35 and the reserved line 37 and used as a bidirectional communication path.
- FIG. 6 shows the sections of the various transmission data when image data of 1920 pixels (horizontal) × 1080 lines (vertical) is transmitted over the TMDS channels.
- in the period in which transmission data is transmitted over the three TMDS channels of HDMI, there are three types of sections according to the kind of transmission data: the video data section 24 (Video Data Period), the data island section 25 (Data Island Period), and the control section 26 (Control Period).
- the video field period is the period from the rising edge (Active Edge) of one vertical synchronizing signal to the rising edge of the next vertical synchronizing signal, and is divided into the horizontal blanking period 22 (Horizontal Blanking), the vertical blanking period 23 (Vertical Blanking), and the effective pixel section 21 (Active Video), which is the section obtained by removing the horizontal blanking period and the vertical blanking period from the video field section.
- the video data section 24 is assigned to the effective pixel section 21.
- in the video data section 24, the data of the 1920 pixels × 1080 lines of effective pixels (Active Pixels) constituting uncompressed image data for one screen is transmitted.
- the data island period 25 and the control period 26 are assigned to the horizontal blanking period 22 and the vertical blanking period 23, and auxiliary data (Auxiliary Data) is transmitted in them.
- the data island section 25 is allocated to a part of the horizontal blanking period 22 and the vertical blanking period 23, and the auxiliary data not related to control, for example audio data packets, is transmitted in it.
- the control section 26 is allocated to the other portions of the horizontal blanking period 22 and the vertical blanking period 23, and the auxiliary data related to control, such as the vertical synchronization signal, the horizontal synchronization signal, and control packets, is transmitted in it.
- FIG. 7 shows a specific configuration example of the digital camera 11.
- the digital camera 11 includes an imager 121, an imager driver 122, an imaging signal processing circuit 123, a camera control CPU 124, a still image signal processing circuit 125, a moving image signal processing circuit 126, a recording/playback unit 128, and a recording medium 129.
- the digital camera 11 includes a system control CPU 130, a flash ROM 131, an SDRAM 132, a user operation unit 133, a microphone 134, an audio signal processing circuit 135, a graphic generation circuit 141, a panel drive circuit 136, a display panel 137, a display control unit 142, and a power supply unit 143.
- the imager 121 is configured by, for example, a CMOS image sensor or a CCD image sensor.
- the imager driver 122 drives the imager 121.
- the imaging signal processing circuit 123 processes the imaging signal obtained by the imager 121 to generate image data (captured image data) corresponding to the subject.
- the camera control CPU 124 controls the operations of the imager driver 122 and the imaging signal processing circuit 123. In this example, the camera control CPU 124 is provided separately from the system control CPU 130, but the two may be integrated as a single chip or as a plurality of cores.
- the still image signal processing circuit 125 generates still image data by performing, for example, JPEG (Joint Photographic Experts Group) compression encoding processing on the image data obtained by the imaging signal processing circuit 123 at the time of capturing a still image.
- the audio signal processing circuit 135 performs processing such as A / D conversion on the audio signal obtained by the microphone 134 to obtain audio data corresponding to the captured image data.
- the moving image signal processing circuit 126 compresses and encodes the image data obtained by the imaging signal processing circuit 123, together with the audio data obtained by the audio signal processing circuit 135, in accordance with the recording media format when capturing a moving image, and thereby generates moving image data to which audio data is added.
- in the moving image signal processing circuit 126, for example, HEVC encoding is performed.
- the processing of the setting unit 11c and the conversion unit 11b described above is also performed.
- the first dynamic range image captured by the imager 121 is converted into the second dynamic range image, and the second dynamic range image is encoded to generate encoded data.
- conversion information (knee function info SEI) for converting the first dynamic range image into the second dynamic range image is added to the encoded data.
- This conversion information includes a plurality of pieces of knee position information in a state where priority is given.
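The prioritized knee position information described above can be pictured with a small sketch. The class and field names here are illustrative assumptions for this sketch, not the actual knee function info SEI syntax; the point is that each knee point carries a priority so a receiver can consume only as many points as it supports, most important first.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class KneePosition:
    priority: int      # lower value = higher priority (assumed convention)
    input_pos: float   # knee input position, normalized to 0..1
    output_pos: float  # knee output position, normalized to 0..1

@dataclass
class KneeFunctionInfo:
    # Conversion information carrying a plurality of knee positions
    # in a prioritized state, as described in the text.
    knee_positions: List[KneePosition] = field(default_factory=list)

    def ordered(self) -> List[KneePosition]:
        # Return the knee positions in priority order, so a receiver
        # supporting N points can simply take the first N.
        return sorted(self.knee_positions, key=lambda k: k.priority)
```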
- the recording/reproducing unit 128 writes the still image data generated by the still image signal processing circuit 125 to the loaded recording medium (storage medium) 129 or, via the system control CPU 130, to the flash ROM 131 when capturing a still image.
- the recording medium 129 is, for example, a removable recording medium such as a BD disk or a memory card.
- the recording / playback unit 128 writes the moving image data generated by the moving image signal processing circuit 126 to the loaded recording medium 129 or to the flash ROM 131 via the system control CPU 130 when capturing a moving image.
- the recording/reproducing unit 128 reads still image data or moving image data from the recording medium 129 and decodes it to obtain reproduced image data when reproducing a still image or a moving image.
- the graphics generation circuit 141 performs graphics data superimposition processing or the like on the image data output from the imaging signal processing circuit 123 or the playback image data generated by the recording / playback unit 128 as necessary.
- the panel driving circuit 136 drives the display panel 137 based on the output image data of the graphic generation circuit 141 and displays a captured image (moving image) or a reproduced image (still image, moving image) on the display panel 137.
- the display control unit 142 controls the display on the display panel 137 by controlling the graphics generation circuit 141 and the panel drive circuit 136.
- the display panel 137 includes, for example, an LCD (Liquid Crystal Display), an organic EL (Organic Electro-Luminescence) panel, or the like.
- the display control unit 142 is provided in addition to the system control CPU 130.
- the system control CPU 130 may directly control the display on the display panel 137.
- the system control CPU 130 and the display control unit 142 may be a single chip or a plurality of cores.
- the power supply unit 143 supplies power to each unit of the digital camera 11.
- the power supply unit 143 may be an AC power supply or a battery (storage battery, dry battery).
- the system control CPU 130 controls operations of the still image signal processing circuit 125, the moving image signal processing circuit 126, the recording / reproducing unit 128, and the like.
- a flash ROM 131, SDRAM 132, and user operation unit 133 are connected to the system control CPU 130.
- the flash ROM 131 stores a control program of the system control CPU 130 and the like.
- the SDRAM 132 is used for temporary storage of data necessary for the control processing of the system control CPU 130.
- the user operation unit 133 constitutes a user interface.
- the user operation unit 133 includes, for example, a switch, a wheel, a touch panel unit that inputs an instruction by proximity / touch, a mouse, a keyboard, a gesture input unit that detects an instruction input by a camera, and a voice input unit that inputs an instruction by voice. Further, it may be a remote control.
- the system control CPU 130 determines the operation state of the user operation unit 133 and controls the operation of the digital camera 11. The user can perform various additional information input operations and the like in addition to the imaging (recording) operation and the reproduction operation by the user operation unit 133.
- the imaging signal obtained by the imager 121 is supplied to the imaging signal processing circuit 123 and processed, and the imaging signal processing circuit 123 obtains image data (captured image data) corresponding to the subject.
- the still image signal processing circuit 125 performs compression encoding processing on the image data output from the imaging signal processing circuit 123 to generate still image data.
- This still image data is recorded on the recording medium 129 by the recording / reproducing unit 128.
- the moving image signal processing circuit 126 compression-encodes the image data output from the imaging signal processing circuit 123, together with the audio data output from the audio signal processing circuit 135, in accordance with the recording media format, and generates moving image data to which audio data is added. This moving image data is recorded on the recording medium 129 by the recording/reproducing unit 128.
- still image data is read from the recording medium 129 and subjected to processing such as decoding by the still image signal processing circuit 125 to obtain reproduced image data.
- the reproduced image data is supplied to the panel drive circuit 136 via the system control CPU 130 and the moving image signal processing circuit 126, and a still image is displayed on the display panel 137.
- moving image data is read from the recording medium 129 by the recording / reproducing circuit 128 and subjected to processing such as decoding by the moving image signal processing circuit 126 to obtain reproduced image data.
- the reproduced image data is supplied to the panel drive circuit 136, and a moving image is displayed on the display panel 137.
- FIG. 8 shows a configuration example of the BD player 12.
- the BD player 12 includes an HDMI terminal 12a, an HDMI transmission unit 12b, and a high-speed bus interface 12c.
- the BD player 12 includes a CPU (Central Processing Unit) 204, an internal bus 205, a flash ROM (Read Only Memory) 206, an SDRAM (Synchronous Random Access Memory) 207, a remote control receiving unit 208, and a remote control transmitter 209.
- the BD player 12 includes a storage (recording) medium control interface 210, a BD (Blu-ray Disc) drive 211a, an HDD (Hard Disk Drive) 211b, an SSD (Solid State Drive) 211c, an Ethernet interface (Ethernet I/F) 212, and a network terminal 213.
- the BD player 12 includes an MPEG (Moving Picture Experts Group) decoder 215, a graphic generation circuit 216, a video output terminal 217, and an audio output terminal 218.
- the BD player 12 may include a display control unit 221, a panel drive circuit 222, a display panel 223, and a power supply unit 224.
- "Ethernet" is a registered trademark.
- the high-speed bus interface 12c, the CPU 204, the flash ROM 206, the SDRAM 207, the remote control receiving unit 208, the storage medium control interface 210, the Ethernet interface 212, and the MPEG decoder 215 are connected to the internal bus 205.
- the CPU 204 controls the operation of each part of the BD player 12.
- the flash ROM 206 stores control software and data.
- the SDRAM 207 constitutes a work area for the CPU 204.
- the CPU 204 develops software and data read from the flash ROM 206 on the SDRAM 207 to activate the software, and controls each part of the BD player 12.
- the remote control receiving unit 208 receives a remote control signal (remote control code) transmitted from the remote control transmitter 209 and supplies it to the CPU 204.
- the CPU 204 controls each part of the BD player 12 according to the remote control code.
- in this embodiment, a remote control unit is shown as the user instruction input unit, but the user instruction input unit may have other configurations, for example, a switch, a wheel, a touch panel unit that inputs an instruction by proximity/touch, a mouse, a keyboard, a gesture input unit that detects an instruction input with a camera, a voice input unit that inputs an instruction by voice, or the like.
- the BD drive 211a records content data on a BD disc as a disc-shaped recording medium, or reproduces content data from the BD disc.
- the HDD 211b records content data or reproduces the content data.
- the SSD 211c records content data in a semiconductor memory such as a memory card or reproduces content data from the semiconductor memory.
- the BD drive 211a, HDD 211b, and SSD 211c are connected to the internal bus 205 via the storage medium control interface 210.
- a SATA interface is used as an interface for the BD drive 211a and the HDD 211b.
- a SATA interface or a PCIe interface is used as an interface for the SSD 211c.
- the MPEG decoder 215 performs decoding processing on the MPEG2 stream reproduced by the BD drive 211a, HDD 211b, or SSD 211c to obtain image and audio data.
- the graphic generation circuit 216 performs graphics data superimposition processing on the image data obtained by the MPEG decoder 215 as necessary.
- the video output terminal 217 outputs image data output from the graphic generation circuit 216.
- the audio output terminal 218 outputs audio data obtained by the MPEG decoder 215.
- the panel drive circuit 222 drives the display panel 223 based on the video (image) data output from the graphic generation circuit 216.
- the display control unit 221 controls the display on the display panel 223 by controlling the graphics generation circuit 216 and the panel drive circuit 222.
- the display panel 223 includes, for example, an LCD (Liquid Crystal Display), a PDP (Plasma Display Panel), an organic EL (Organic Electro-Luminescence) panel, or the like.
- the display control unit 221 may directly control the display on the display panel 223.
- the CPU 204 and the display control unit 221 may be a single chip or a plurality of cores.
- the power supply unit 224 supplies power to each unit of the BD player 12.
- the power supply unit 224 may be an AC power supply or a battery (storage battery, dry battery).
- the HDMI transmission unit (HDMI source) 12b transmits baseband image (video) and audio data from the HDMI terminal 12a through HDMI-compliant communication.
- the high-speed bus interface 12c is a bidirectional communication path interface configured using predetermined lines (in this embodiment, the reserved line and the HPD line) constituting the HDMI cable 15.
- the processing of the information transmission unit 12e and the information reception unit 12f described above is also performed. That is, the conversion information associated with the uncompressed image data, obtained when the MPEG decoder 215 decodes the encoded data of the second dynamic range image reproduced from the storage medium 14 from the digital camera 11, is transmitted to the television receiver 13. Further, the number information of the knee positions that the television receiver 13 can support is received from the television receiver 13.
- the conversion information from the first dynamic range image to the second dynamic range image is inserted during the blanking period of the uncompressed image data.
- This conversion information includes knee position information, but is limited to the number of knee positions that can be handled by the television receiver 13.
- the MPEG decoder 215 extracts, from the encoded data, the plurality of pieces of knee position information to which priorities have been given.
- from the plurality of pieces of knee position information extracted by the MPEG decoder 215 in association with the uncompressed image data in this manner, the HDMI transmitting unit 12b selects knee position information in priority order, up to the number of knee positions that the television receiver 13 can support. The conversion information inserted in the blanking period of the uncompressed image data transmitted to the television receiver 13 then includes the selected pieces of knee position information.
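The selection step described above can be sketched as follows, assuming an illustrative (priority, input_pos, output_pos) tuple representation rather than the actual SEI syntax; the function name is our own.

```python
def select_knee_positions(knee_positions, supported_count):
    """Select the knee position information to send to the sink.

    `knee_positions` is the plurality of prioritized knee points
    extracted from the encoded data, as (priority, input_pos,
    output_pos) tuples; `supported_count` is the number of knee
    positions the television receiver reports it can handle.
    A lower priority value is assumed to mean higher priority.
    """
    by_priority = sorted(knee_positions, key=lambda k: k[0])
    # Keep only as many points as the sink can handle,
    # starting from the highest-priority ones.
    return by_priority[:supported_count]
```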
- the high-speed bus interface 12c is inserted between the Ethernet interface 212 and the HDMI terminal 12a.
- the high-speed bus interface 12c transmits transmission data supplied from the CPU 204 from the HDMI terminal 12a to the counterpart device via the HDMI cable 15.
- the high-speed bus interface 12c supplies the CPU 204 with received data received from the counterpart device over the HDMI cable 15 via the HDMI terminal 12a.
- the operation of the BD player 12 shown in FIG. 8 will be briefly described.
- content data to be recorded is acquired via a digital tuner (not shown), from the network terminal 213 via the Ethernet interface 212, or from the HDMI terminal 12a via the high-speed bus interface 12c.
- This content data is input to the storage medium interface 210 and recorded on the BD disc by the BD drive 211a, on the HDD 211b, or on the semiconductor memory by the SSD 211c.
- content data (MPEG stream) reproduced by the BD drive 211a, HDD 211b, or SSD 211c is supplied to the MPEG decoder 215 via the storage medium interface 210.
- the MPEG decoder 215 performs decoding processing on the reproduced content data to obtain baseband image and audio data.
- the image data is output to the video output terminal 217 through the graphic generation circuit 216.
- the audio data is output to the audio output terminal 218.
- the image data obtained by the MPEG decoder 215 is supplied to the panel drive circuit 222 through the graphic generation circuit 216 in accordance with a user operation, and the reproduced image is displayed on the display panel 223.
- audio data obtained by the MPEG decoder 215 is supplied to a speaker (not shown) according to a user operation, and audio corresponding to the reproduced image is output.
- when the image and audio data obtained by the MPEG decoder 215 are transmitted through the HDMI TMDS channels at the time of reproduction, they are supplied to the HDMI transmission unit 12b, packed, and output from the HDMI transmission unit 12b to the HDMI terminal 12a.
- when the content data reproduced by the BD drive 211a is transmitted to the network at the time of reproduction, the content data is output to the network terminal 213 via the Ethernet interface 212.
- when the content data reproduced by the BD drive 211a is sent to the bidirectional communication path of the HDMI cable 15, the content data is output to the HDMI terminal 12a via the high-speed bus interface 12c.
- before the image data is output, it may be encrypted using a copyright protection technology such as HDCP, DTCP, or DTCP+ and then transmitted.
- FIG. 9 shows a configuration example of the television receiver 13.
- the television receiver 13 includes an HDMI terminal 13a, an HDMI receiving unit 13b, and a high-speed bus interface 13c.
- the television receiver 13 includes an antenna terminal 305, a digital tuner 306, an MPEG decoder 307, a video signal processing circuit 308, a graphic generation circuit 309, a panel drive circuit 310, and a display panel 311.
- the television receiver 13 includes an audio signal processing circuit 312, an audio amplification circuit 313, a speaker 314, an internal bus 320, a CPU 321, a flash ROM 322, and an SDRAM (Synchronous Random Access Memory) 323.
- the television receiver 13 includes an Ethernet interface (Ethernet I / F) 324, a network terminal 325, a remote control receiving unit 326, and a remote control transmitter 327.
- the television receiver 13 includes a display control unit 331 and a power supply unit 332. "Ethernet" is a registered trademark.
- the antenna terminal 305 is a terminal for inputting a television broadcast signal received by a receiving antenna (not shown).
- the digital tuner 306 processes the television broadcast signal input to the antenna terminal 305 and extracts a partial TS (Transport Stream) (TS packets of video data and TS packets of audio data) corresponding to the user's selected channel from a predetermined transport stream.
- the digital tuner 306 takes out PSI/SI (Program Specific Information / Service Information) from the obtained transport stream and outputs it to the CPU 321.
- extracting the partial TS of an arbitrary channel from the plurality of transport streams obtained by the digital tuner 306 becomes possible by obtaining the packet ID (PID) information of that channel from the PSI/SI (PAT/PMT).
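As a rough sketch of this PID-based filtering (the packet layout follows the MPEG-2 transport stream format: 188-byte packets, a 0x47 sync byte, and a 13-bit PID in the low 5 bits of byte 1 plus byte 2; the function name is our own):

```python
TS_PACKET_SIZE = 188  # bytes, fixed by the MPEG-2 TS format
SYNC_BYTE = 0x47

def extract_partial_ts(ts_packets, pids):
    """Keep only the TS packets whose PID belongs to the selected
    channel, e.g. the video and audio PIDs obtained from the
    PSI/SI (PAT/PMT)."""
    partial = []
    for pkt in ts_packets:
        if len(pkt) != TS_PACKET_SIZE or pkt[0] != SYNC_BYTE:
            continue  # skip anything that is not a valid TS packet
        # 13-bit PID: low 5 bits of byte 1, all 8 bits of byte 2.
        pid = ((pkt[1] & 0x1F) << 8) | pkt[2]
        if pid in pids:
            partial.append(pkt)
    return partial
```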
- the MPEG decoder 307 performs a decoding process on a video PES (Packetized Elementary Stream) packet composed of TS packets of video data obtained by the digital tuner 306 to obtain image data. Also, the MPEG decoder 307 performs a decoding process on the audio PES packet constituted by the TS packet of the audio data obtained by the digital tuner 306 to obtain audio data.
- the video signal processing circuit 308 and the graphic generation circuit 309 perform scaling processing (resolution conversion processing) and graphics data superimposition processing, as necessary, on the image data obtained by the MPEG decoder 307 or the image data received by the HDMI receiving unit 13b.
- the panel drive circuit 310 drives the display panel 311 based on the video (image) data output from the graphic generation circuit 309.
- the display control unit 331 controls the display on the display panel 311 by controlling the graphics generation circuit 309 and the panel drive circuit 310.
- the display panel 311 includes, for example, an LCD (Liquid Crystal Display), a PDP (Plasma Display Panel), an organic EL (Organic Electro-Luminescence) panel, and the like.
- the display control unit 331 may directly control the display on the display panel 311.
- the CPU 321 and the display control unit 331 may be a single chip or a plurality of cores.
- the power supply unit 332 supplies power to each unit of the television receiver 13.
- the power supply unit 332 may be an AC power supply or a battery (storage battery, dry battery).
- the audio signal processing circuit 312 performs necessary processing such as D / A conversion on the audio data obtained by the MPEG decoder 307.
- the audio amplifier circuit 313 amplifies the audio signal output from the audio signal processing circuit 312 and supplies the amplified audio signal to the speaker 314.
- the speaker 314 may be monaural or stereo. Further, the number of speakers 314 may be one, or two or more.
- the speaker 314 may be an earphone or a headphone. Moreover, the speaker 314 may correspond to 2.1 channel, 5.1 channel, or the like.
- the speaker 314 may be connected to the television receiver 13 wirelessly.
- the speaker 314 may be another device.
- the CPU 321 controls the operation of each part of the television receiver 13.
- the flash ROM 322 stores control software and data.
- the SDRAM 323 constitutes a work area for the CPU 321.
- the CPU 321 develops software and data read from the flash ROM 322 on the SDRAM 323 to activate the software, and controls each unit of the television receiver 13.
- the remote control receiving unit 326 receives the remote control signal (remote control code) transmitted from the remote control transmitter 327 and supplies it to the CPU 321.
- the CPU 321 controls each part of the television receiver 13 based on this remote control code.
- in this embodiment, a remote control unit is shown as the user instruction input unit, but the user instruction input unit may have other configurations, for example, a touch panel unit that inputs an instruction by proximity/touch, a mouse, a keyboard, a gesture input unit that detects an instruction input with a camera, a voice input unit that inputs an instruction by voice, or the like.
- the network terminal 325 is a terminal connected to the network, and is connected to the Ethernet interface 324.
- the high-speed bus interface 13c, the CPU 321, the flash ROM 322, the SDRAM 323, the Ethernet interface 324, the MPEG decoder 307, and the display control unit 331 are connected to the internal bus 320.
- the HDMI receiving unit (HDMI sink) 13b receives baseband image (video) and audio data supplied to the HDMI terminal 13a via the HDMI cable 15 by communication conforming to HDMI.
- the high-speed bus interface 13c is a bidirectional communication path interface configured using predetermined lines (the reserved line and the HPD line in this embodiment) constituting the HDMI cable 15, in the same manner as the high-speed bus interface 12c of the BD player 12 described above.
- the above-described processing of the information receiving unit 13e and the information transmitting unit 13f is also performed. That is, the number information of knee positions that can be handled by the own device is transmitted to the BD player 12.
- the number information of knee positions that can be supported by the own apparatus is stored in the EDID-ROM in the HDMI receiving unit 13b.
- also, the conversion information from the first dynamic range image to the second dynamic range image, which is inserted in the blanking period of the uncompressed image data received from the BD player 12, is extracted.
- This conversion information includes knee position information corresponding to the number of knee positions that can be supported by the device itself.
- dynamic range conversion processing is performed on the uncompressed image data received by the HDMI receiving unit 13b based on the knee position information.
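The dynamic range conversion based on knee position information can be sketched as a piecewise-linear mapping. This is an illustrative model only, assuming normalized samples and knee points strictly inside the (0, 1) range; the function name and representation are our own.

```python
def knee_convert(sample, knee_points):
    """Apply a piecewise-linear knee conversion to one normalized
    sample in 0..1. `knee_points` is a list of (input_pos,
    output_pos) pairs taken from the extracted knee position
    information; the curve passes through (0, 0), each knee point
    in order, and (1, 1), with linear segments in between.
    """
    points = [(0.0, 0.0)] + sorted(knee_points) + [(1.0, 1.0)]
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        if sample <= x1:
            # Linear interpolation within the current segment.
            return y0 + (y1 - y0) * (sample - x0) / (x1 - x0)
    return 1.0
```

With a single knee point at (0.5, 0.8), samples below the knee are expanded and samples above it are compressed, which is the typical shape of a knee compression curve.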
- the high-speed bus interface 13c is inserted between the Ethernet interface 324 and the HDMI terminal 13a.
- the high-speed bus interface 13c transmits the transmission data supplied from the CPU 321 to the counterpart device from the HDMI terminal 13a via the HDMI cable 15.
- the high-speed bus interface 13c supplies the CPU 321 with received data received from the counterpart device from the HDMI cable 15 via the HDMI terminal 13a.
- when the received content data is sent to the network, the content data is output to the network terminal 325 via the Ethernet interface 324.
- when the received content data is sent out to the bidirectional communication path of the HDMI cable 15, the content data is output to the HDMI terminal 13a via the high-speed bus interface 13c.
- before being output, the content data may be encrypted using a copyright protection technology such as HDCP, DTCP, or DTCP+.
- a television broadcast signal input to the antenna terminal 305 is supplied to the digital tuner 306.
- the digital tuner 306 processes the television broadcast signal and outputs a predetermined transport stream corresponding to the user's selected channel. The partial TS (TS packets of video data, TS packets of audio data) is extracted from the transport stream and supplied to the MPEG decoder 307.
- the video PES packet constituted by the TS packet of video data is decoded to obtain video data.
- the video data is subjected to scaling processing (resolution conversion processing), dynamic range processing, graphics data superimposition processing, and the like in the video signal processing circuit 308 and the graphic generation circuit 309 as necessary, and is then supplied to the panel drive circuit 310. As a result, an image corresponding to the user's selected channel is displayed on the display panel 311.
- audio data is obtained by performing a decoding process on the audio PES packet constituted by the TS packet of audio data.
- the audio data is subjected to necessary processing such as D / A conversion by the audio signal processing circuit 312, further amplified by the audio amplification circuit 313, and then supplied to the speaker 314. Therefore, sound corresponding to the user's selected channel is output from the speaker 314.
- content data (image data, audio data) supplied from the network terminal 325 to the Ethernet interface 324 or supplied from the HDMI terminal 13a via the high-speed bus interface 13c is supplied to the MPEG decoder 307. Thereafter, the operation is the same as that when receiving the television broadcast signal described above, an image is displayed on the display panel 311, and sound is output from the speaker 314.
- the HDMI receiving unit 13b acquires image data and audio data transmitted from the BD player 12 connected to the HDMI terminal 13a via the HDMI cable 15.
- the image data is supplied to the video signal processing circuit 308.
- the audio data is supplied to the audio signal processing circuit 312. Thereafter, the operation is the same as that when receiving the television broadcast signal described above, an image is displayed on the display panel 311, and sound is output from the speaker 314.
- the BD player 12 receives the number information of knee positions that can be supported by the television receiver 13 from the television receiver 13 via the HDMI cable 15.
- the television receiver 13 stores the number information of knee positions to which the television receiver 13 corresponds in the storage unit, and transmits the number information of knee positions to the BD player 12 via the HDMI cable 15.
- conventionally, there was no specification defining the number of pieces of knee position information, and there was no compatibility between manufacturers.
- based on the knee position number information received from the television receiver 13, the BD player 12 selects, from the plurality of pieces of knee position information acquired together with the uncompressed image data by the MPEG decoder 215, as many pieces as the television receiver 13 can handle, and transmits them to the television receiver 13 via the HDMI cable 15.
- the plurality of pieces of knee position information acquired by the MPEG decoder 215 are each given a priority and are selected in order of priority.
- the television receiver 13 receives, from the BD player 12 via the HDMI cable 15, uncompressed image data together with conversion information containing as many pieces of knee position information as the television receiver 13 can handle.
- the television receiver 13 processes the received uncompressed image data based on the received conversion information, and generates display image data.
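The receiving-side conversion described above can be pictured as mapping each luminance sample through a knee curve defined by the received knee position pairs. The following is only an illustrative sketch: the piecewise-linear interpretation of the curve, the normalization of samples and knee points to [0, 1], and the function name `knee_convert` are assumptions, not details taken from this specification.

```python
# Hedged sketch of a knee-style dynamic range conversion on the receiving side.
# Knee points are (input_knee_point, output_knee_point) pairs normalized to
# [0, 1]; the curve is assumed piecewise-linear through (0,0) and (1,1).
def knee_convert(sample, knee_points):
    """Map a normalized luminance sample through a piecewise-linear knee curve."""
    points = [(0.0, 0.0)] + sorted(knee_points) + [(1.0, 1.0)]
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        if sample <= x1:
            if x1 == x0:
                return y1
            # linear interpolation within the current segment
            return y0 + (sample - x0) * (y1 - y0) / (x1 - x0)
    return 1.0
```

Applying `knee_convert` to every sample of the uncompressed image data would yield display image data with the desired dynamic range, which is the role of the conversion processing described above.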
- FIG. 10 shows an example of the data structure of E-EDID.
- This E-EDID is composed of a basic block and an extended block.
- at the head of the basic block, data defined by the E-EDID1.3 standard and represented by “E-EDID1.3 Basic Structure” is arranged, followed by timing information for maintaining compatibility with the conventional EDID, represented by “Preferred timing”, and timing information different from “Preferred timing” for maintaining compatibility with the conventional EDID, represented by “2nd timing”.
- following “2nd timing”, the basic block contains, in order, information indicating the name of the display device, represented by “Monitor NAME”, and information indicating the number of displayable pixels for the aspect ratios 4:3 and 16:9, represented by “Monitor Range Limits”.
- information such as displayable image size (resolution), frame rate, interlaced or progressive, and aspect ratio, represented by “Short Video Descriptor”, is described.
- data describing information such as the reproducible audio codec system, sampling frequency, cut-off band, and codec bit count, represented by “Short Audio Descriptor”, and information about the left and right speakers, represented by “Speaker Allocation”, are arranged in order.
- following “Speaker Allocation”, the extension block contains timing information for maintaining compatibility with the conventional EDID, represented by “3rd timing”, data uniquely defined for each manufacturer, represented by “Vendor Specific”, and timing information for maintaining compatibility with the conventional EDID, represented by “4th timing”.
- FIG. 11 shows an example of the data structure of the VSDB area.
- the VSDB area is provided with a 0th block to an Nth block, each of which is a 1-byte block.
- the 4th bit of the 8th byte defines a presence/absence flag for the knee position number information, and the data area for the knee position number information stored by the television receiver 13 is defined in the 9th byte.
- in the 6th byte, flags indicating functions supported by the sink device, represented by “Supports-AI”, “DC-48 bit”, “DC-36 bit”, and “DC-30 bit”, are arranged.
- information indicating the maximum frequency of the TMDS pixel clock represented by “Max-TMDS-Clock” is arranged in the seventh byte.
- flags specifying support for the content type (CNC) function are arranged.
- a flag indicating whether or not the sink device has knee position number information is newly arranged in the 4th bit of the 8th byte. When this flag is at the high level “1”, it indicates that the knee position number information and the corresponding packet format information “DRIF” flag are present in the 9th byte.
- a method of storing the knee position number information in the VSDB area has been described here, but the E-EDID data structure can also be realized with other data areas, such as the VCDB (Video Capability Data Block), so the method is not limited to this.
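A source device's capability check along the lines described above might look like the sketch below. The offsets follow the description (bit 4 of the 8th byte as the presence flag, the 9th byte as the knee data area), but the internal layout of the 9th byte (the "DRIF" flag in the top bit, the count in the remaining bits) and the helper name `read_knee_capability` are assumptions made for illustration only.

```python
# Rough sketch of inspecting the extended VSDB bytes for knee capability.
def read_knee_capability(vsdb):
    """vsdb: bytes of the Vendor Specific Data Block, 0th byte first."""
    has_knee_info = bool(vsdb[8] & 0x10)      # 4th bit of the 8th byte
    if not has_knee_info:
        return None                           # sink has no knee position info
    supports_drif = bool(vsdb[9] & 0x80)      # "DRIF" flag assumed in the top bit
    knee_count = vsdb[9] & 0x7F               # remaining bits: supported count
    return supports_drif, knee_count
```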
- the BD player (HDMI source device) 12 confirms the connection of the television receiver (HDMI sink device) 13 through the HPD line 35 (see FIG. 5). Thereafter, the BD player 12 uses the DDC 33 (see FIG. 5) to read the E-EDID, and hence the knee position number information, from the television receiver 13, and recognizes the number of knee positions supported.
- when transmitting uncompressed image data to the television receiver 13, the BD player 12 transmits, based on the knee position number information read from the television receiver 13 as described above, dynamic range conversion information containing as many pieces of knee position information as the television receiver 13 can handle.
- the BD player 12 transmits the information to the television receiver 13 by inserting the information into the blanking period of the uncompressed image data (video signal) to be transmitted to the television receiver 13.
- the BD player 12 inserts the dynamic range conversion information of the uncompressed image data currently being transmitted into the blanking periods of that image data using, for example, an HDMI Vendor Specific InfoFrame (hereinafter, “VSIF”) packet.
- the VSIF packet is arranged in the data island section 26 (see FIG. 6).
- FIG. 12 shows an example of the data structure of a VSIF packet.
- the VSIF packet enables transmission of incidental information about an image from a source device to a sink device.
- in the 0th byte, “Packet Type (0x81)” indicating a VSIF packet is defined.
- the 7th bit of the 1st byte is set to the level opposite to the “CB flag” set in the immediately preceding VSIF packet. That is, when the “CB flag” was set to the low level “0” in the immediately preceding VSIF packet and the data content of the subsequent VSIF packet differs, the “CB flag” is set to the high level “1”.
- “Version (0x02)” is set in the 6th to 0th bits of the 1st byte.
- “Length” data is defined from the 4th bit to the 0th bit of the 2nd byte, and the byte length after the 3rd byte is set.
- a checksum is defined in the third byte.
- Information indicating the number “0x000C03” registered for HDMI (R) represented by “24 bits IEEE Registration Identifier (0x000C03) LSB first” is arranged in the fourth to sixth bytes.
- a flag indicating whether or not dynamic range conversion information exists from the 8th byte onward is designated.
- the fourth bit and the third bit are designated as “0b00”, it indicates that there is no dynamic range conversion information.
- the subsequent 8th to 23rd bytes specify the input image dynamic range information (input_d_range), the input image display maximum luminance information (input_disp_luminance), the output image dynamic range information (output_d_range), and the output display maximum luminance information (output_disp_luminance) of the dynamic range conversion information.
- FIG. 13 shows the data structure of another VSIF packet.
- “Packet Type (0x81)” indicating a VSIF packet is defined.
- “Version (0x01)” indicating the second VSIF packet is set in the first byte.
- “Length” data is defined in the 4th to 0th bits of the 2nd byte, where the byte length from the 3rd byte onward is set. A checksum is defined in the 3rd byte. In the 4th to 6th bytes, information indicating the number “0x000C03” registered for HDMI(R), represented by “24 bits IEEE Registration Identifier (0x000C03) LSB first”, is arranged.
- a flag indicating whether or not dynamic range conversion information exists from the 8th byte onward is designated.
- the fourth bit and the third bit are designated as “0b00”, it indicates that there is no dynamic range conversion information.
- knee position information is designated from the 8th byte onward. The number of pieces of knee position information transmitted in the VSIF packet is designated, and for each knee position, pre-conversion position information (input_knee_point) and post-conversion position information (output_knee_point) are designated.
- FIG. 14 shows an example of the data structure of a newly defined DRIF packet.
- “Packet Type (0x83)” indicating the type of the data packet is defined in the 0th byte.
- “Version (0x01)” indicating the version of the DRIF packet is set in the 1st byte.
- “Length” data is defined in the second byte, and the byte length after the third byte (maximum 255) is set. In the third byte, a checksum is defined.
- the knee conversion cancel flag “CF” is set.
- the knee conversion cancel flag “CF” is a flag indicating whether or not to cancel the continuity of the immediately preceding DRIF packet data. When canceling continuity, a high level “1” is set. When not canceling, a low level “0” is set.
- the persistence flag “PF” is set.
- the persistence flag “PF” indicates whether DRIF packet data, once sent, remains valid or applies only once. A low level “0” is set when the data is valid only for the picture to which the DRIF packet data is added; a high level “1” is set when it remains valid until the stream is switched or until new DRIF packet data arrives.
- the input image dynamic range information (input_d_range) is specified in the 5th to 8th bytes, the input image display maximum luminance information (input_disp_luminance) in the 9th to 12th bytes, the output image dynamic range information (output_d_range) in the 13th to 16th bytes, and the output display maximum luminance information (output_disp_luminance) in the 17th to 20th bytes.
- thereafter, pre-conversion position information (input_knee_point) and post-conversion position information (output_knee_point) are set every 3 bytes.
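The DRIF layout described above can be sketched as a byte-packing routine. This is a rough illustration, not the normative format: the big-endian byte order, the 12-bit packing of each knee-point pair into 3 bytes, the placement of the CF/PF flags in the 4th byte, the zero-sum checksum convention, and the name `build_drif_packet` are all assumptions.

```python
# Minimal sketch of assembling a DRIF-style packet (cf. FIG. 14):
# Packet Type 0x83, Version 0x01, Length, checksum, flags,
# four 4-byte dynamic range fields, then 3-byte knee positions.
def build_drif_packet(input_d_range, input_disp_luminance,
                      output_d_range, output_disp_luminance,
                      knee_points, cancel=False, persistent=True):
    body = bytearray()
    body += input_d_range.to_bytes(4, "big")
    body += input_disp_luminance.to_bytes(4, "big")
    body += output_d_range.to_bytes(4, "big")
    body += output_disp_luminance.to_bytes(4, "big")
    for input_knee, output_knee in knee_points:   # two 12-bit values in 3 bytes
        body += (((input_knee << 12) | output_knee)).to_bytes(3, "big")

    flags = (0x80 if cancel else 0) | (0x40 if persistent else 0)  # CF / PF
    header = bytes([0x83, 0x01, len(body) + 1])   # Packet Type, Version, Length
    payload = bytes([flags]) + bytes(body)
    checksum = (-sum(header + payload)) & 0xFF    # all bytes sum to 0 mod 256
    return header + bytes([checksum]) + payload
```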
- FIG. 15 is a flowchart for explaining processing for determining the priority order of knee position information of the digital camera 11.
- in step ST2, the digital camera 11 determines whether the number of pieces of knee position information is two or more. When there is only one, no priority determination is needed, so the digital camera 11 immediately proceeds to step ST7 and ends the processing.
- in step ST3, the digital camera 11 calculates the distance between the diagonal line and each knee position, and proceeds to step ST4.
- in step ST4, the digital camera 11 extracts, from the calculated distance information, the knee position with the maximum distance, sets it as the i-th pre-conversion position information “input_knee_point[i]” and post-conversion position information “output_knee_point[i]”, and proceeds to step ST5.
- in step ST5, the digital camera 11 increments a loop counter and proceeds to step ST6.
- in step ST6, the digital camera 11 determines whether the number of knee positions whose priority is to be determined has been reached. If not, it returns to step ST4 and extracts the knee position with the next largest distance. When the number has been reached, the priority determination is complete, and the digital camera 11 proceeds to step ST7 and ends the processing.
- in this way, the digital camera 11 sets and stores knee function information SEI containing dynamic range conversion information arranged in order of knee position priority. The receiving side can therefore convert the image to a desired dynamic range based on the dynamic range conversion information. In other words, the digital camera 11 can encode a captured image so that the decoded uncompressed image can be converted into a desired dynamic range image at decoding time.
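The priority-ordering loop of FIG. 15 (steps ST2 to ST6) amounts to repeatedly taking the knee position farthest from the diagonal, which is equivalent to a descending sort by that distance. The sketch below assumes knee points normalized to [0, 1]; the normalization and the helper names are illustrative, not taken from the specification.

```python
# Sketch of the priority-ordering process (FIG. 15, steps ST2-ST6).
def order_knee_points_by_priority(knee_points):
    """knee_points: list of (input_knee_point, output_knee_point) pairs.

    Returns the pairs sorted so that the point farthest from the diagonal
    (i.e., the strongest compression or decompression) comes first.
    """
    if len(knee_points) < 2:          # ST2: a single point needs no ordering
        return list(knee_points)

    def distance_from_diagonal(p):    # ST3: distance from the line y = x
        x, y = p
        return abs(y - x) / 2 ** 0.5

    # ST4-ST6: extracting the maximum repeatedly == a descending sort
    return sorted(knee_points, key=distance_from_diagonal, reverse=True)
```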
- FIG. 16 is a flowchart for explaining knee position information transmission processing of the BD player 12.
- the BD player 12 starts processing in step ST11, and then proceeds to step ST12.
- in step ST12, the BD player 12 determines whether the HPD signal is at the high level “H”. When it is not, the television receiver 13 is judged not to be connected to the BD player 12, and the BD player 12 immediately proceeds to step ST19 and ends the process.
- when the HPD signal is at the high level “H”, the BD player 12 reads the E-EDID of the television receiver 13 in step ST13. Then, in step ST14, the BD player 12 determines whether the television receiver 13 supports dynamic range conversion processing. If it does not, the BD player 12 proceeds to step ST19 and ends the process.
- next, in step ST15, the BD player 12 determines the number of pieces of knee position information to transmit to the television receiver 13. In step ST16, the BD player 12 determines whether the television receiver 13 supports the DRIF packet.
- if it does, the BD player 12 sets the dynamic range conversion information in a DRIF packet in step ST17, transmits it to the television receiver 13, then proceeds to step ST19 and ends the process.
- if it does not, the BD player 12 sets the dynamic range conversion information in a VSIF packet in step ST18, transmits it to the television receiver 13, and then proceeds to step ST19 to end the process.
- in this way, the BD player 12 transmits, in the supported packet format, dynamic range conversion information containing knee position information selected in order of priority up to the number of knee positions supported by the television receiver 13. The television receiver 13 can then convert the image to a desired dynamic range based on the received dynamic range conversion information.
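The decision made in steps ST15 to ST18 of FIG. 16 can be summarized in a few lines. The sketch assumes the knee position list is already in priority order (as produced on the camera side); the function and parameter names are illustrative, not from the specification.

```python
# Illustrative sketch of the transmission decision in FIG. 16 (ST15-ST18).
def select_knee_info(knee_points_by_priority, sink_max_knee_points,
                     sink_supports_drif):
    """Limit the knee list to what the sink can handle and pick a packet format."""
    selected = knee_points_by_priority[:sink_max_knee_points]  # ST15: limit count
    packet_format = "DRIF" if sink_supports_drif else "VSIF"   # ST16-ST18
    return packet_format, selected
```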
- in the embodiment described above, the digital camera 11 passes the dynamic range conversion information to the BD player 12 via the storage medium 14.
- however, the digital camera 11 may instead pass the dynamic range conversion information to the BD player 12 via a digital broadcast wave, IP packets, a cable television broadcast wave, wireless radio (Wi-Fi), or a public network (3G, LTE).
- in the above embodiment, the digital camera 11 determines the priority order of the knee position information and generates the knee function information SEI by rearranging the pre-conversion position information and the post-conversion position information according to that order.
- alternatively, pre-conversion position information and post-conversion position information with priority information added may be generated, and the knee function information SEI may be generated in an arbitrary order.
- in the above embodiment, the digital camera 11 determines the priority order of the knee position information by calculating the distance between each knee position and the diagonal, but other determination methods may also be used.
- in the above embodiment, the BD player 12 transmits the dynamic range conversion information to the television receiver 13 by inserting it into the blanking periods of the uncompressed image data using VSIF or DRIF packets.
- however, the BD player 12 may, for example, transmit the dynamic range conversion information to the television receiver 13 via the CEC line 34, which is a control data line of the HDMI cable 15, or via the bidirectional communication path constituted by the reserved line 37 and the HPD line 35 of the HDMI cable 15.
- in the above embodiment, the E-EDID of the television receiver 13 contains the number of pieces of knee position information that the television receiver 13 supports and/or the DRIF packet support information, and the BD player 12 acquires this information by reading the E-EDID via the DDC 33 of the HDMI cable 15.
- however, the BD player 12 may instead receive the number of pieces of knee position information and/or the DRIF packet support information supported by the television receiver 13 from the television receiver 13 via the CEC line 34, which is a control data line of the HDMI cable 15, or via the bidirectional communication path constituted by the reserved line 37 and the HPD line 35 of the HDMI cable 15.
- in the above embodiment, priority information is added to the plurality of pieces of knee position information attached to the encoded data that the BD player 12 receives from the digital camera 11.
- when no priority information is added to the plurality of pieces of knee position information attached to the received encoded data, the BD player 12 may determine the priority itself.
- the determination method in that case is not described in detail, but it can be performed in the same manner as the determination method in the digital camera 11 described above.
- an HDMI transmission path is used.
- baseband digital interfaces other than HDMI include MHL (Mobile High-definition Link), the DVI (Digital Visual Interface), optical fiber interfaces, and wireless interfaces using 60 GHz millimeter waves, among others.
- the BD player 12 is used as the transmission device (source device) and the television receiver 13 is used as the reception device (sink device).
- however, it goes without saying that the present technology can be applied similarly to configurations using other transmission devices and reception devices.
- the present technology may also be configured as follows.
- (1) An encoding apparatus comprising: a setting unit that sets a plurality of pieces of knee position information relating to conversion from a first dynamic range image to a second dynamic range image; an encoding unit that encodes the second dynamic range image to generate encoded data; a determination unit that determines the priority order of the plurality of pieces of knee position information; and an adding unit that adds the plurality of pieces of knee position information, with the priorities given, to the encoded data of the second dynamic range image.
- (2) The encoding device according to (1), further including a storage processing unit that stores, in a storage medium, the encoded data of the second dynamic range image to which the plurality of pieces of knee position information has been added.
- (6) An encoding method comprising: a setting step of setting a plurality of pieces of knee position information relating to conversion from a first dynamic range image to a second dynamic range image; an encoding step of encoding the second dynamic range image to generate encoded data; a determination step of determining the priority order of the plurality of pieces of knee position information; and an adding step of adding the plurality of pieces of knee position information, with the priorities given, to the encoded data of the second dynamic range image.
- (7) A program for causing a computer to function as: setting means for setting a plurality of pieces of knee position information relating to conversion from a first dynamic range image to a second dynamic range image; encoding means for encoding the second dynamic range image to generate encoded data; determining means for determining the priority order of the plurality of pieces of knee position information; and adding means for adding the plurality of pieces of knee position information, with the priorities given, to the encoded data of the second dynamic range image.
- (8) A transmission device comprising: a data transmission unit that transmits uncompressed image data of a second dynamic range image to an external device via a transmission path; and an information transmission unit that transmits, to the external device via the transmission path, a plurality of pieces of knee position information relating to conversion from a first dynamic range image to the second dynamic range image, limited to the number of knee positions that the external device can handle.
- (9) The transmission device according to (8), further including an information reception unit that receives, from the external device, information on the number of knee positions that the external device can handle.
- (10) The transmission device according to (8) or (9), further including: a data acquisition unit that acquires the encoded data of the second dynamic range image to which a plurality of pieces of knee position information, with priorities given, has been added; a decoding unit that decodes the encoded data of the second dynamic range image to obtain the uncompressed image data of the second dynamic range image; and an information selection unit that selects, from the plurality of pieces of knee position information and based on the priorities, as many pieces of knee position information as the number of knee positions that the external device can handle.
- (11) The transmission device according to any one of (8) to (10), wherein the data transmission unit transmits the uncompressed image data to the external device as a differential signal via the transmission path.
- (12) The transmission device according to any one of (8) to (11), wherein the information transmission unit transmits the plurality of pieces of knee position information of the uncompressed image data transmitted by the data transmission unit to the external device by inserting them into the blanking periods of the uncompressed image data.
- (13) A transmission method comprising: a data transmission step of transmitting uncompressed image data of a second dynamic range image to an external device via a transmission path; and an information transmission step of transmitting, to the external device via the transmission path, a plurality of pieces of knee position information relating to conversion from a first dynamic range image to the second dynamic range image, limited to the number of knee positions that the external device can handle.
- (14) A program for causing a computer to function as: data transmission means for transmitting uncompressed image data of a second dynamic range image to an external device via a transmission path; and information transmission means for transmitting, to the external device via the transmission path, a plurality of pieces of knee position information relating to conversion from a first dynamic range image to the second dynamic range image, limited to the number of knee positions that the external device can handle.
- (15) A receiving apparatus comprising: a data receiving unit that receives uncompressed image data of a second dynamic range image from an external device via a transmission path; an information receiving unit that receives, from the external device via the transmission path, a plurality of pieces of knee position information relating to conversion from a first dynamic range image to the second dynamic range image; a conversion processing unit that performs dynamic range conversion processing on the uncompressed image data of the second dynamic range image based on the plurality of pieces of knee position information; and an information transmitting unit that transmits, to the external device via the transmission path, information on the number of knee positions that the receiving apparatus itself can handle.
- (16) The receiving apparatus according to (15), further comprising a storage unit that stores the information on the number of knee positions, wherein the information transmitting unit acquires the number information from the storage unit and transmits it.
- (20) A program for causing a computer to function as: data receiving means for receiving uncompressed image data of a second dynamic range image from an external device via a transmission path; information receiving means for receiving, from the external device via the transmission path, a plurality of pieces of knee position information relating to conversion from a first dynamic range image to the second dynamic range image; conversion processing means for performing dynamic range conversion processing on the uncompressed image data of the second dynamic range image based on the plurality of pieces of knee position information; and information transmission means for transmitting, to the external device via the transmission path, information on the number of knee positions that the device itself can handle.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Computer Hardware Design (AREA)
- Human Computer Interaction (AREA)
- General Engineering & Computer Science (AREA)
- Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
- Compression Or Coding Systems Of Tv Signals (AREA)
- Controls And Circuits For Display Device (AREA)
- Facsimiles In General (AREA)
Abstract
Description
The concept of the present technology resides in an encoding device including: a setting unit that sets a plurality of pieces of knee position information relating to conversion from a first dynamic range image to a second dynamic range image; an encoding unit that encodes the second dynamic range image to generate encoded data; a determination unit that determines the priority order of the plurality of pieces of knee position information; and an adding unit that adds the plurality of pieces of knee position information, with the priorities given, to the encoded data of the second dynamic range image.
Another concept of the present technology resides in a transmission device including: a data transmission unit that transmits uncompressed image data of a second dynamic range image to an external device via a transmission path; and an information transmission unit that transmits, to the external device via the transmission path, knee position information relating to conversion from a first dynamic range image to the second dynamic range image, limited to the number of knee positions that the external device can handle.
A further concept of the present technology resides in a receiving device including: an image data receiving unit that receives uncompressed image data of a second dynamic range image from an external device via a transmission path; an information receiving unit that receives, from the external device via the transmission path, a predetermined number of pieces of knee position information relating to conversion from a first dynamic range image to the second dynamic range image; a conversion processing unit that performs conversion processing on the uncompressed image data of the second dynamic range image based on the predetermined number of pieces of knee position information to obtain uncompressed image data of a predetermined dynamic range image; and an information transmission unit that transmits, to the external device via the transmission path, information on the number of knee positions that the device itself can handle.
1. Embodiment
2. Modification
[Configuration Example of AV System]
FIG. 1 shows a configuration example of an AV (Audio Visual) system 10 as an embodiment. This AV system 10 has a digital camera 11 as a high dynamic range imaging device, a BD (Blu-ray Disc) player 12 as an HDMI source device, and a television receiver 13 as an HDMI sink device. The digital camera 11 and the BD player 12 exchange high dynamic range images via a storage medium 14 such as a BD disc or a memory card. The BD player 12 and the television receiver 13 are connected via an HDMI cable 15 serving as a transmission path.
Currently, with the aim of improving coding efficiency, standardization of a coding scheme called HEVC (High Efficiency Video Coding) is being advanced by JCTVC (Joint Collaboration Team - Video Coding), a joint standardization body of ITU-T and ISO/IEC. As part of that standardization, knee function info SEI (knee_function_info SEI) has been proposed as compression/decompression conversion information for high dynamic range images.
FIG. 5 shows a configuration example of the HDMI transmitting unit 12b of the BD player 12 and the HDMI receiving unit 13b of the television receiver 13 in the AV system 10 of FIG. 1. In the effective image section 21 (hereinafter also called the "active video section" as appropriate; see FIG. 6), which is the section from one vertical synchronization signal to the next, excluding the horizontal blanking section 22 and the vertical blanking section 23, the HDMI transmitting unit 12b transmits differential signals corresponding to the pixel data of one uncompressed screen of image, over a plurality of channels, in one direction to the HDMI receiving unit 13b. In the horizontal blanking section 22 or the vertical blanking section 23, the HDMI transmitting unit 12b transmits, over a plurality of channels, differential signals corresponding at least to the audio data accompanying the image, control data, other auxiliary data, and the like, in one direction to the HDMI receiving unit 13b.
FIG. 7 shows a specific configuration example of the digital camera 11. The digital camera 11 has an imager 121, an imager driver 122, an imaging signal processing circuit 123, a camera control CPU 124, a still image signal processing circuit 125, a moving image signal processing circuit 126, a recording/reproducing unit 128, and a recording medium 129.
FIG. 8 shows a configuration example of the BD player 12. The BD player 12 has an HDMI terminal 12a, an HDMI transmitting unit 12b, and a high-speed bus interface 12c. The BD player 12 also has a CPU (Central Processing Unit) 204, an internal bus 205, a flash ROM (Read Only Memory) 206, an SDRAM (Synchronous Random Access Memory) 207, a remote control receiving unit 208, and a remote control transmitter 209.
FIG. 9 shows a configuration example of the television receiver 13. The television receiver 13 has an HDMI terminal 13a, an HDMI receiving unit 13b, and a high-speed bus interface 13c. The television receiver 13 also has an antenna terminal 305, a digital tuner 306, an MPEG decoder 307, a video signal processing circuit 308, a graphics generation circuit 309, a panel drive circuit 310, and a display panel 311.
In this embodiment, the BD player 12 receives, from the television receiver 13 via the HDMI cable 15, information on the number of knee positions that the television receiver 13 can handle. In this case, the television receiver 13 stores in its storage unit the information on the number of knee positions it supports, and transmits this information to the BD player 12 via the HDMI cable 15. Conventionally, there was no specification designating the number of pieces of knee position information, so there was no compatibility between manufacturers.
FIG. 10 shows a data structure example of the E-EDID. The E-EDID consists of a basic block and an extension block. At the head of the basic block, data defined by the E-EDID1.3 standard and represented by "E-EDID1.3 Basic Structure" is arranged, followed by timing information for maintaining compatibility with the conventional EDID, represented by "Preferred timing", and timing information different from "Preferred timing" for maintaining compatibility with the conventional EDID, represented by "2nd timing".
In this embodiment, a data area extended to store the knee position number information that the television receiver 13 can handle is defined in this VSDB area. FIG. 11 shows a data structure example of the VSDB area. The VSDB area is provided with a 0th block to an Nth block, each of which is a 1-byte block.
FIG. 12 shows a data structure example of a VSIF packet. In HDMI, this VSIF packet makes it possible to transmit incidental information about an image from a source device to a sink device. In the 0th byte, "Packet Type (0x81)" indicating a VSIF packet is defined.
FIG. 14 shows a data structure example of a newly defined DRIF packet. "Packet Type (0x83)" indicating the type of the data packet is defined in the 0th byte. "Version (0x01)" indicating the version of the DRIF packet is set in the 1st byte. "Length" data is defined in the 2nd byte, and the byte length from the 3rd byte onward (maximum 255) is set. A checksum is defined in the 3rd byte.
FIG. 15 is a flowchart explaining the process of determining the priority order of the knee position information in the digital camera 11.
In the embodiment described above, the digital camera 11 passes the dynamic range conversion information to the BD player 12 via the storage medium 14.
(1) An encoding device including: a setting unit that sets a plurality of pieces of knee position information relating to conversion from a first dynamic range image to a second dynamic range image; an encoding unit that encodes the second dynamic range image to generate encoded data; a determination unit that determines the priority order of the plurality of pieces of knee position information; and an adding unit that adds the plurality of pieces of knee position information, with the priorities given, to the encoded data of the second dynamic range image.
(2) The encoding device according to (1), further including a storage processing unit that stores, in a storage medium, the encoded data of the second dynamic range image to which the plurality of pieces of knee position information has been added.
(3) The encoding device according to (1) or (2), wherein the priorities are given to the plurality of pieces of knee position information by arranging them in priority order.
(4) The encoding device according to (1) or (2), wherein the priorities are given to the plurality of pieces of knee position information by adding information indicating their priority relationship.
(5) The encoding device according to any one of (1) to (4), wherein the determination unit determines the priority order of the plurality of pieces of knee position information based on the compression/decompression rate of the knee position indicated by each piece of knee position information.
(6) An encoding method including: a setting step of setting a plurality of pieces of knee position information relating to conversion from a first dynamic range image to a second dynamic range image; an encoding step of encoding the second dynamic range image with an encoding unit to generate encoded data; a determination step of determining the priority order of the plurality of pieces of knee position information; and an adding step of adding the plurality of pieces of knee position information, with the priorities given, to the encoded data of the second dynamic range image.
(7) A program for causing a computer to function as: setting means for setting a plurality of pieces of knee position information relating to conversion from a first dynamic range image to a second dynamic range image; encoding means for encoding the second dynamic range image to generate encoded data; determining means for determining the priority order of the plurality of pieces of knee position information; and adding means for adding the plurality of pieces of knee position information, with the priorities given, to the encoded data of the second dynamic range image.
(8) A transmission device including: a data transmission unit that transmits uncompressed image data of a second dynamic range image to an external device via a transmission path; and an information transmission unit that transmits, to the external device via the transmission path, a plurality of pieces of knee position information relating to conversion from a first dynamic range image to the second dynamic range image, limited to the number of knee positions that the external device can handle.
(9) The transmission device according to (8), further including an information reception unit that receives, from the external device, information on the number of knee positions that the external device can handle.
(10) The transmission device according to (8) or (9), further including: a data acquisition unit that acquires the encoded data of the second dynamic range image to which a plurality of pieces of knee position information, with priorities given, has been added; a decoding unit that decodes the encoded data of the second dynamic range image to obtain the uncompressed image data of the second dynamic range image; and an information selection unit that selects, from the plurality of pieces of knee position information and based on the priorities, as many pieces of knee position information as the number of knee positions that the external device can handle.
(11) The transmission device according to any one of (8) to (10), wherein the data transmission unit transmits the uncompressed image data to the external device as a differential signal via the transmission path.
(12) The transmission device according to any one of (8) to (11), wherein the information transmission unit transmits the plurality of pieces of knee position information of the uncompressed image data transmitted by the data transmission unit to the external device by inserting them into the blanking periods of the uncompressed image data.
(13) A transmission method including: a data transmission step of transmitting uncompressed image data of a second dynamic range image to an external device via a transmission path; and an information transmission step of transmitting, to the external device via the transmission path, a plurality of pieces of knee position information relating to conversion from a first dynamic range image to the second dynamic range image, limited to the number of knee positions that the external device can handle.
(14) A program for causing a computer to function as: data transmission means for transmitting uncompressed image data of a second dynamic range image to an external device via a transmission path; and information transmission means for transmitting, to the external device via the transmission path, a plurality of pieces of knee position information relating to conversion from a first dynamic range image to the second dynamic range image, limited to the number of knee positions that the external device can handle.
(15) A receiving device including: a data receiving unit that receives uncompressed image data of a second dynamic range image from an external device via a transmission path; an information receiving unit that receives, from the external device via the transmission path, a plurality of pieces of knee position information relating to conversion from a first dynamic range image to the second dynamic range image; a conversion processing unit that performs dynamic range conversion processing on the uncompressed image data of the second dynamic range image based on the plurality of pieces of knee position information; and an information transmission unit that transmits, to the external device via the transmission path, information on the number of knee positions that the device itself can handle.
(16) The receiving device according to (15), further including a storage unit that stores the information on the number of knee positions, wherein the information transmission unit acquires the number information from the storage unit and transmits it.
(17) The receiving device according to (15) or (16), wherein the data receiving unit receives the uncompressed image data from the external device as a differential signal via the transmission path.
(18) The receiving device according to any one of (15) to (17), wherein the information receiving unit extracts the plurality of pieces of knee position information of the uncompressed image data received by the data receiving unit from the blanking periods of the uncompressed image data.
(19) A receiving method including: a data receiving step of receiving uncompressed image data of a second dynamic range image from an external device via a transmission path; an information receiving step of receiving, from the external device via the transmission path, a plurality of pieces of knee position information relating to conversion from a first dynamic range image to the second dynamic range image; a conversion processing step of performing dynamic range conversion processing on the uncompressed image data of the second dynamic range image based on the plurality of pieces of knee position information; and an information transmission step of transmitting, to the external device via the transmission path, information on the number of knee positions that the device itself can handle.
(20) A program for causing a computer to function as: data receiving means for receiving uncompressed image data of a second dynamic range image from an external device via a transmission path; information receiving means for receiving, from the external device via the transmission path, a plurality of pieces of knee position information relating to conversion from a first dynamic range image to the second dynamic range image; conversion processing means for performing dynamic range conversion processing on the uncompressed image data of the second dynamic range image based on the plurality of pieces of knee position information; and information transmission means for transmitting, to the external device via the transmission path, information on the number of knee positions that the device itself can handle.
11 ... Digital camera
11a ... Imaging unit
11b ... Conversion unit
11c ... Setting unit
11d ... Encoding unit
11f ... Storage processing unit
12 ... BD player
12a ... HDMI terminal
12b ... HDMI transmitting unit
12c ... High-speed bus interface
12d ... Decoding unit
12e ... Information transmitting unit
12f ... Information receiving unit
13 ... Television receiver
13a ... HDMI terminal
13b ... HDMI receiving unit
13c ... High-speed bus interface
13d ... Conversion unit
13e ... Information receiving unit
13f ... Information transmitting unit
13g ... Storage unit
14 ... Storage medium
15 ... HDMI cable
21 ... Effective image section
22 ... Horizontal blanking section
23 ... Vertical blanking section
24 ... Video data section
25 ... Data island section
26 ... Control section
31 ... HDMI transmitter
32 ... HDMI receiver
33 ... DDC line
34 ... CEC line
35 ... HPD line
36 ... Power supply line
37 ... Reserved line
121 ... Imager
122 ... Imager driver
123 ... Imaging signal processing circuit
124 ... Camera control CPU
125 ... Still image signal processing circuit
126 ... Moving image signal processing circuit
128 ... Recording/reproducing unit
129 ... Recording medium
130 ... System control CPU
131 ... Flash ROM
132 ... SDRAM
133 ... User operation unit
134 ... Microphone
135 ... Audio signal processing circuit
136 ... Panel drive circuit
137 ... Display panel
140 ... Graphics generation circuit
141 ... Display control unit
142 ... Power supply unit
204 ... CPU
205 ... Internal bus
206 ... Flash ROM
207 ... SDRAM
208 ... Remote control receiving unit
209 ... Remote control transmitter
210 ... Storage medium control interface
211a ... BD drive
211b ... HDD
211c ... SSD
212 ... Ethernet interface
213 ... Network terminal
215 ... MPEG decoder
216 ... Graphics generation circuit
217 ... Video output terminal
218 ... Audio output terminal
221 ... Display control unit
222 ... Panel drive circuit
223 ... Display panel
224 ... Power supply unit
305 ... Antenna terminal
306 ... Digital tuner
307 ... MPEG decoder
308 ... Video signal processing circuit
309 ... Graphics generation circuit
310 ... Panel drive circuit
311 ... Display panel
312 ... Audio signal processing circuit
313 ... Audio amplification circuit
314 ... Speaker
320 ... Internal bus
321 ... CPU
322 ... Flash ROM
323 ... DRAM
324 ... Ethernet interface
325 ... Network terminal
326 ... Remote control receiving unit
327 ... Remote control transmitter
331 ... Display control unit
332 ... Power supply unit
Claims (20)
- An encoding device including:
a setting unit configured to set a plurality of pieces of knee position information relating to conversion from a first dynamic range image into a second dynamic range image;
an encoding unit configured to encode the second dynamic range image to generate encoded data;
a determination unit configured to determine a priority order of the plurality of pieces of knee position information; and
an addition unit configured to add the plurality of pieces of knee position information, with the priority order assigned, to the encoded data of the second dynamic range image.
- The encoding device according to claim 1, further including a storage processing unit configured to store, in a storage medium, the encoded data of the second dynamic range image to which the plurality of pieces of knee position information has been added.
- The encoding device according to claim 1, wherein the priority order is assigned to the plurality of pieces of knee position information by arranging the pieces in order of priority.
- The encoding device according to claim 1, wherein the priority order is assigned to the plurality of pieces of knee position information by adding information indicating the priority relationship among the pieces.
- The encoding device according to claim 1, wherein the determination unit determines the priority order of the plurality of pieces of knee position information on the basis of the compression/decompression rate at the knee position indicated by each piece of knee position information.
- An encoding method including:
a setting step of setting a plurality of pieces of knee position information relating to conversion from a first dynamic range image into a second dynamic range image;
an encoding step of encoding the second dynamic range image in an encoding unit to generate encoded data;
a determination step of determining a priority order of the plurality of pieces of knee position information; and
an addition step of adding the plurality of pieces of knee position information, with the priority order assigned, to the encoded data of the second dynamic range image.
- A program causing a computer to function as:
setting means for setting a plurality of pieces of knee position information relating to conversion from a first dynamic range image into a second dynamic range image;
encoding means for encoding the second dynamic range image to generate encoded data;
determination means for determining a priority order of the plurality of pieces of knee position information; and
addition means for adding the plurality of pieces of knee position information, with the priority order assigned, to the encoded data of the second dynamic range image.
- A transmitting device including:
a data transmitting unit configured to transmit, to an external device via a transmission path, uncompressed image data of a second dynamic range image; and
an information transmitting unit configured to transmit, to the external device via the transmission path, knee position information relating to conversion from a first dynamic range image into the second dynamic range image, limited to the number of knee positions that the external device can support.
- The transmitting device according to claim 8, further including an information receiving unit configured to receive, from the external device, information indicating the number of knee positions that the external device can support.
- The transmitting device according to claim 8, further including:
a data acquisition unit configured to acquire encoded data of the second dynamic range image to which a plurality of pieces of knee position information has been added with a priority order assigned;
a decoding unit configured to decode the encoded data of the second dynamic range image to obtain the uncompressed image data of the second dynamic range image; and
an information selection unit configured to select, from the plurality of pieces of knee position information on the basis of the priority order, as many pieces of knee position information as the number of knee positions that the external device can support.
- The transmitting device according to claim 8, wherein the data transmitting unit transmits the uncompressed image data to the external device via the transmission path as a differential signal.
- The transmitting device according to claim 8, wherein the information transmitting unit transmits the plurality of pieces of knee position information for the uncompressed image data transmitted by the data transmitting unit to the external device by inserting the information into a blanking period of that uncompressed image data.
- A transmitting method including:
a data transmitting step of transmitting, to an external device via a transmission path, uncompressed image data of a second dynamic range image; and
an information transmitting step of transmitting, to the external device via the transmission path, knee position information relating to conversion from a first dynamic range image into the second dynamic range image, limited to the number of knee positions that the external device can support.
- A program causing a computer to function as:
data transmitting means for transmitting, to an external device via a transmission path, uncompressed image data of a second dynamic range image; and
information transmitting means for transmitting, to the external device via the transmission path, knee position information relating to conversion from a first dynamic range image into the second dynamic range image, limited to the number of knee positions that the external device can support.
- A receiving device including:
a data receiving unit configured to receive, from an external device via a transmission path, uncompressed image data of a second dynamic range image;
an information receiving unit configured to receive, from the external device via the transmission path, a predetermined number of pieces of knee position information relating to conversion from a first dynamic range image into the second dynamic range image;
a conversion processing unit configured to perform, on the uncompressed image data of the second dynamic range image, dynamic range conversion processing based on the predetermined number of pieces of knee position information; and
an information transmitting unit configured to transmit, to the external device via the transmission path, information indicating the number of knee positions that the receiving device itself can support.
- The receiving device according to claim 15, further including a storage unit configured to store the information indicating the number of knee positions, wherein the information transmitting unit acquires the information indicating the number of knee positions from the storage unit and transmits it.
- The receiving device according to claim 15, wherein the data receiving unit receives the uncompressed image data from the external device via the transmission path as a differential signal.
- The receiving device according to claim 15, wherein the information receiving unit extracts the pieces of knee position information for the uncompressed image data received by the data receiving unit from a blanking period of that uncompressed image data.
- A receiving method including:
a data receiving step of receiving, from an external device via a transmission path, uncompressed image data of a second dynamic range image;
an information receiving step of receiving, from the external device via the transmission path, a predetermined number of pieces of knee position information relating to conversion from a first dynamic range image into the second dynamic range image;
a conversion processing step of performing, on the uncompressed image data of the second dynamic range image, dynamic range conversion processing based on the predetermined number of pieces of knee position information; and
an information transmitting step of transmitting, to the external device via the transmission path, information indicating the number of knee positions that the receiving device itself can support.
- A program causing a computer to function as:
data receiving means for receiving, from an external device via a transmission path, uncompressed image data of a second dynamic range image;
information receiving means for receiving, from the external device via the transmission path, a plurality of pieces of knee position information relating to conversion from a first dynamic range image into the second dynamic range image;
conversion processing means for performing, on the uncompressed image data of the second dynamic range image, dynamic range conversion processing based on the plurality of pieces of knee position information; and
information transmitting means for transmitting, to the external device via the transmission path, information indicating the number of knee positions that the device itself can support.
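The selection behavior recited in claims 8 to 10 — trimming the stream's knee position information down to the number of knee positions the external device reports it can support, using the assigned priority order — can be sketched as below. All names and the concrete data layout are hypothetical; the claims prescribe the behavior, not a representation.

```python
def select_knee_info(knee_infos, priorities, sink_capacity):
    """Pick the knee-position entries a sink can accept (claims 8-10 sketch).

    knee_infos:    knee-position entries carried with the encoded stream
                   (treated as opaque values here).
    priorities:    one rank per entry, lower value = higher priority,
                   modeling claim 4's explicit priority information.
                   (Claim 3's variant instead encodes priority purely by
                   array order, i.e. priorities = range(len(knee_infos)).)
    sink_capacity: number of knee positions the external device reported
                   it can support (claim 9).
    Returns the selected entries, preserving their original stream order.
    """
    # Rank the entry indices by priority, keep only as many as the sink allows.
    ranked = sorted(range(len(knee_infos)), key=lambda i: priorities[i])
    keep = set(ranked[:sink_capacity])
    return [info for i, info in enumerate(knee_infos) if i in keep]

infos = ["knee_A", "knee_B", "knee_C", "knee_D"]
ranks = [2, 0, 3, 1]  # knee_B is most important, then knee_D
print(select_knee_info(infos, ranks, 2))  # keeps knee_B and knee_D, in stream order
```

Keeping the survivors in stream order matters for a piecewise curve: the sink still receives monotonically ordered knee points, just fewer of them, so its conversion processing (claim 15) needs no re-sorting.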
Priority Applications (7)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
SG11201608755VA SG11201608755VA (en) | 2014-04-22 | 2015-04-20 | Encoding device, encoding method, sending device, sending method, receiving device, receiving method, and program |
KR1020167028086A KR20160145003A (ko) | 2014-04-22 | 2015-04-20 | Encoding device, encoding method, transmission device, transmission method, reception device, reception method, and program |
JP2016514908A JP6536573B2 (ja) | 2014-04-22 | 2015-04-20 | Encoding device, encoding method, transmission device, transmission method, reception device, reception method, and program |
RU2016140575A RU2691084C2 (ru) | 2014-04-22 | 2015-04-20 | Устройство кодирования, способ кодирования, устройство передачи, способ передачи, устройство приема, способ приема и программа |
US15/126,477 US10356423B2 (en) | 2014-04-22 | 2015-04-20 | Encoding device, encoding method, sending device, sending method, receiving device, receiving method, and program |
EP15783507.5A EP3136731B1 (en) | 2014-04-22 | 2015-04-20 | Encoding device, encoding method, transmission device, transmission method, reception device, reception method and program |
US16/381,667 US10638139B2 (en) | 2014-04-22 | 2019-04-11 | Encoding device, encoding method, sending device, sending method, receiving device, receiving method, and program |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2014-087931 | 2014-04-22 | ||
JP2014087931 | 2014-04-22 |
Related Child Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/126,477 A-371-Of-International US10356423B2 (en) | 2014-04-22 | 2015-04-20 | Encoding device, encoding method, sending device, sending method, receiving device, receiving method, and program |
US16/381,667 Division US10638139B2 (en) | 2014-04-22 | 2019-04-11 | Encoding device, encoding method, sending device, sending method, receiving device, receiving method, and program |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2015163264A1 true WO2015163264A1 (ja) | 2015-10-29 |
Family
ID=54332430
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2015/061933 WO2015163264A1 (ja) | 2015-04-20 | Encoding device, encoding method, transmission device, transmission method, reception device, reception method, and program |
Country Status (7)
Country | Link |
---|---|
US (2) | US10356423B2 (ja) |
EP (1) | EP3136731B1 (ja) |
JP (1) | JP6536573B2 (ja) |
KR (1) | KR20160145003A (ja) |
RU (1) | RU2691084C2 (ja) |
SG (1) | SG11201608755VA (ja) |
WO (1) | WO2015163264A1 (ja) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2018047753A1 (ja) * | 2016-09-09 | 2018-03-15 | Panasonic Intellectual Property Management Co., Ltd. | Display device and signal processing method |
JP2021510964A (ja) | 2018-01-11 | 2021-04-30 | Qualcomm Incorporated | Signaling mechanisms for equal ranges and other DRA parameters for video coding |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10735683B1 (en) | 2019-04-09 | 2020-08-04 | Obsidian Sensors, Inc. | Systems and methods for low-power image digitization |
KR20220017249A (ko) | 2020-08-04 | 2022-02-11 | LG Display Co., Ltd. | Data interface device and method of display device |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH0686071A (ja) * | 1992-05-19 | 1994-03-25 | Minolta Camera Co Ltd | Digital image forming apparatus |
JP2002152680A (ja) * | 2000-11-10 | 2002-05-24 | Fuji Photo Film Co Ltd | Image data forming method and image data recording apparatus |
US20050259729A1 (en) * | 2004-05-21 | 2005-11-24 | Shijun Sun | Video coding with quality scalability |
Family Cites Families (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4575129B2 (ja) * | 2004-12-02 | 2010-11-04 | Sony Corporation | Data processing apparatus, data processing method, program, and program recording medium |
JP4819404B2 (ja) | 2005-06-07 | 2011-11-24 | Sharp Corporation | Liquid crystal display device |
JP4687265B2 (ja) * | 2005-06-14 | 2011-05-25 | Fuji Xerox Co., Ltd. | Image analysis device |
US8194997B2 (en) * | 2006-03-24 | 2012-06-05 | Sharp Laboratories Of America, Inc. | Methods and systems for tone mapping messaging |
JP4645717B2 (ja) * | 2008-09-26 | 2011-03-09 | Sony Corporation | Interface circuit and video apparatus |
TWI479898B (zh) * | 2010-08-25 | 2015-04-01 | Dolby Lab Licensing Corp | Extending image dynamic range |
JP6202330B2 (ja) * | 2013-10-15 | 2017-09-27 | Sony Corporation | Decoding device and decoding method, and encoding device and encoding method |
EP3145206B1 (en) * | 2014-05-15 | 2020-07-22 | Sony Corporation | Communication apparatus, communication method, and computer program |
2015
- 2015-04-20 RU RU2016140575A patent/RU2691084C2/ru active
- 2015-04-20 KR KR1020167028086A patent/KR20160145003A/ko not_active Application Discontinuation
- 2015-04-20 WO PCT/JP2015/061933 patent/WO2015163264A1/ja active Application Filing
- 2015-04-20 US US15/126,477 patent/US10356423B2/en active Active
- 2015-04-20 SG SG11201608755VA patent/SG11201608755VA/en unknown
- 2015-04-20 EP EP15783507.5A patent/EP3136731B1/en active Active
- 2015-04-20 JP JP2016514908A patent/JP6536573B2/ja active Active
2019
- 2019-04-11 US US16/381,667 patent/US10638139B2/en active Active
Non-Patent Citations (6)
Title |
---|
"Telecommunication Standardization Sector of ITU", Recommendation ITU-T H.264, Series H: Audiovisual and Multimedia Systems, April 2013 (2013-04-01), pages 337, 366-368, XP055232958 * |
"Telecommunication Standardization Sector of ITU", Recommendation ITU-T H.265, Series H: Audiovisual and Multimedia Systems, April 2013 (2013-04-01), pages 234, 260-262, XP055232953 * |
PIERRE ANDRIVON ET AL.: "SEI message for Colour Mapping Information", Joint Collaborative Team on Video Coding (JCT-VC) of ITU-T SG 16 WP3 and ISO/IEC JTC1/SC29/WG11, 17th Meeting: Valencia, ES, 27 March - 4 April 2014, pages 1-14, XP030115975 * |
SALLY HATTORI ET AL.: "HLS: SEI message for Knee Function Information", Joint Collaborative Team on Video Coding (JCT-VC) of ITU-T SG 16 WP3 and ISO/IEC JTC1/SC29/WG11, 16th Meeting: San Jose, US, 9-17 Jan. 2014, 31 December 2013 (2013-12-31), pages 1-21, XP030115514 * |
SALLY HATTORI ET AL.: "HLS: SEI message for transfer function information", Joint Collaborative Team on Video Coding (JCT-VC) of ITU-T SG 16 WP3 and ISO/IEC JTC1/SC29/WG11, 15th Meeting: Geneva, CH, 23 Oct. - 1 Nov. 2013, 16 October 2013 (2013-10-16), pages 1-4, XP030115039 * |
See also references of EP3136731A4 * |
Also Published As
Publication number | Publication date |
---|---|
US20190238862A1 (en) | 2019-08-01 |
SG11201608755VA (en) | 2016-11-29 |
KR20160145003A (ko) | 2016-12-19 |
US20170094287A1 (en) | 2017-03-30 |
RU2016140575A (ru) | 2018-04-16 |
US10638139B2 (en) | 2020-04-28 |
RU2016140575A3 (ja) | 2018-11-06 |
EP3136731A4 (en) | 2017-12-13 |
RU2691084C2 (ru) | 2019-06-10 |
JPWO2015163264A1 (ja) | 2017-04-13 |
US10356423B2 (en) | 2019-07-16 |
JP6536573B2 (ja) | 2019-07-03 |
EP3136731A1 (en) | 2017-03-01 |
EP3136731B1 (en) | 2021-12-01 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP2021061602A (ja) | Display device | |
JP6477692B2 (ja) | Communication device, communication method, and computer program | |
US10638139B2 (en) | Encoding device, encoding method, sending device, sending method, receiving device, receiving method, and program | |
JP2009124349A (ja) | Display device, method of transmitting video signal in display device, transmitting device, and method of transmitting video signal | |
JP2009272791A (ja) | Transmitting device, information transmitting method, receiving device, and information processing method | |
JP6551401B2 (ja) | Communication device or communication method, and computer program | |
JP6307856B2 (ja) | Transmitting device, wide color gamut image data transmitting method, receiving device, wide color gamut image data receiving method, and program | |
US10623805B2 (en) | Sending device, method of sending high dynamic range image data, receiving device, and method of receiving high dynamic range image data | |
US10504552B2 (en) | Transmission device, transmission method, reception device, and reception method | |
JP6777071B2 (ja) | Transmitting device, transmitting method, receiving device, and receiving method | |
JP5474253B1 (ja) | Receiving device and signal receiving method | |
JP5706012B2 (ja) | Receiving device and signal receiving method | |
JP5433102B2 (ja) | Transmitting device and signal transmitting method | |
JP2013229885A (ja) | Transmitting device, stereoscopic image data transmitting method, receiving device, and stereoscopic image data receiving method | |
JP2015133732A (ja) | Receiving device and signal receiving method | |
JP2015233320A (ja) | Transmitting method and transmitting device | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 15783507 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2016514908 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 15126477 Country of ref document: US |
|
ENP | Entry into the national phase |
Ref document number: 20167028086 Country of ref document: KR Kind code of ref document: A |
|
REEP | Request for entry into the european phase |
Ref document number: 2015783507 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2015783507 Country of ref document: EP |
|
ENP | Entry into the national phase |
Ref document number: 2016140575 Country of ref document: RU Kind code of ref document: A |
|
NENP | Non-entry into the national phase |
Ref country code: DE |