US20190230407A1 - Method for transmitting appropriate meta data to display device according to transmission protocol version - Google Patents
Method for transmitting appropriate meta data to display device according to transmission protocol version
- Publication number
- US20190230407A1 (application US 16/371,607)
- Authority
- US
- United States
- Prior art keywords
- hdr
- luminance
- luminance value
- video signal
- display
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/436—Interfacing a local distribution network, e.g. communicating with another STB or one or more peripheral devices inside the home
- H04N21/43615—Interfacing a Home Network, e.g. for connecting the client to a plurality of peripherals
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/79—Processing of colour television signals in connection with recording
- H04N9/80—Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback
- H04N9/82—Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback the individual colour picture signal components being recorded simultaneously only
- H04N9/8205—Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback the individual colour picture signal components being recorded simultaneously only involving the multiplexing of an additional signal and the colour video signal
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/432—Content retrieval operation from a local storage medium, e.g. hard-disk
- H04N21/4325—Content retrieval operation from a local storage medium, e.g. hard-disk by playing back content from the storage medium
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/435—Processing of additional data, e.g. decrypting of additional data, reconstructing software from modules extracted from the transport stream
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/44—Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
- H04N21/44008—Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs involving operations for analysing video streams, e.g. detecting features or characteristics in the video stream
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/44—Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
- H04N21/4402—Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs involving reformatting operations of video signals for household redistribution, storage or real-time display
- H04N21/440218—Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs involving reformatting operations of video signals for household redistribution, storage or real-time display by transcoding between formats or standards, e.g. from MPEG-2 to MPEG-4
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/44—Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
- H04N21/4402—Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs involving reformatting operations of video signals for household redistribution, storage or real-time display
- H04N21/44029—Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs involving reformatting operations of video signals for household redistribution, storage or real-time display for generating different versions
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/44—Receiver circuitry for the reception of television signals according to analogue transmission standards
- H04N5/57—Control of contrast or brightness
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/76—Television signal recording
- H04N5/765—Interface circuits between an apparatus for recording and another apparatus
- H04N5/77—Interface circuits between an apparatus for recording and another apparatus between a recording apparatus and a television camera
Definitions
- the present disclosure relates to a transmitting method, a playback method and a playback device.
- the techniques disclosed here feature a method used by a playback device, including: when a version of a transmission protocol is a first version, transmitting first meta data to a display device without transmitting second meta data to the display device, the transmission protocol being used to transmit a signal between the playback device and the display device, the first meta data including information that is commonly used for a plurality of images included in a continuous playback unit of a first video signal and relates to a luminance range of the first video signal, the second meta data including information that is commonly used for a unit subdivided compared to the continuous playback unit of the first video signal and relates to the luminance range of the first video signal; and when the version of the transmission protocol is a second version, transmitting the first meta data and the second meta data to the display device.
- FIG. 1 is a view for explaining development of a video technology
- FIG. 2 is a view for explaining an HDR (High-dynamic-range imaging) position
- FIG. 3 is a view illustrating an image example indicating an HDR effect
- FIG. 4 is a view for explaining a relationship between masters, distribution methods and display devices in case of introduction of the HDR;
- FIG. 5 is an explanatory view of a method for determining a code value of a luminance signal to be stored in content, and a process for restoring a luminance value from a code value during playback;
- FIG. 6 is a view illustrating an example indicating HDR meta data
- FIG. 7 is a view illustrating a storage example of static HDR meta data
- FIG. 8 is a view illustrating a storage example of dynamic HDR meta data
- FIG. 9 is a view illustrating a storage example of dynamic HDR meta data
- FIG. 10 is a flowchart of a method for transmitting static HDR meta data
- FIG. 11 is a flowchart of a method for processing HDR meta data
- FIG. 12 is a block diagram illustrating a configuration of a data output device
- FIG. 13 is a view illustrating a data structure example of a SEI (Supplemental Enhancement Information) message in which HDR meta data is stored;
- FIG. 14 is a view illustrating a data structure example of a SEI message in which HDR meta data is stored
- FIG. 15 is a view illustrating a data structure example of a SEI message in which HDR meta data is stored
- FIG. 16 is a block diagram illustrating a configuration example of a data output device
- FIG. 17 is a block diagram illustrating a configuration example of a DR (Dynamic Range) converter
- FIG. 18 is a block diagram illustrating a configuration example of the DR converter
- FIG. 19 is a view illustrating an example of instruction contents of an HDR meta interpreter
- FIG. 20 is a view illustrating an example of instruction contents of the HDR meta interpreter
- FIG. 21 is a view illustrating an example of instruction contents of the HDR meta interpreter
- FIG. 22 is a block diagram illustrating a configuration example of the data output device
- FIG. 23 is a view illustrating a combination example of characteristics of a video signal and the display device, and an output signal of the data output device;
- FIG. 24 is a view illustrating an example of an operation model of playing back various signals and outputting the signals to various TVs
- FIG. 25 is a view illustrating a storage example of static HDR meta data and dynamic HDR meta data
- FIG. 26 is a view illustrating an example of a method for displaying a user guidance
- FIG. 27 is a view illustrating an example of the method for displaying the user guidance
- FIG. 28 is a view illustrating an example of the method for displaying the user guidance
- FIG. 29 is a view illustrating an example of the method for displaying the user guidance
- FIG. 30 is a flowchart of a method for transmitting dynamic HDR meta data which depends on a HDMI (High-Definition Multimedia Interface) (registered trademark and the same applies likewise below) version;
- FIG. 31 is a flowchart of a method for transmitting static HDR meta data which depends on a HDMI version
- FIG. 32 is a flowchart of a method for controlling a luminance value during playback of an HDR signal
- FIG. 33 is a view for explaining a dual disk playback operation
- FIG. 34A is a view illustrating an example of a display process of converting an HDR signal in an HDR TV and displaying an HDR;
- FIG. 34B is a view illustrating an example of a display process of displaying an HDR by using an HDR supporting playback device and an SDR (Standard Dynamic Range) TV;
- FIG. 34C is a view illustrating an example of a display process of displaying an HDR by using an HDR supporting playback device and an SDR TV which are connected to each other via a standard interface;
- FIG. 35 is a view for explaining a process of converting an HDR into a pseudo HDR
- FIG. 36A is a view illustrating an example of an EOTF (Electro-Optical Transfer Function) which supports the HDR and the SDR, respectively;
- FIG. 36B is a view illustrating an example of an inverse EOTF which supports the HDR and the SDR, respectively;
- FIG. 37 is a block diagram illustrating a configuration of a converting device and the display device according to an exemplary embodiment
- FIG. 38 is a flowchart illustrating a converting method and a display method performed by the converting device and the display device according to the exemplary embodiment
- FIG. 39A is a view for explaining first luminance conversion
- FIG. 39B is a view for explaining another example of the first luminance conversion
- FIG. 40 is a view for explaining second luminance conversion
- FIG. 41 is a view for explaining third luminance conversion.
- FIG. 42 is a flowchart illustrating a detailed process of display settings.
- a transmitting method is a method used by a playback device, and includes: when a version of a transmission protocol is a first version, transmitting first meta data to a display device without transmitting second meta data to the display device, the transmission protocol being used to transmit a signal between the playback device and the display device, the first meta data including information that is commonly used for a plurality of images included in a continuous playback unit of a first video signal and relates to a luminance range of the first video signal, the second meta data including information that is commonly used for a unit subdivided compared to the continuous playback unit of the first video signal and relates to the luminance range of the first video signal; and when the version of the transmission protocol is a second version, transmitting the first meta data and the second meta data to the display device.
- According to this transmitting method, it is possible to transmit the appropriate meta data of the first meta data and the second meta data to the display device according to the version of the transmission protocol.
- a conversion process of converting the luminance range of the first video signal may be performed by using the second meta data to obtain a second video signal, and the second video signal may be transmitted to the display device.
- the playback device can perform the conversion process.
- when the version of the transmission protocol is the second version and the display device does not include a function of the conversion process of converting the luminance range of the first video signal by using the second meta data, the conversion process may be performed to obtain a second video signal, and the second video signal may be transmitted to the display device, and when the version of the transmission protocol is the second version and the display device includes the function of the conversion process, the first video signal may be transmitted to the display device without performing the conversion process.
- when the playback device does not include a function of the conversion process of converting the luminance range of the video signal by using the second meta data, the conversion process may not be performed, and the second meta data may not be transmitted to the display device.
- a luminance value of the first video signal may be encoded as a code value
- the first meta data may be the information for specifying an EOTF (Electro-Optical Transfer Function) of associating a plurality of luminance values and a plurality of code values.
- the second meta data may indicate mastering characteristics of the first video signal.
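- As one way to picture the branching described above, the following Python sketch combines the version-dependent meta data transmission with the conversion decisions. The Player/Display objects and their methods (send_control_info, send_video, convert_luminance_range, supports_conversion) are hypothetical names used only for illustration, not part of the disclosure.

```python
FIRST_VERSION = 1
SECOND_VERSION = 2

def transmit(player, display, video, static_md, dynamic_md, version):
    # The first (static) meta data is transmitted in either case.
    player.send_control_info(static_md)

    if version == FIRST_VERSION:
        # First version: the second (dynamic) meta data is not transmitted.
        if player.supports_conversion and dynamic_md is not None:
            # The player may apply the dynamic meta data itself and transmit
            # the converted second video signal.
            video = player.convert_luminance_range(video, dynamic_md)
        player.send_video(video)
    else:
        # Second version: both kinds of meta data may be transmitted.
        if display.supports_conversion:
            # The display performs the conversion; transmit both meta data
            # and the unconverted first video signal.
            player.send_control_info(dynamic_md)
            player.send_video(video)
        elif player.supports_conversion and dynamic_md is not None:
            player.send_video(player.convert_luminance_range(video, dynamic_md))
        else:
            # Neither side converts: transmit the first video signal as is,
            # and do not transmit the second meta data.
            player.send_video(video)
```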
- a playback method for playing back a video signal, a luminance of the video signal being a first luminance value in a first luminance range whose maximum luminance value is defined as a first maximum luminance value exceeding 100 nit, the playback method including: determining whether or not an inter-screen change amount of a luminance value of the video signal exceeds a predetermined first threshold; and adjusting the luminance value of the video signal when it is determined that the change amount exceeds the first threshold.
- According to the playback method, when a luminance value of a video signal exceeds display capability of the display device, it is possible to generate a video signal which the display device can appropriately display by lowering the luminance value of the video signal. Further, according to the playback method, when a large change amount of a luminance value of a video signal is likely to negatively influence viewers, it is possible to reduce the negative influence by lowering the luminance value of the video signal.
- the adjustment may include adjusting, for a pixel whose change amount exceeds the first threshold, a luminance value of the pixel such that the change amount of the pixel is the first threshold or less.
- the determination may include determining whether or not a difference exceeds the first threshold, the difference being a difference between a peak luminance of a first image included in the video signal, and each of luminance values of a plurality of pixels included in the video signal and included in a second image subsequent to the first image, and the adjustment may include adjusting, for a pixel whose difference exceeds the first threshold, a luminance value of the pixel such that the difference of the pixel is the first threshold or less.
- the determination may include determining whether or not the change amount of the luminance value at a reference time interval exceeds the first threshold, the reference time interval being an integer multiple of a reciprocal of a frame rate of the video signal.
- the determination may include determining whether or not a rate of pixels whose change amounts exceed the first threshold with respect to a plurality of pixels exceeds a second threshold, the plurality of pixels being included in an image included in the video signal, and the adjustment may include adjusting, when the rate exceeds the second threshold, the luminance values of a plurality of pixels such that the rate is the second threshold or less.
- the determination may include determining, for each of a plurality of areas obtained by dividing a screen, whether or not the inter-screen change amount of the luminance value of the area exceeds the first threshold, and the adjustment may include performing an adjustment process of lowering a luminance value of an area for which it is determined that the change amount exceeds the first threshold.
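- A minimal sketch of the per-pixel adjustment described above, assuming frames are given as 2-D numpy arrays of linear luminance values (nit); the concrete threshold value and the clamping strategy are illustrative only.

```python
import numpy as np

def limit_interframe_change(prev_frame, curr_frame, first_threshold):
    """Lower the luminance of pixels whose inter-screen change amount
    (difference to the previous frame, in nit) exceeds first_threshold,
    so that their change amount becomes the threshold or less."""
    change = curr_frame - prev_frame
    return np.where(change > first_threshold,
                    prev_frame + first_threshold,  # clamp the increase
                    curr_frame)                    # leave other pixels as-is
```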
- a playback method for playing back a video signal, a luminance of the video signal being a first luminance value in a first luminance range whose maximum luminance value is defined as a first maximum luminance value exceeding 100 nit, the playback method including: determining whether or not a luminance value of an image of the video signal exceeds a predetermined first threshold; and adjusting the luminance value of the image when determining that the luminance value exceeds the first threshold.
- According to the playback method, when a luminance value of a video signal exceeds display capability of the display device, it is possible to generate a video signal which the display device can appropriately display by lowering the luminance value of the video signal. Further, according to the playback method, when a high luminance value of a video signal is likely to negatively influence viewers, it is possible to reduce the negative influence by lowering the luminance value of the video signal.
- the determination may include determining whether or not a number of pixels whose luminance values exceed the first threshold, among a plurality of pixels included in the image, exceeds a third threshold, and the adjustment may include lowering, when the number of pixels exceeds the third threshold, the luminance value of the image such that the number of pixels is the third threshold or less.
- the determination may include determining whether or not a rate of pixels whose luminance values exceed the first threshold, with respect to a plurality of pixels included in the image, exceeds a third threshold, and the adjustment may include lowering, when the rate exceeds the third threshold, the luminance value of the image such that the rate is the third threshold or less.
- the first threshold may be a value calculated based on an upper limit value of a voltage which is simultaneously applicable to a plurality of pixels in a display device that displays the video signal.
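- A minimal sketch of the rate-based variant described above, assuming a 2-D numpy array of linear luminance (nit); lowering by a global scale factor is just one possible way of "lowering the luminance value of the image".

```python
import numpy as np

def limit_bright_pixel_rate(frame, first_threshold, third_threshold_rate):
    """If the rate of pixels brighter than first_threshold exceeds
    third_threshold_rate, scale the whole image down so that at most that
    rate of pixels stays above the threshold."""
    rate = float(np.mean(frame > first_threshold))
    if rate <= third_threshold_rate:
        return frame
    # Luminance at the boundary of the brightest allowed fraction of pixels.
    pivot = np.quantile(frame, 1.0 - third_threshold_rate)
    scale = first_threshold / max(pivot, 1e-6)
    return frame * min(1.0, scale)
```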
- a playback device that transmits a video signal to a display device, and includes one or more memories and circuitry which, in operation, transmits, when a version of a transmission protocol is a first version, first meta data to the display device without transmitting second meta data to the display device, the transmission protocol being used to transmit a signal between the playback device and the display device, the first meta data including information that is commonly used for a plurality of images included in a continuous playback unit of a first video signal and relates to a luminance range of the first video signal, the second meta data including information that is commonly used for a unit subdivided compared to the continuous playback unit of the first video signal and relates to the luminance range of the first video signal; and transmits, when the version of the transmission protocol is a second version, the first meta data and the second meta data to the display device.
- the playback device can transmit appropriate meta data of the first meta data and the second meta data, to the display device according to the version of the transmission protocol.
- a playback device is a playback device that plays back a video signal, a luminance of the video signal being a first luminance value in a first luminance range whose maximum luminance value is defined as a first maximum luminance value exceeding 100 nit, the playback device including one or more memories and circuitry which, in operation, determines whether or not an inter-screen change amount of a luminance value of the video signal exceeds a predetermined first threshold; and adjusts the luminance value of the video signal when it is determined that the change amount exceeds the first threshold.
- the playback device can generate a video signal which the display device can appropriately display by lowering the luminance value of the video signal. Further, when a large change amount of a luminance value of a video signal is likely to negatively influence viewers, the playback device can reduce the negative influence by lowering the luminance value of the video signal.
- a playback device is a playback device that plays back a video signal, a luminance of the video signal being a first luminance value in a first luminance range whose maximum luminance value is defined as a first maximum luminance value exceeding 100 nit, the playback device including one or more memories and circuitry which, in operation, determines whether or not a luminance value of an image included in the video signal exceeds a predetermined first threshold; and adjusts the luminance value of the image when it is determined that the luminance value exceeds the first threshold.
- the playback device can generate a video signal which the display device can appropriately display by lowering the luminance value of the video signal. Further, when a high luminance value of a video signal is likely to negatively influence viewers, the playback device can reduce the negative influence by lowering the luminance value of the video signal.
- these comprehensive or specific aspects may be realized by a system, a method, an integrated circuit, a computer program or a computer-readable recording medium such as a CD-ROM, and may be realized by an arbitrary combination of the system, the method, the integrated circuit, the computer program and the recording medium.
- FIG. 1 is a view for explaining development of the video technology.
- In the following, HDR stands for High Dynamic Range, SDR stands for Standard Dynamic Range, and ITU-R stands for International Telecommunication Union Radiocommunication Sector.
- HDR application targets include broadcasts, package media (Blu-ray (registered trademark which applies likewise below) Discs), and Internet distribution.
- a luminance of a video image which supports the HDR takes a luminance value of an HDR luminance range, and a luminance signal obtained by quantizing the luminance value of the video image will be referred to as an HDR signal.
- a luminance of a video image which supports the SDR takes a luminance value of an SDR luminance range, and a luminance signal obtained by quantizing the luminance value of the video image will be referred to as an SDR signal.
- An HDR (High Dynamic Range) signal, which is an image signal of a higher luminance range than that of conventional image signals, is distributed via a package medium such as a Blu-ray disc in which HDR signals are stored, via broadcasting, or via a distribution medium such as an OTT (Over The Top) service.
- Here, OTT means Web sites provided on the Internet, content and services such as moving images and audio, or providers which provide such content and services.
- Distributed HDR signals are decoded by a Blu-ray device or the like. Further, decoded HDR signals are sent to an HDR supporting display device (TVs, projectors, tablets or smartphones), and the HDR supporting display device plays back HDR video images.
- the HDR technique is only at an early stage, and it is assumed that, after an HDR technique which is introduced first is adopted, a new HDR method is developed. In this case, it is possible to adopt a new HDR method by storing an HDR signal (and meta data) of a newly created HDR method, in an HDR distribution medium. In this case, what is important is “Forward Compatibility” which means that an original device (e.g. Blu-ray device) which does not support a new function can play back an HDR distribution medium in which signals of a new HDR method are stored.
- the present disclosure realizes a method and a device which maintain, for a distribution medium in which a new HDR signal format (meta data) is stored, compatibility by guaranteeing HDR playback of an original technique without changing a decoding device (e.g. Blu-ray device) designed for an original distribution medium, and enable an HDR decoding device (e.g. Blu-ray device) which supports a new method to support a process of a new HDR method.
- FIG. 2 is a view illustrating an HDR position (expansion of a luminance). Further, FIG. 3 illustrates an image example indicating an HDR effect.
- FIG. 4 is a view illustrating a relationship between a flow of creating SDR and HDR home entertainment masters, distribution media and display devices.
- An HDR concept has been proposed, and effectiveness at the level of the concept has been confirmed. Further, a first method for implementing an HDR has been proposed. However, this does not mean that a large amount of HDR content has been created by using this method, so the first implementation method has not yet been substantiated. Hence, when creation of HDR content is earnestly advanced in the future, meta data for an existing HDR creation method, an HDR-SDR converting method or a tone mapping conversion method of a display device is likely to change.
- FIG. 5 is an explanatory view of a method for determining a code value of a luminance signal to be stored in content, and a process for restoring a luminance value from a code value during playback.
- a luminance signal indicating a luminance in the present exemplary embodiment is an HDR signal which supports an HDR.
- a graded image is quantized by an HDR inverse EOTF, and a code value associated with a luminance value of the image is determined. The image is encoded based on this code value, and a video stream is generated. During playback, a stream decoding result is inversely quantized based on the HDR EOTF and is converted into a linear signal, and a luminance value of each pixel is restored.
- Quantization performed by using an HDR inverse EOTF will be referred to as “HDR inverse EOTF conversion” below.
- Inverse quantization performed by using an HDR EOTF will be referred to as “HDR EOTF conversion” below.
- quantization performed by using an SDR inverse EOTF will be referred to as “SDR inverse EOTF conversion” below.
- Inverse quantization performed by using an SDR EOTF will be referred to as "SDR EOTF conversion" below.
- a video conversion processor performs conversion into a luminance value which can be displayed by a video display, by using this luminance value and meta data, and, consequently, the video display can display HDR video images.
- For example, when a peak luminance of an original HDR video image is 2000 nit and a peak luminance of the video display is 800 nit, it is possible to perform conversion and lower the luminance.
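- The quantization and inverse quantization steps of FIG. 5 can be sketched as follows, using the SMPTE ST 2084 (PQ) curve as one commonly used HDR EOTF; the disclosure itself does not mandate a particular curve, so the constants below are an assumption.

```python
# SMPTE ST 2084 (PQ) constants
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32
PEAK = 10000.0  # nit

def hdr_inverse_eotf(luminance_nit):
    """Luminance value (nit) -> non-linear value in [0, 1] (quantization)."""
    y = max(luminance_nit, 0.0) / PEAK
    return ((C1 + C2 * y ** M1) / (1.0 + C3 * y ** M1)) ** M2

def hdr_eotf(value):
    """Non-linear value in [0, 1] -> luminance value (nit) (inverse quantization)."""
    v = value ** (1.0 / M2)
    y = max(v - C1, 0.0) / (C2 - C3 * v)
    return PEAK * y ** (1.0 / M1)

# e.g. a 10-bit code value for 1000 nit:
code_1000_nit = round(hdr_inverse_eotf(1000.0) * 1023)
```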
- an HDR master method is realized by a combination of an EOTF, meta data and an HDR signal. Consequently, it is likely that a more efficient EOTF and meta data will be developed, and that a time will come to adopt an HDR method using such an EOTF and meta data.
- the present disclosure intends to make the HDR popular by reducing a risk that, even when an HDR transmission format is changed as described above, customers who have bought HDR supporting devices have to buy new devices.
- FIG. 6 is a view illustrating an example of HDR meta data.
- HDR meta data includes conversion auxiliary information used to change (DR conversion) a luminance range of a video signal, and HDR control information.
- Each piece of information is either static HDR meta data provided in title units, for example, or dynamic HDR meta data provided in frame units, for example. Further, static HDR meta data is classified as either required meta data (basic data) or selected meta data (extension data), and dynamic HDR meta data is classified as selected meta data.
- Each piece of information will be described in detail below.
- each extension method is designed so as not to influence a playback device (e.g. Blu-ray) of the basic method.
- Parameters of HDR content indicating characteristics during mastering include static HDR meta data which is fixed per title or per playlist, and dynamic HDR meta data which is variable per scene.
- a title and a playlist are information indicating video signals which are continuously played back.
- video signals which are continuously played back will be referred to as continuous playback units.
- static HDR meta data includes at least one of a type of an EOTF function (curve), an 18% Gray value, a Diffuse White value, a Knee point and a Clip point.
- the EOTF is information obtained by associating a plurality of luminance values and a plurality of code values, and is information for changing a luminance range of a video signal.
- Other pieces of information are attribute information related to a luminance of a video signal. Therefore, it may be said that static HDR meta data is information related to a luminance range of a video signal and is information for specifying the luminance range of the video signal.
- the 18% Gray value and the Diffuse White value indicate a luminance value (nit) of a video image whose brightness is a predetermined reference, in other words, indicate a reference brightness of a video image. More specifically, the 18% Gray value indicates a mastered luminance value (nit) of an object whose brightness is 18 nit before the mastering.
- the Diffuse White value indicates a luminance value (nit) corresponding to white.
- Knee point and Clip point are parameters of the EOTF function, and indicate points at which EOTF characteristics change. More specifically, Knee point indicates a change point at which an increase of a luminance value (output luminance) mapped as a luminance of a video signal to the EOTF with respect to an increase of an original luminance value (input luminance) during shooting is a value different from 1:1.
- Knee point is information for specifying a point which is off from a linear change in FIG. 39A described below.
- Clip point indicates a point at which clipping starts in the EOTF function. In this regard, clipping refers to converting an input luminance value of a given value or more into an identical output luminance value.
- Clip point indicates a point at which an output luminance value stops changing in FIG. 39B described below.
- types of the EOTF function include an HDR EOTF and an SDR EOTF illustrated in FIG. 36A , for example.
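- A possible in-memory representation of the static HDR meta data fields listed above is sketched below; the field names and types are illustrative, not a normative syntax.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class StaticHdrMetadata:
    # Required (basic) item: which EOTF function (curve) the signal uses,
    # e.g. an HDR EOTF or an SDR EOTF as in FIG. 36A.
    eotf_type: str
    # Selected (extension) items: attribute information related to luminance.
    gray_18_nit: Optional[float] = None        # 18% Gray value (nit)
    diffuse_white_nit: Optional[float] = None  # Diffuse White value (nit)
    knee_point: Optional[Tuple[float, float]] = None  # (input, output) where the slope leaves 1:1
    clip_point: Optional[float] = None         # input luminance at which clipping starts
```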
- a content data generating method is a content data generating method for generating content data, and includes: a first generating step of generating video signals, and static HDR meta data (first meta data) including information which is commonly used for a plurality of images (video signals configuring continuous playback units) included in the continuous playback units of the video signals, and which relates to a luminance range of the video signals; and a second generating step of generating content data by associating the continuous playback units and the static HDR meta data.
- the information which relates to the luminance range of the video signals is information for converting the luminance range of the video signals.
- the static HDR meta data includes information for specifying an EOTF of associating a plurality of luminance values and a plurality of code values. Furthermore, a luminance value of the video signal is encoded as a code value.
- the static HDR meta data further includes information indicating a luminance value of a video signal whose brightness is a predetermined reference or information indicating a point at which EOTF characteristics change.
- the static HDR meta data includes information (Diffuse White value) indicating a luminance value corresponding to white of a video signal.
- dynamic HDR meta data (second meta data) which is information commonly used for units subdivided compared to the continuous playback units and which is information related to the luminance range of the video signals is further generated.
- the information related to the luminance range of the video signals is information for converting the luminance range of the video signals.
- the dynamic HDR meta data is a parameter indicating mastering characteristics which are different per scene.
- the mastering characteristics described herein indicate a relationship between an original (pre-mastering) luminance and a mastered luminance.
- the parameter indicating the mastering characteristics is same information as the above static HDR meta data, in other words, at least one of pieces of information included in the static HDR meta data.
- FIG. 7 is a view illustrating a storage example of static HDR meta data. This example is an example where static HDR meta data is stored in a playlist in a package medium such as a Blu-ray disc.
- the static HDR meta data is stored as one of items of meta data of each stream to which a reference is made from a playlist.
- the static HDR meta data is fixed in playlist units. That is, the static HDR meta data is associated with each playlist and stored.
- static HDR meta data may be stored in a manifest file to which a reference is made before a stream is obtained. That is, according to the content data generating method according to the present exemplary embodiment, a video signal may be generated as a video stream, and static HDR meta data may be stored in a manifest file to which a reference is made before the video stream is obtained.
- static HDR meta data may be stored in a descriptor indicating a stream attribute. That is, according to the content data generating method according to the present exemplary embodiment, content data may be generated as a video stream, and static HDR meta data may be stored as an identifier indicating a video stream attribute independently from the video stream.
- the static HDR meta data can be stored as a descriptor according to MPEG2-TS (Moving Picture Experts Group 2-Transport Stream).
- When static HDR meta data is fixed per title, the static HDR meta data may be stored as management information indicating a title attribute.
- static HDR meta data for an HDR is stored by using a mechanism which stores various items of meta data in a playlist in a Blu-ray disc.
- a presence of static HDR meta data needs to be defined in a playlist from a viewpoint of application standards or a device such as a Blu-ray device.
- a capacity is defined, and therefore it is difficult to limitlessly store items of static HDR meta data for an HDR option technique.
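- When the static HDR meta data is carried as an MPEG2-TS descriptor (or a similar stream attribute), it might be serialized roughly as below; the descriptor tag and the field layout are hypothetical and only illustrate the tag-length-payload shape of a descriptor.

```python
import struct

STATIC_HDR_DESCRIPTOR_TAG = 0xA0  # hypothetical tag in a user-private range

def pack_static_hdr_descriptor(md):
    """Serialize a StaticHdrMetadata-like object as tag + length + payload."""
    payload = struct.pack(
        ">BHHHH",
        0 if md.eotf_type == "SDR" else 1,              # EOTF type
        int(md.gray_18_nit or 0),                       # 18% Gray (nit)
        int(md.diffuse_white_nit or 0),                 # Diffuse White (nit)
        int(md.knee_point[0]) if md.knee_point else 0,  # Knee point input (nit)
        int(md.clip_point or 0),                        # Clip point (nit)
    )
    return struct.pack(">BB", STATIC_HDR_DESCRIPTOR_TAG, len(payload)) + payload
```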
- FIG. 8 is a view illustrating an example where dynamic HDR meta data is stored in a video stream.
- In MPEG-4 AVC (Advanced Video Coding) or HEVC (High Efficiency Video Coding), the dynamic HDR meta data is stored by using a SEI (Supplemental Enhancement Information) message.
- the dynamic HDR meta data is assumed to be updated per scene.
- a scene head is a head access unit (AU) in random access units such as GOP (Group Of Pictures).
- dynamic HDR meta data may be stored in a head access unit in a decoding order in the random access units.
- the head access unit in the random access units is an IDR (Instantaneous Decoder Refresh) picture or a non-IDR I picture to which a SPS (Sequence Parameter Set) is added.
- a receiving-side device can obtain dynamic HDR meta data by detecting a NAL (Network Abstraction Layer) unit configuring a head access unit in the random access units.
- a unique type may be allocated to SEI in which dynamic HDR meta data is stored.
- a type of the EOTF function may be stored as stream attribute information of a SPS. That is, according to the content data generating method according to the present exemplary embodiment, content data may be generated as a video stream encoded according to HEVC, and information for specifying the EOTF may be stored in a SPS included in a video stream.
- a mechanism which stores option data according to MPEG is used, and dynamic HDR meta data is stored in a video elementary stream.
- the presence of dynamic HDR meta data is not known from a viewpoint of application standards or a device such as a Blu-ray device.
- The area to be used is a SEI area, so it is also possible to store a plurality of items of optional dynamic HDR meta data.
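- On the receiving side, picking up the dynamic HDR meta data might look like the sketch below; the access-unit attributes and the SEI payload type are placeholders, since the disclosure only states that the meta data sits in SEI of the head access unit of a random access unit.

```python
DYNAMIC_HDR_SEI_TYPE = 0x99  # placeholder payload type for dynamic HDR meta data

def extract_dynamic_hdr_metadata(access_units):
    """access_units: iterable of objects with .is_idr, .is_i_picture, .has_sps,
    .pts and .sei_messages (a list of (payload_type, payload_bytes))."""
    for au in access_units:
        # Head access unit of a random access unit: an IDR picture, or a
        # non-IDR I picture to which an SPS is added.
        if au.is_idr or (au.is_i_picture and au.has_sps):
            for payload_type, payload in au.sei_messages:
                if payload_type == DYNAMIC_HDR_SEI_TYPE:
                    # The meta data becomes effective from this access unit.
                    yield au.pts, payload
```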
- FIG. 9 is a view illustrating an example where dynamic HDR meta data is stored in a TS stream format different from that of a main video image.
- Blu-ray has a function of synchronizing and playing back two TS streams.
- This two TS stream synchronizing/playback function includes a 2TS playback function of synchronizing and playing back two individually managed TS streams in a disk, and a 1TS playback function of interleaving two streams to use as one TS stream.
- the playback device can use dynamic HDR meta data in synchronization with main HDR video images. Consequently, a normal HDR player can play back only main HDR video images, and obtain video images of standard HDR quality. Further, an option supporting HDR player can play back high gradation HDR quality video images by using dynamic HDR meta data stored in a TS.
- dynamic HDR meta data is stored in an auxiliary TS stream by using a mechanism which stores two TS streams of Blu-ray.
- a presence of dynamic HDR meta data is recognized as a TS stream from a viewpoint of application standards or a device such as a Blu-ray device.
- FIG. 10 is a view illustrating a method for transmitting static HDR meta data.
- FIG. 10 illustrates a flowchart illustrating an example of an operation of transmitting an HDR signal to a display device from a playback device such as a BD player (Blu-ray device) or a recorder according to a transmission protocol such as HDMI.
- the playback device obtains the static HDR meta data from content management information during a start of playback of a title or a playlist, and stores and transmits the obtained static HDR meta data as HDMI control information. That is, prior to a start of transmission of a video signal configuring a title or a playlist, the playback device obtains static HDR meta data corresponding to the title or the playlist, and transmits the obtained static HDR meta data as HDMI control information (S 402 ). More generally, the playback device may transmit static HDR meta data as initialization information when a HDMI initialization process between the playback device and the display device is performed.
- the playback device transmits a video stream corresponding to the static HDR meta data (S 403 ).
- transmitted static HDR meta data is effective for this video stream.
- a video stream transmitting method is a video stream transmitting method for transmitting a video stream (video stream), and includes: an obtaining step of obtaining content data including video signals, and static HDR meta data (first meta data) which is information which is commonly used for a plurality of images included in the continuous playback units, and which relates to a luminance range of the video signals; and a transmitting step of transmitting a video stream corresponding to the video signals, and static HDR meta data.
- the video stream and the static HDR meta data are transmitted according to a HDMI communication protocol.
- dynamic HDR meta data is transmitted as part of a video stream (SEI).
- the playback device may transmit dynamic HDR meta data as a HDMI control signal at a timing at which the dynamic HDR meta data becomes effective.
- the playback device provides identifiers for the static HDR meta data and the dynamic HDR meta data so that they can be distinguished from each other, and transmits the static HDR meta data and the dynamic HDR meta data.
- a data structure of a container for storing dynamic HDR meta data may be defined in a control signal, and a copy of contents of SEI may be enabled as payload data of the container. Consequently, it is possible to support even an update of a syntax of dynamic HDR meta data included in SEI without changing the mounted playback device such as a BD player.
- dynamic HDR meta data stored in a TS stream is synthesized with a main HDR video signal by some method, and is transmitted as a new video signal (high gradation HDR video images in an example in FIG. 9 ) according to HDMI.
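- The transmission flow of FIG. 10 can be summarized in the sketch below; the hdmi_link/playlist objects and their methods are hypothetical, and the step numbers refer to the flowchart.

```python
def play_title(hdmi_link, playlist):
    # S401/S402: obtain static HDR meta data from the content management
    # information at the start of playback of the title or playlist, and
    # transmit it as HDMI control information (it may also be sent as
    # initialization information during the HDMI initialization process).
    static_md = playlist.management_info.static_hdr_metadata
    hdmi_link.send_control_info(static_md)

    # S403: transmit the video stream to which that static meta data applies.
    # Dynamic HDR meta data stays embedded in the stream (SEI); alternatively
    # it could be re-sent as a control signal when it becomes effective.
    for access_unit in playlist.video_stream:
        hdmi_link.send_video(access_unit)
```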
- FIG. 11 is a flowchart illustrating an example of a method for processing HDR meta data in case where the display device displays an HDR signal.
- the display device obtains static HDR meta data from HDMI control information (S 411 ), and determines a method for displaying an HDR signal based on the obtained static HDR meta data (S 412 ).
- When the static HDR meta data cannot be obtained, the display device determines a method for displaying an HDR signal based on a predetermined value of application standards or default settings of the display device. That is, according to the video display method according to the present exemplary embodiment, when static HDR meta data cannot be obtained, a video display method matching a video signal is determined based on the predetermined value or the settings.
- When dynamic HDR meta data is obtained, the display device updates the method for displaying an HDR signal based on the dynamic HDR meta data (S 414 ). That is, according to the video display method according to the present exemplary embodiment, when static HDR meta data is obtained, a display method is determined based on the obtained static HDR meta data to display video images. Further, when dynamic HDR meta data is obtained, the display method determined based on the static HDR meta data is updated to the display method determined based on the dynamic HDR meta data to display video images. Alternatively, a display method may be determined based on both of static HDR meta data and dynamic HDR meta data.
- the display device may operate based only on static HDR meta data. Further, even when the display device supports obtaining dynamic HDR meta data, the display device cannot update a method for displaying an HDR signal in synchronization with a presentation time stamp (PTS) of an access unit in which meta data is stored in some cases. In this case, after obtaining meta data, the display device may update a display method from an access unit displayed subsequent to the earliest time at which the display method can be updated.
- HDR meta data may be configured by a basic portion and an extension portion, and a parameter may be updated or added by changing the extension portion without updating the basic portion. That is, each of static HDR meta data and dynamic HDR meta data may include a plurality of versions, and may include a basic portion which is commonly used between a plurality of versions, and an extension portion which differs per version. By so doing, it is possible to secure backward compatibility of the display device based on HDR meta data of the basic portion.
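- The display-side handling of FIG. 11 might be organized as in the sketch below; the helper methods on the display object are placeholders for device-specific logic.

```python
def display_hdr_stream(display, hdmi_link, video_stream):
    static_md = hdmi_link.receive_control_info()        # S411
    if static_md is not None:
        method = display.method_from_static(static_md)  # S412
    else:
        # Fall back to a predetermined value of the application standards or
        # the display device's default settings.
        method = display.default_method()

    for au in video_stream:
        dynamic_md = getattr(au, "dynamic_hdr_metadata", None)  # S413
        if dynamic_md is not None:
            # S414: update the display method. If the update cannot be applied
            # at this access unit's PTS, apply it from the earliest access
            # unit at which the update becomes possible.
            method = display.method_from_dynamic(static_md, dynamic_md)
        display.show(au, method)
```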
- the video display method is a video display method for displaying video images based on video streams, and includes: an obtaining step of obtaining a video stream corresponding to the video signals, and static HDR meta data (first meta data); and a display step of determining a display method for displaying the video images corresponding to the video signals based on the static HDR meta data and displaying the video image.
- a luminance value of the video signal is encoded as a code value.
- Static HDR meta data includes information for specifying an EOTF of associating a plurality of luminance values and a plurality of code values.
- video images are generated by using the EOTF specified by the static HDR meta data and converting the code value indicated by the video signal into a luminance value
- FIG. 12 is a block diagram illustrating a configuration of data output device 400 such as a BD player which outputs HDR signals.
- HDR meta data input to data output device 400 includes characteristics data indicating mastering characteristics of an HDR signal, and conversion auxiliary data indicating a tone mapping method for converting an HDR signal into an SDR signal or for converting a dynamic range of the HDR signal.
- These two types of items of meta data are stored as static HDR meta data or dynamic HDR meta data as described with reference to FIGS. 7 and 8 .
- the static HDR meta data is stored in at least one of content management information and a video stream.
- Data output device 400 includes video decoder 401 , external meta obtaining unit 402 , HDR meta interpreter 403 , HDR control information generator 404 , DR converter 405 and HDMI output unit 406 .
- Video decoder 401 generates a video signal (first video signal) by decoding a video stream which is a video encoded stream, and outputs the resulting video signal to DR converter 405 . Further, video decoder 401 obtains HDR meta data (second meta data) (static HDR meta data or dynamic HDR meta data) in the video stream. More specifically, video decoder 401 outputs, to HDR meta interpreter 403 , HDR meta data stored in a SEI message or the like according to MPEG-4 AVC or HEVC.
- External meta obtaining unit 402 obtains static HDR meta data (first meta data) stored in the content management information such as a playlist, and outputs the obtained static HDR meta data to HDR meta interpreter 403 .
- Dynamic HDR meta data which can be changed in predetermined units which enable random access, such as a playitem, may also be stored in the content management information.
- external meta obtaining unit 402 obtains dynamic HDR meta data from the content management information, and outputs the obtained dynamic HDR meta data to HDR meta interpreter 403 .
- HDR meta interpreter 403 determines a type of HDR meta data output from video decoder 401 or external meta obtaining unit 402 , outputs characteristics data to HDR control information generator 404 and outputs conversion auxiliary data to DR converter 405 .
- When both of video decoder 401 and external meta obtaining unit 402 obtain static HDR meta data, only the static HDR meta data output from external meta obtaining unit 402 may be used as effective meta data. That is, there is a case where the first meta data obtained by external meta obtaining unit 402 and the second meta data obtained by video decoder 401 are static HDR meta data which is commonly used for a plurality of images included in continuous playback units of the first video signal.
- HDR meta interpreter 403 obtains characteristics data and conversion auxiliary data by analyzing the first meta data.
- HDR meta interpreter 403 may use static HDR meta data as effective meta data when external meta obtaining unit 402 obtains the static HDR meta data, and may overwrite static HDR meta data over the effective meta data when video decoder 401 obtains the static HDR meta data. That is, there is a case where the first meta data obtained by external meta obtaining unit 402 and the second meta data obtained by video decoder 401 are static HDR meta data which is commonly used for a plurality of images included in continuous playback units of the first video signal. In this case, when only the first meta data of the first meta data and the second meta data is obtained, HDR meta interpreter 403 obtains characteristics data and conversion auxiliary data by analyzing the first meta data. When the second meta data is obtained, HDR meta interpreter 403 switches meta data to use from the first meta data to the second meta data.
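- The two priority rules described for HDR meta interpreter 403 could be expressed as below; the flag name is an assumption used only to distinguish the two behaviours.

```python
def select_effective_static_metadata(external_md, in_stream_md,
                                     external_only=True):
    """Choose the effective static HDR meta data.

    external_only=True : only the meta data obtained by external meta
        obtaining unit 402 (first meta data) is used.
    external_only=False: the external meta data is used first, and is
        overwritten when video decoder 401 obtains static HDR meta data
        from the stream (second meta data)."""
    if external_only:
        return external_md
    return in_stream_md if in_stream_md is not None else external_md
```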
- HDR control information generator 404 generates HDR control information according to HDMI based on the characteristics data, and outputs the generated HDR control information to HDMI output unit 406 .
- an output timing of HDR control information in HDMI output unit 406 is determined such that it is possible to output HDR control information in synchronization with a video signal whose meta data is effective. That is, HDMI output unit 406 outputs HDR control information in synchronization with a video signal (video signal) whose meta data is effective.
- DR converter 405 converts a decoded video signal into an SDR signal and converts a dynamic range based on conversion auxiliary data.
- DR converter 405 does not need to perform conversion.
- data output device 400 may determine whether or not the conversion process is necessary. When it is determined that the conversion process is unnecessary, the first video signal obtained by video decoder 401 is input to HDMI output unit 406 without passing through DR converter 405 .
- HDMI output unit 406 outputs the first video signal and the HDR control information to the display device when the display device connected to data output device 400 supports a video output of a luminance range of the HDR signal (first video signal). Further, HDMI output unit 406 outputs a second video signal obtained by converting an HDR into an SDR, and the HDR control information to the display device when the display device connected to data output device 400 does not support a video output of a luminance range of the HDR signal (first video signal). Furthermore, HDMI output unit 406 determines whether or not the display device supports a video output of a luminance range of an HDR signal (first video signal) by a transmission protocol (e.g. HDMI) initialization process.
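- The output decision in HDMI output unit 406 reduces to the branch sketched below; the display_caps object stands in for whatever capability information the transmission protocol (e.g. HDMI) initialization process provides.

```python
def output_to_display(hdmi_output, dr_converter, hdr_video, hdr_control_info,
                      display_caps):
    if display_caps.supports_hdr_luminance_range:
        # The display supports the luminance range of the HDR (first) signal:
        # output it together with the HDR control information.
        hdmi_output.send(hdr_video, hdr_control_info)
    else:
        # Otherwise convert HDR to SDR and output the second video signal,
        # still accompanied by the HDR control information.
        sdr_video = dr_converter.convert(hdr_video)
        hdmi_output.send(sdr_video, hdr_control_info)
```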
- HDMI output unit 406 outputs the video signal output from DR converter 405 or video decoder 401 and the HDR control information according to a HDMI protocol.
- data output device 400 can use the same configuration even when receiving and outputting broadcast or OTT content. Further, when data output device 400 and the display device are included in a single device, HDMI output unit 406 is not necessary.
- data output device 400 includes external meta obtaining unit 402 which obtains meta data from control information or the like, and video decoder 401 includes a function of obtaining meta data from a video stream.
- data output device 400 may include one of external meta obtaining unit 402 and the function.
- data output device 400 may output data according to an arbitrary transmission protocol.
- data output device 400 includes: a decoder (video decoder 401 ) which generates a first video signal of a first luminance range (HDR) by decoding a video stream; an obtaining unit (at least one of video decoder 401 and external meta obtaining unit 402 ) which obtains first meta data related to a luminance range of the first video signal; an interpreter (HDR meta interpreter 403 ) which obtains characteristics data indicating the luminance range of the first video signal by interpreting the first meta data;
- a control information generator (HDR control information generator 404 ) which converts the characteristics data into HDR control information according to a predetermined transmission protocol (e.g. HDMI); and an output unit (HDMI output unit 406 ) which outputs the HDR control information according to the predetermined transmission protocol.
- data output device 400 can generate the control information based on the characteristics data included in the meta data.
- the interpreter (HDR meta interpreter 403 ) further obtains conversion auxiliary data for converting a luminance range of a first video signal, by interpreting the first meta data.
- Data output device 400 further includes a converter (DR converter 405 ) which generates a second video signal of a luminance range narrower than the luminance range of the first video signal by converting the luminance range of the first video signal based on the conversion auxiliary data.
- the output unit (HDMI output unit 406 ) further outputs at least one of the first video signal and the second video signal according to the predetermined transmission protocol.
- data output device 400 can change the luminance range of the first video signal by using the conversion auxiliary data included in the meta data.
- the decoder (video decoder 401 ) obtains the second meta data (HDR meta data) related to the luminance range of the first video signal from the video stream.
- the interpreter (HDR meta interpreter 403 ) obtains characteristics data and the conversion auxiliary data by analyzing at least one of the first meta data and the second meta data.
- Static HDR meta data includes required meta data and selected meta data, and dynamic HDR meta data includes only selected meta data. That is, static HDR meta data is used at all times, and dynamic HDR meta data is selectively used.
- the first meta data obtained by external meta obtaining unit 402 or the second meta data obtained by video decoder 401 includes static HDR meta data (static meta data) which is commonly used for a plurality of images included in continuous playback units of a video signal and includes characteristics data.
- HDR control information generator 404 converts the characteristics data included in the static HDR meta data into HDR control information according to the predetermined transmission protocol.
- HDMI output unit 406 outputs the HDR control information based on the static HDR meta data when outputting the first video signal (HDR signal).
- the first meta data obtained by external meta obtaining unit 402 or the second meta data obtained by video decoder 401 further includes dynamic HDR meta data (dynamic meta data) which is commonly used for units subdivided compared to the continuous playback units of the video signal and includes characteristics data.
- HDR control information generator 404 converts the characteristics data included in the static HDR meta data and the characteristics data included in the dynamic HDR meta data, into HDR control information according to the predetermined transmission protocol.
- HDMI output unit 406 outputs the HDR control information based on the static HDR meta data and the dynamic HDR meta data when outputting the first video signal (HDR signal).
- a data generating method is a data generating method performed by a data generating device, and includes: a first generating step of generating meta data related to a luminance range of a video signal; and a second generating step of generating a video stream including a video signal and meta data.
- Meta data includes characteristics data indicating the luminance range of the video signal, and conversion auxiliary data for converting the luminance range of the video signal.
- FIG. 13 is a view illustrating a data structure example of a SEI message in which HDR meta data is stored.
- an HDR meta data dedicated SEI message may be defined. That is, meta data may be stored in a meta data dedicated message.
- HDR meta data may be stored in a general-purpose user data storage SEI message, and information (HDR extension identification information described below) indicating that the HDR meta data is stored in a payload portion of the message may be provided.
- HDR meta data includes static HDR meta data and dynamic HDR meta data. Further, flag information indicating whether or not static HDR meta data is stored, and flag information indicating whether or not dynamic HDR meta data is stored may be provided. Thus, it is possible to use three types of storage methods including a method for storing only static HDR meta data, a method for storing only dynamic HDR meta data and a method for storing both of the static HDR meta data and the dynamic HDR meta data.
- Meta data includes basic data (a basic portion) which needs to be interpreted, and extension data (an extension portion) whose interpretation is optional.
- type information indicating a type of meta data (basic data or extension data) and a size are included in header information, and a format of a container in which meta data is stored in a payload is defined. That is, meta data includes a payload, information indicating whether payload data is basic data or extension data, and information indicating a payload data size.
- meta data includes type information indicating a type of meta data. For example, basic data is stored in a container whose type value is 0. Further, a value equal to or more than 1 is allocated as a type value to the extension data, and this value indicates a type of the extension data.
- the data output device and the display device refer to the type value, and obtain data of the container which the data output device and the display device can interpret. That is, the data output device (or the display device) determines whether or not the data output device (or the display device) can interpret meta data, by using the type information, and obtains characteristics data and conversion auxiliary data by interpreting the meta data when the data output device (or the display device) can interpret the meta data.
- meta data may be generated such that a maximum size of HDR meta data is set in advance and a total sum of sizes of basic data and extension data is the maximum size or less. That is, a maximum value of a data size of meta data is defined, and, according to the data generating method according to the present disclosure, the meta data is generated such that a total data size of the basic data and the extension data is the maximum value or less.
- the data output device and the display device include memories which support this maximum size and, consequently, can guarantee that all HDR meta data can be stored in the memories.
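A minimal sketch of parsing such a bounded basic/extension container structure is shown below. The one-byte type and size fields and the 128-byte maximum are assumptions made only to make the layout concrete; they are not values taken from the specification.

```python
# Illustrative parser for the basic/extension container layout described above.
# Field widths and MAX_HDR_META_SIZE are assumptions for this sketch.

MAX_HDR_META_SIZE = 128  # assumed upper bound on basic + extension data (bytes)

def parse_hdr_meta_containers(payload: bytes):
    """Return a list of (kind, data) tuples; type 0 is basic data, 1 or more is extension data."""
    containers, offset, total = [], 0, 0
    while offset + 2 <= len(payload):
        c_type, c_size = payload[offset], payload[offset + 1]
        data = payload[offset + 2 : offset + 2 + c_size]
        total += len(data)
        if total > MAX_HDR_META_SIZE:
            raise ValueError("basic + extension data exceed the defined maximum size")
        kind = "basic" if c_type == 0 else f"extension-{c_type}"
        containers.append((kind, data))
        offset += 2 + c_size
    return containers
```

A device that cannot interpret a given extension type can simply skip that container by its size field, which is the behaviour the type and size header is meant to enable.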
- Such a data structure may be used to store HDR meta data in content management information.
- FIG. 14 is a view illustrating an example of a data structure in a case where HDR meta data is stored in a user data storage SEI message.
- the data structure is the same as the data structure in FIG. 13 except that a message includes HDR extension identification information and an extension type ID.
- the HDR extension identification information indicates that the message includes HDR meta data.
- An extension type ID indicates an HDR meta data version or the like. That is, meta data is stored in a SEI message according to HEVC, and the SEI message includes HDR extension identification information indicating whether or not the SEI message includes meta data.
- the data output device copies and outputs the received SEI message according to a protocol of an output I/F such as HDMI for the display device. That is, when a SEI message including HDR extension identification information indicating that meta data is included in the SEI message is obtained, and the data output destination display device supports an input of the HDR control information, the data output device outputs the SEI message as is according to a predetermined transmission protocol (e.g. HDMI).
- the data output device can output HDR meta data to the display device. According to this configuration, even when a new DR conversion process is developed in future to define new HDR meta data and a display device which supports this new HDR meta data is connected to a data output device which does not support new HDR meta data, it is possible to output new HDR meta data from the data output device to the display device. Further, the display device can perform a DR conversion process matching new HDR meta data.
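A rough sketch of that pass-through decision follows; the SEI message is modelled as a dictionary, and the field and helper names are assumptions rather than actual HEVC or HDMI data structures.

```python
# Sketch: forward the user data storage SEI message unchanged when it carries HDR
# meta data and the display accepts HDR control information; otherwise interpret
# it locally (e.g. for DR conversion in the data output device). Names assumed.

def handle_user_data_sei(sei_message, display_accepts_hdr_control, interpret_locally):
    if sei_message.get("hdr_extension_id_present") and display_accepts_hdr_control:
        return ("forward_as_is", sei_message)
    return ("interpreted", interpret_locally(sei_message))
```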
- FIG. 15 is a view illustrating an example of a data structure in a case where a plurality of items of HDR meta data is stored in one user data storage SEI message.
- SEI message a plurality of items of HDR meta data for a plurality of conversion modes (methods) related to conversion of a dynamic range (luminance range) is stored.
- Compared to the data structure illustrated in FIG. 14 , the data structure illustrated in FIG. 15 additionally includes a field (a number of conversion modes) indicating the number of conversion modes for which HDR meta data is provided. Further, a plurality of items of HDR meta data corresponding to the respective conversion modes is stored in order subsequent to the number of conversion modes.
- the data generating method is a data generating method performed by the data generating device, and includes: a first generating step of generating one or more items of meta data (HDR meta data) matching one or more conversion modes of converting a luminance range of a video signal; and a second generating step of generating a video stream including a video signal, the one or more items of meta data, and the number of conversion modes indicating the number of one or more conversion modes.
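The second generating step for the FIG. 15 layout might be sketched as below; the one-byte counts and lengths are assumptions chosen only to make the "number of conversion modes followed by per-mode HDR meta data" structure tangible.

```python
# Sketch of the FIG. 15 layout: a number-of-conversion-modes field followed by
# one HDR meta data item per conversion mode. Byte-level encoding is assumed.

def pack_hdr_meta_items(items):
    """items: list of (mode_id, metadata_bytes) pairs, one per conversion mode."""
    out = bytes([len(items)])                      # number of conversion modes
    for mode_id, meta in items:
        out += bytes([mode_id, len(meta)]) + meta  # mode id, length, payload
    return out
```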
- FIG. 16 is a block diagram illustrating a configuration example of data output device 500 according to the present exemplary embodiment.
- This data output device 500 includes video decoder 501 , external meta obtaining unit 502 , HDR meta interpreter 503 , HDR control information generator 504 , DR converter 505 and HDMI output unit 506 .
- operations of HDR meta interpreter 503 and DR converter 505 are different from those of data output device 400 illustrated in FIG. 12 . That is, operations of video decoder 501 , external meta obtaining unit 502 , HDR control information generator 504 and HDMI output unit 506 are the same as operations of video decoder 401 , external meta obtaining unit 402 , HDR control information generator 404 and HDMI output unit 406 .
- data output device 500 is connected with display device 510 (display), and outputs generated video signals and HDR control information to display device 510 according to a predetermined transmission protocol such as HDMI.
- HDR meta interpreter 503 obtains static HDR meta data and dynamic HDR meta data from external meta obtaining unit 502 and video decoder 501 . In content management information or encoded video stream, a plurality of items of HDR meta data for a plurality of conversion modes is stored. HDR meta interpreter 503 determines a plurality of conversion modes matching a plurality of HDR meta data as a plurality of usable conversion modes.
- HDR meta interpreter 503 obtains information of a conversion mode of an HDR signal supported by display device 510 by communicating with display device 510 or via a network. Furthermore, HDR meta interpreter 503 determines (1) which one of data output device 500 and display device 510 performs a dynamic range conversion process and (2) a conversion mode to use, based on (1) a conversion mode matching HDR meta data, (2) a conversion mode supported by DR converter 505 and (3) a conversion mode supported by display device 510 .
- DR converter 505 converts an HDR signal into an SDR signal according to the conversion mode instructed by HDR meta interpreter 503 .
- data output device 500 transmits a video signal (HDR signal) to display device 510 , and transmits HDR meta data which is necessary for conversion as a HDMI control signal (HDR control information) to display device 510 .
- DR converter 505 supports a plurality of conversion modes. However, DR converter 505 only needs to support one or more conversion modes. In this case, data output device 500 only needs to obtain one or more items of HDR meta data matching one or more conversion modes.
- data output device 500 includes: a decoder (video decoder 501 ) which generates a first video signal by decoding a video stream; an obtaining unit (at least one of video decoder 501 and external meta obtaining unit 502 ) which obtains one or more items of meta data matching one or more first conversion modes of converting a luminance range of the video signal; an interpreter (HDR meta interpreter 503 ) which obtains characteristics data indicating a luminance range of the first video signal and conversion auxiliary data for converting the luminance range of the first video signal, by interpreting one of one or more items of first meta data; a control information generator (HDR control information generator 504 ) which converts the characteristics data into HDR control information according to a predetermined transmission protocol (e.g. HDMI); a converter (DR converter 505 ) which supports one or more second conversion modes of converting a luminance range of a video signal, and generates a second video signal of a luminance range narrower than the luminance range of the first video signal by performing a process of converting the luminance range of the first video signal according to one of the one or more second conversion modes based on the conversion auxiliary data; and an output unit (HDMI output unit 506 ) which outputs the second video signal and the HDR control information to display device 510 according to the predetermined transmission protocol.
- the interpreter (HDR meta interpreter 503 ) further determines which one of data output device 500 and display device 510 performs the above conversion process based on the one or more first conversion modes, the one or more second conversion modes and a third conversion mode which is supported by display device 510 and converts a luminance range of a video signal.
- data output device 500 can determine which one of data output device 500 and display device 510 performs a conversion process based on the first conversion mode matching one or more items of meta data, the second conversion mode supported by data output device 500 and the third conversion mode supported by display device 510 . Consequently, data output device 500 can determine a device which appropriately performs a conversion process.
- the one or more second conversion modes supported by data output device 500 may include at least part of a plurality of first conversion modes matching the one or more items of meta data, or may not include any one of the one or more first conversion modes.
- the third conversion mode supported by display device 510 may include at least part of the one or more first conversion modes, or may not include any one of the one or more first conversion modes.
- the third conversion mode may include at least part of the one or more second conversion modes, or may not include any one of the one or more second conversion modes.
- FIG. 17 is a block diagram illustrating the configuration example of DR converter 505 .
- This DR converter 505 includes mode determining unit 511 , N mode processors 512 and conversion result output unit 513 .
- N mode processors 512 each support a corresponding one of N conversion modes (processing methods), and perform the process of the corresponding conversion mode.
- Mode determining unit 511 obtains a conversion mode instructed by HDR meta interpreter 503 , and determines mode processor 512 which performs a conversion process. That is, mode determining unit 511 selects mode processor 512 which supports the conversion mode instructed by HDR meta interpreter 503 .
- Determined mode processor 512 generates an SDR signal (converted video signal) by performing a process of converting an HDR signal (video signal).
- Conversion result output unit 513 outputs the converted SDR signal.
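The dispatch inside DR converter 505 could look roughly like the sketch below, with conversion modes represented by identifiers and each mode processor 512 by a callable; these structures and names are illustrative assumptions.

```python
# Sketch of DR converter 505: mode determining unit 511 selects the mode processor
# that supports the instructed conversion mode. Structures and names are assumed.

class DRConverter:
    def __init__(self, mode_processors):
        # mode_processors: dict of conversion-mode id -> callable(hdr_signal, hdr_meta) -> sdr_signal
        self.mode_processors = mode_processors

    def convert(self, hdr_signal, instructed_mode, hdr_meta):
        processor = self.mode_processors.get(instructed_mode)  # mode determining unit 511
        if processor is None:
            raise ValueError(f"conversion mode {instructed_mode} is not supported")
        return processor(hdr_signal, hdr_meta)                  # conversion result output
```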
- FIG. 18 is a block diagram illustrating a configuration example of DR converter 505 A which is another example of DR converter 505 .
- This DR converter 505 A includes mode determining unit 521 , basic processor 522 , N extension mode processors 523 and conversion result output unit 524 .
- Basic processor 522 performs a default conversion process which is a common process among N conversion modes.
- N extension mode processors 523 perform the process performed by basic processor 522 and, in addition, an extension process of dynamically controlling parameters of the conversion process by using dynamic HDR meta data. Further, N extension mode processors 523 each support a corresponding one of N conversion modes, and perform the extension process of the corresponding conversion mode. For example, basic processor 522 operates by using only static HDR meta data, and each extension mode processor 523 operates by using static HDR meta data and, in addition, dynamic HDR meta data.
- FIGS. 19 and 20 are views illustrating examples of instruction contents of HDR meta interpreter 503 based on a conversion mode of providing HDR meta data, whether or not data output device 500 supports each mode and whether or not display device 510 supports each mode.
- HDR meta interpreter 503 basically selects an operation which maximizes reproducibility for a master image, from selectable combinations.
- the master image refers to an image output without changing a luminance range.
- data output device 500 supports mode 1 and mode 2 , and display device 510 does not support any conversion mode.
- mode 2 has higher reproducibility for the master image.
- HDR meta interpreter 503 learns reproducibility of each mode for a master image in advance. In this case, HDR meta interpreter 503 determines that data output device 500 performs a conversion process, and selects mode 2 of the higher reproducibility between mode 1 and mode 2 .
- In another example, data output device 500 supports mode 1 , and display device 510 supports mode 1 and mode 2 .
- HDR meta interpreter 503 determines that display device 510 performs a conversion process, and selects mode 2 of the higher reproducibility between mode 1 and mode 2 .
- data output device 500 outputs HDR meta data matching a conversion process of mode 2 as HDMI control information (HDR control information) to display device 510 .
- Display device 510 performs a conversion process of mode 2 by using the control information.
- HDR meta interpreter 503 further determines as a conversion mode of a conversion process to be performed by data output device 500 a conversion mode which is included in the one or more first conversion modes matching the one or more items of meta data and which is included in the one or more second conversion modes supported by data output device 500 . More specifically, HDR meta interpreter 503 further determines as a conversion mode of a conversion process to be performed by data output device 500 or display device 510 a conversion mode which is included in the one or more first conversion modes matching the one or more items of meta data and which is included in at least one of the one or more second conversion modes supported by data output device 500 and the third conversion mode supported by display device 510 .
- HDR meta interpreter 503 determines as a conversion mode of a conversion process to be performed by data output device 500 or display device 510 a conversion mode of the highest reproducibility for a master image among a plurality of conversion modes included in a plurality of first conversion modes and included in at least one of a plurality of second conversion modes and the third conversion mode.
- data output device 500 selects a mode of the highest reproducibility among conversion modes supported by data output device 500 and display device 510 , and determines that one device of data output device 500 and display device 510 supporting the selected mode performs a conversion process.
- HDR meta interpreter 503 determines that data output device 500 performs a conversion process when the determined conversion mode of the conversion process is included in the second conversion modes and is not included in the third conversion mode. Further, as illustrated in FIG. 20 , HDR meta interpreter 503 determines that display device 510 performs a conversion process when the determined conversion mode of the conversion process is included in the third conversion mode and is not included in the second conversion modes.
- data output device 500 can determine a conversion mode to use based on the first conversion modes matching one or more items of meta data, the second conversion modes supported by the data output device and the third conversion mode supported by the display device. Further, data output device 500 can select the conversion mode of the highest reproducibility for a master image and, consequently, can improve quality of video images to be displayed.
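A compact sketch of this selection logic is given below. The reproducibility scores and the preference applied when both devices support the chosen mode are assumptions; the text only fixes the cases where exactly one device supports it.

```python
# Sketch of the decision made by HDR meta interpreter 503: pick the usable mode
# with the highest reproducibility and assign the conversion to a device that
# supports it. Scores and tie-breaking are assumptions for this illustration.

def choose_conversion(first_modes, output_device_modes, display_modes, reproducibility):
    candidates = [m for m in first_modes if m in output_device_modes or m in display_modes]
    if not candidates:
        return None, None                      # no usable conversion mode
    best = max(candidates, key=lambda m: reproducibility.get(m, 0))
    if best in output_device_modes and best not in display_modes:
        performer = "data output device"
    elif best in display_modes and best not in output_device_modes:
        performer = "display device"
    else:
        performer = "display device"           # both support it: assumed preference
    return best, performer
```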
- FIG. 21 is a view illustrating an example where a conversion process is determined according to whether or not data output device 500 can obtain parameters of display device 510 .
- a parameter of display device 510 is a peak luminance of display device 510 (a maximum value of a luminance range which display device 510 can display) or a display mode which display device 510 can display. More specifically, as a display mode, this parameter indicates the display mode that is currently being used for viewing.
- the display modes include a normal mode, a dynamic mode and a cinema mode.
- data output device 500 supports mode 1 , mode 2 and mode 3 , and display device 510 supports mode 1 . Further, data output device 500 can obtain parameters of display device 510 for mode 1 and mode 2 , and cannot obtain a parameter of display device 510 for mode 3 . Furthermore, mode 2 has higher reproducibility than that of mode 1 , and mode 3 has higher reproducibility than that of mode 2 .
- a mode of the highest reproducibility among the modes supported by data output device 500 and display device 510 is mode 3 .
- data output device 500 cannot obtain the parameter of display device 510 for mode 3 , and therefore mode 3 is excluded.
- data output device 500 selects mode 2 whose reproducibility is the second highest to that of mode 3 and whose parameter can be obtained, as a conversion mode to use.
- data output device 500 obtains parameters which are necessary for mode 2 from display device 510 , and performs a conversion process of mode 2 by using the obtained parameters.
- HDR meta interpreter 503 further determines a conversion mode of a conversion process performed by data output device 500 or display device 510 according to whether or not it is possible to obtain from display device 510 parameters for each of a plurality of first conversion modes matching a plurality of items of meta data. More specifically, HDR meta interpreter 503 determines as a conversion mode of a conversion process to be performed by data output device 500 or display device 510 a conversion mode which is included in a plurality of first conversion modes and included in at least one of a plurality of second conversion modes and the third conversion mode, and which makes it possible to obtain the parameters from display device 510 .
- data output device 500 selects a mode of the highest reproducibility among the conversion modes supported by data output device 500 and display device 510 , and determines whether or not it is possible to obtain a parameter of display device 510 for the selected mode when only data output device 500 supports the selected mode. When the parameter can be obtained, data output device 500 selects this mode. Meanwhile, when the parameter cannot be obtained, data output device 500 selects another mode (a mode of the second highest reproducibility).
- data output device 500 determines a conversion mode to use according to whether or not it is possible to obtain the parameter of display device 510 and, consequently, can select a more appropriate conversion mode.
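Building on the previous sketch, the availability of display parameters (e.g. peak luminance or the current display mode) acts as an additional filter, as in the hypothetical helper below.

```python
# Sketch: exclude conversion modes whose required parameters cannot be obtained
# from display device 510, then pick the best remaining mode. Names are assumed.

def choose_mode_with_parameters(candidate_modes, reproducibility, can_obtain_parameters):
    usable = [m for m in candidate_modes if can_obtain_parameters(m)]
    if not usable:
        return None
    return max(usable, key=lambda m: reproducibility.get(m, 0))
```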
- FIG. 22 is a block diagram illustrating a configuration of data output device 500 A.
- This data output device 500 A further includes DC 507 compared to data output device 500 illustrated in FIG. 16 .
- DC 507 down-converts a resolution of a video signal obtained by video decoder 501 . For example, when a video signal is 4K, DC 507 down-converts a 4K video signal into a 2K video signal.
- data output device 500 A can selectively perform, according to a resolution and a dynamic range supported by display device 510 , an operation of (1) converting a 4K HDR signal into a 2K HDR signal to output, (2) converting the 4K HDR signal into a 2K HDR signal and then converting the dynamic range in DR converter 505 to output, or (3) converting a 4K SDR signal into a 2K SDR signal to output. That is, data output device 500 A can switch an operation according to a resolution of display device 510 and whether or not display device 510 supports an HDR signal.
- FIG. 23 is a view illustrating an example of combinations of characteristics of a video signal of content (a resolution and a dynamic range (luminance range)), characteristics of display device 510 and an output signal of data output device 500 A.
- Data output device 500 A selects a format of an output signal to match a resolution of display device 510 and whether or not display device 510 supports an HDR signal, and controls DC 507 and DR converter 505 to generate an output signal of the selected format.
- For example, when a video signal of content is an HDR signal of a 4K resolution, and display device 510 does not support displaying the HDR signal of the 4K resolution but supports displaying an HDR signal of a 2K resolution, data output device 500 A converts the video signal of the content into an HDR signal of the 2K resolution and outputs the converted signal (see the combination example in the second row in FIG. 23 ). In this case, DC 507 converts the resolution of the video signal.
- Further, when a video signal of content is an HDR signal of a 4K resolution, and display device 510 supports displaying neither the HDR signal of the 4K resolution nor an HDR signal of a 2K resolution but supports displaying a 2K SDR signal, data output device 500 A converts the video signal of the content into an SDR signal of the 2K resolution and outputs the converted signal (see the combination example in the third row in FIG. 23 ). In this case, DC 507 converts the resolution of the video signal, and DR converter 505 converts the luminance range.
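The format selection of FIG. 23 can be summarized by a sketch such as the one below. It simplifies resolution-dependent HDR support into a single flag and represents resolutions by horizontal pixel counts; both simplifications are assumptions made for illustration.

```python
# Sketch of the output-format selection of data output device 500 A (cf. FIG. 23):
# DC 507 lowers the resolution and DR converter 505 narrows the luminance range
# when display device 510 cannot handle the content as is. Simplified on purpose.

def select_output_format(content_width, content_is_hdr, display_max_width, display_supports_hdr):
    output_width = min(content_width, display_max_width)      # down-convert if needed (DC 507)
    output_is_hdr = content_is_hdr and display_supports_hdr   # otherwise convert HDR -> SDR
    return output_width, ("HDR" if output_is_hdr else "SDR")

# Example: 4K (3840) HDR content on a 2K (1920) SDR-only display -> (1920, "SDR")
```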
- display device 510 can more faithfully reproduce video signals of content.
- data output device 500 A may convert a resolution or display device 510 may operate to convert a dynamic range as described with reference to FIG. 16 .
- data output device 500 A includes a down-converter (DC 507 ) which generates a third video signal by lowering a resolution of the first video signal obtained by video decoder 501 .
- the converter (DR converter 505 ) further generates a fourth video signal of a luminance range narrower than a luminance range of the third video signal by performing a process of converting the luminance range of the third video signal according to one of a plurality of second conversion modes based on the conversion auxiliary data.
- the output unit (HDMI output unit 506 ) further outputs the third video signal or the fourth video signal to display device 510 .
- data output device 500 A can change a resolution of a video signal to, for example, a resolution suitable to display device 510 or the like.
- When display device 510 does not support displaying a video image of a resolution of the first video signal, (1) the down-converter (DC 507 ) generates the third video signal and (2) the output unit (HDMI output unit 506 ) outputs the third video signal to display device 510 . For example, when a resolution of a video signal is 4K and a resolution of display device 510 is 2K, a 2K output signal is output.
- display device 510 when display device 510 does not support displaying a video image of a luminance range (HDR) of the first video signal, (1) the converter (DR converter 505 ) generates the second video signal of a luminance range (SDR) narrower than the luminance range (HDR) of the first video signal, and (2) the output unit (HDMI output unit 506 ) outputs the second video signal and HDR control information to display device 510 .
- For example, when a dynamic range (luminance range) of a video signal is an HDR and display device 510 does not support the HDR (in case of an SDR) as illustrated in FIG. 23 , an HDR video signal is converted into an SDR video signal, and the SDR video signal (output signal) is output.
- display device 510 does not support displaying a video image of the first video signal, and does not support displaying a video image of the luminance range (HDR) of the first video signal
- the down-converter (DC 507 ) generates a third video signal
- the converter (DR converter 505 ) generates the fourth video signal of a luminance range (SDR) narrower than the luminance range (HDR) of the third video signal
- the output unit (HDMI output unit 506 ) outputs the fourth video signal to display device 510 .
- For example, when a resolution of a video signal is 4K, a dynamic range (luminance range) of the video signal is an HDR, the resolution of display device 510 is 2K, and display device 510 does not support the HDR (in case of an SDR) as illustrated in FIG. 23 , a 2K SDR output signal is output.
- FIG. 24 is a view illustrating an example of an operation model of playing back a 4K HDR signal, a 2K HDR signal and a 4K SDR signal in a next-generation Blu-ray playback device, and outputting playback signals to an HDR supporting 4K TV, an HDR non-supporting 4K TV and an SDR supporting 2K TV.
- the Blu-ray playback device obtains static HDR meta data stored in content management information, and dynamic HDR meta data stored in a video encoded stream. By using these items of HDR meta data, the Blu-ray playback device converts a video HDR signal into an SDR signal to output according to characteristics of an output destination TV connected according to HDMI, or outputs HDR meta data as a HDMI control signal.
- Each of the process of converting an HDR signal into an SDR signal and the process of converting an HDR signal into a video signal of a luminance range matching a display device can be implemented by selecting one of a plurality of methods.
- the content management information or the encoded stream can store a plurality of items of HDR meta data per converting method.
- the Blu-ray playback device may include a plurality of conversion processors such as option conversion module B or option conversion module D, may include only one conversion processor by taking into account a balance between device cost and performance or may not include a conversion processor.
- an HDR supporting TV may include a plurality of conversion processors, may include only one conversion processor or may not include a conversion processor.
- HDR meta data is stored in a predetermined container which defines a format and an operation during an input. Consequently, even when a new conversion process is developed in the future, new HDR meta data is defined, and a display device which supports this new HDR meta data is connected to a Blu-ray device which does not support the new HDR meta data, it is possible to output the new HDR meta data from the Blu-ray playback device to the display device. Further, the display device can perform a conversion process matching the new HDR meta data. Consequently, when a new technique is developed, it is possible to support the new technique by a simple process of assigning an ID to the new HDR meta data.
- the Blu-ray playback device which supports new HDR meta data may perform the new conversion process on video data in the playback device, and output the processed video data to the display device.
- the playback device may down-convert a 4K signal into a 2K signal according to a resolution of the TV to output.
- FIG. 25 is a view illustrating an example of a method for storing static HDR meta data and two items of dynamic HDR meta data.
- In an extendable HDR method according to the present exemplary embodiment, three items of meta data, (a) static HDR meta data, (b) a dynamic HDR meta data clip (dynamic HDR meta data) and (c) dynamic HDR meta data, are used.
- Static HDR meta data is stored in a meta data storage area of each stream (a playlist in case of a BDA (Blu-ray Disc Association)) defined by application standards of the BDA or the like or a distribution system.
- a dynamic HDR meta data clip (dynamic HDR meta data) is stored in a secondary use TS stream defined by application standards of the BDA or the like or a distribution system.
- Dynamic HDR meta data is stored as a SEI message included in a video stream such as HEVC.
- the proponent of the new HDR technique considers compatibility to widely spread the new technique.
- only the items of meta data (a) and (b) are used.
- the proponent discloses details of the technique.
- a draft of a specification for Blu-ray to adapt the technique to Blu-ray is submitted.
- a draft of a test specification for Blu-ray to adapt the technique to Blu-ray is submitted.
- a test stream is provided.
- a test disk is provided.
- a verifier is updated.
- the BDA registers the new technique as an official option, annexes the new technique to written standards and tests compatibility at minimum.
- the BDA permits an announcement that the new technique is adopted as an official option by the BDA.
- FIG. 26 is a view illustrating a method for displaying a user guidance in a Blu-ray device which executes an HDR-SDR conversion process.
- An algorithm of an HDR-SDR conversion process is not established, and therefore it is difficult to accurately perform HDR-SDR conversion in a current situation. Further, it is also possible to implement a plurality of algorithms of an HDR-SDR conversion process.
- a guide message such as “The disk is an HDR supporting disk. Your TV is HDR non-supporting TV, and SDR video image converted from HDR into SDR by the Blu-ray device is played back instead of HDR video image.” is displayed.
- the data output device (Blu-ray device) outputs the second video signal (SDR signal) converted from a first luminance range into a second luminance range, and HDR control information to the display device, and causes the display device to display something to the effect that the second video signal converted from the first luminance range into the second luminance range is displayed.
- FIG. 27 is a view illustrating a method for displaying a user guidance during execution of a process of converting an HDR stored in a disk into an SDR.
- a message (menu) which needs to be displayed by a Blu-ray device when an HDR-SDR conversion process is performed is stored in an HDR disk or a non-volatile memory in the Blu-ray device. Consequently, the Blu-ray device can display a message during execution of an HDR-SDR conversion process. In this case, for example, a message such as “The disk is an HDR supporting disk. Your TV is HDR non-supporting TV, and SDR video image converted from HDR into SDR by the Blu-ray device is played back instead of HDR video image.” is displayed.
- FIG. 28 is a view illustrating a method for displaying a user guidance menu during execution of a process of converting an HDR stored in a disk into an SDR.
- the Blu-ray device can display a message such as “The disk is HDR supporting disk. Your TV is HDR non-supporting TV, and SDR video image converted from HDR into SDR by the Blu-ray device is played back instead of HDR video image. Would you like to play back SDR video image?”.
- When the user selects “Play back”, the Blu-ray device starts displaying the converted video image. Further, when the user selects “Do not play back”, the Blu-ray device stops playback, and displays a message which encourages the user to insert an HDR non-supporting Blu-ray disc.
- the data output device causes the display device to display a message which encourages the user to select whether or not to display the second video signal (SDR signal) converted from the first luminance range into the second luminance range.
- FIG. 29 is a view illustrating a method for displaying a user guidance menu which enables selection of a processing method during execution of a process of converting an HDR stored in a disk into an SDR.
- the Blu-ray device displays something to the effect that meta data for an HDR-SDR conversion process is stored in Blu-ray when the meta data is stored in Blu-ray.
- the Blu-ray device displays a message indicating that higher-quality conversion is possible. That is, according to a Java (registered trademark) command in a disk, it is determined which HDR-SDR conversion processes are implemented in the Blu-ray device. Consequently, the Blu-ray device can display a selection menu of an HDR-SDR conversion processing method, such as “The disk is HDR supporting disk. Your TV is HDR non-supporting TV, and SDR video image converted from HDR into SDR by the Blu-ray device is played back instead of HDR video image. Which method do you choose? (Play back by process 1 ), (Play back by process 3 ) and (Do not play back)”. Here, process 1 and process 3 are different types of HDR-SDR conversion processes.
- the data output device causes the display device to display a message which encourages the user to select one of a plurality of converting methods for converting the first luminance range into the second luminance range.
- a TV or a playback device which does not support an HDR signal displays, by using a data broadcast application, a message indicating that a broadcast program uses HDR signals and cannot be accurately displayed when the program is viewed. Further, a TV or a playback device which supports an HDR signal may not display this message. Furthermore, a tag value indicating a message attribute indicates that the message is a warning message for HDR signals. A TV or a playback device which supports HDR signals determines that it is not necessary to display the message by referring to the tag value.
- dynamic HDR meta data or static HDR meta data adopts a data structure which can be transmitted according to a transmission protocol such as HDMI.
- whether or not it is possible to transmit HDR meta data to the display device according to the transmission protocol is determined.
- When the playback device such as an HDR supporting Blu-ray device or a broadcast receiving device is connected with a display device such as a TV via HDMI2.1, the playback device can transmit dynamic HDR meta data to the display device. Meanwhile, when the playback device and the display device are connected according to HDMI of an older version than 2.1, the playback device cannot transmit the dynamic HDR meta data to the display device.
- the playback device determines whether or not a HDMI version which can establish connection with the display device supports transmission of dynamic HDR meta data.
- the playback device performs an HDR-SDR conversion process by using dynamic HDR meta data, and then outputs the converted signal to the display device according to HDMI.
- the playback device may operate also based on whether or not the display device supports a conversion process performed by using dynamic HDR meta data. That is, when the display device does not support a conversion process, and even when the playback device can transmit dynamic HDR meta data according to a HDMI version, the playback device may perform a conversion process. Further, when the playback device does not support a conversion process performed by using dynamic HDR meta data, the playback device may not perform a conversion process and may not transmit the dynamic HDR meta data to the display device, either.
- FIG. 30 is a flowchart illustrating a method of the playback device for transmitting dynamic HDR meta data.
- the playback device determines whether or not the playback device and the display device are connected according to HDMI2.0 or an older version than HDMI2.0 (S 501 ).
- the playback device determines whether or not the playback device and the display device can be connected according to HDMI2.1 which supports transmission of dynamic HDR meta data. More specifically, the playback device determines whether or not both of the playback device and the display device support HDMI2.1.
- the playback device When the playback device and the display device are connected according to HDMI2.0 or an older version than HDMI2.0 (Yes in S 501 ), the playback device performs a conversion process by using dynamic HDR meta data and transmits converted image data to the display device according to HDMI (S 502 ).
- the conversion process described herein is a process of changing a luminance range of image data, and is a process of converting an HDR into an SDR to match a luminance range supported by the display device or a process of converting an HDR into an HDR signal of a narrower luminance range.
- Meanwhile, when the playback device and the display device are connected according to a version newer than HDMI2.0 (No in S 501 ), the playback device transmits image data for which a conversion process is not yet performed, and dynamic HDR meta data to the display device according to HDMI by using different types of packets (S 503 ).
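This branch of FIG. 30 could be sketched as follows; the version tuples and helper names are assumptions, and actual HDMI packetization is considerably more involved.

```python
# Sketch of FIG. 30: with HDMI 2.0 or older the playback device converts the
# luminance range itself (S502); with a newer link it sends the unconverted video
# and the dynamic HDR meta data as separate packet types (S503). Names assumed.

def send_for_dynamic_hdr(video, dynamic_meta, hdmi_version, convert_with_meta):
    if hdmi_version <= (2, 0):
        return [("video", convert_with_meta(video, dynamic_meta))]           # S502
    return [("video", video), ("dynamic_hdr_meta_packet", dynamic_meta)]     # S503
```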
- Infoframe such as AVI (Auxiliary Video Information) Infoframe can be used to transmit static HDR meta data according to HDMI.
- a maximum data size which can be stored in AVI Infoframe is 27 bytes according to HDMI2.0, and therefore data having a larger size than this maximum data size cannot be processed.
- the playback device transmits data for which a conversion process has been performed, to the display device.
- In this case, the playback device determines, based on a HDMI version for connecting the playback device and the display device, whether to transmit static HDR meta data to the display device or to perform a conversion process in the playback device.
- the static HDR meta data may be classified into a required portion and an extension portion, and a size of the required portion may be set to a size or less which can be transmitted according to a specific version of a specific transmission protocol such as existing HDMI2.0.
- the playback device may transmit only the required portion to the display device when using HDMI2.0, and may transmit the required portion and the extension portion together when using HDMI2.1.
- identification information indicating that static HDR meta data includes a required portion and an extension portion, or indicating that at least the required portion can be transmitted according to a specific version such as HDMI2.0, may be stored in a database such as PlayList or PlayItem in a Blu-ray disc.
- static HDR meta data may be set to a size or less which can be transmitted according to a lowest version such as HDMI2.0 which enables transmission of static HDR meta data.
- a syntax of static HDR meta data in a disk stored in management information such as a playlist or video stream SEI, and a syntax of static HDR meta data which is transmitted according to HDMI may be different.
- the playback device converts the static HDR meta data in the disk into the syntax of the static HDR meta data according to the transmission protocol to output.
- FIG. 31 is a flowchart illustrating a method of the playback device for transmitting static HDR meta data.
- the playback device determines whether or not the playback device and the display device are connected according to HDMI2.0 or an older version than HDMI2.0 (S 511 ).
- the playback device When the playback device and the display device are connected according to HDMI2.0 or an older version than HDMI2.0 (Yes in S 511 ), the playback device transmits only a required portion of static HDR meta data to the display device according to HDMI (S 512 ).
- Meanwhile, when the playback device and the display device are connected according to a version newer than HDMI2.0 (No in S 511 ), the playback device transmits both of the required portion and the extension portion of the static HDR meta data to the display device according to HDMI (S 513 ).
- the playback device switches whether or not to transmit dynamic HDR meta data to the display device according to a HDMI version, yet transmits at least a required portion of static HDR meta data to the display device at all times irrespectively of the HDMI version.
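The corresponding flow of FIG. 31 could be sketched in the same style, again with assumed names and a simplified version check.

```python
# Sketch of FIG. 31: the required portion of static HDR meta data is always sent;
# the extension portion is added only when the link is newer than HDMI 2.0.

def send_static_hdr(required_portion, extension_portion, hdmi_version):
    packets = [("static_hdr_required", required_portion)]            # S512 (always sent)
    if hdmi_version > (2, 0):
        packets.append(("static_hdr_extension", extension_portion))  # S513
    return packets
```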
- the playback device transmits a video signal to the display device.
- When a version of a transmission protocol which connects the playback device and the display device is a first version (e.g. HDMI2.0), the playback device transmits, to the display device, first meta data (static HDR meta data) which is information which is commonly used for a plurality of images included in continuous playback units of the video signal and relates to a luminance range of the video signal, without transmitting, to the display device, second meta data (dynamic HDR meta data) which is information which is commonly used for units subdivided compared to the continuous playback units of the video signal and relates to the luminance range of the video signal.
- Meanwhile, when the version of the transmission protocol is the second version (e.g. HDMI2.1), the playback device transmits both of the first meta data (static HDR meta data) and the second meta data (dynamic HDR meta data) to the display device.
- the playback device can transmit appropriate meta data to the display device according to the version of the transmission protocol.
- Further, when the version of the transmission protocol is the first version (e.g. HDMI2.0) (Yes in S 501 ), the playback device performs a process of converting a luminance range of a video signal by using the second meta data (dynamic HDR meta data), and transmits the converted video signal to the display device (S 502 ).
- Consequently, the playback device can appropriately perform a conversion process.
- the playback device when the version of the transmission protocol is the second version (e.g. HDMI2.1) and the display device does not support a conversion process, the playback device performs a conversion process, transmits the converted video signal to the display device and does not transmit the second meta data to the display device. Furthermore, when the version of the transmission protocol is the second version (e.g. HDMI2.1) and the display device supports a conversion process, the playback device transmits a video signal and the second meta data to the display device without performing a conversion process.
- Consequently, an appropriate one of the playback device and the display device can execute a conversion process.
- the playback device when the playback device does not support a conversion process of converting a luminance range of a video signal by using the second meta data (dynamic HDR meta data), the playback device transmits the video signal to the display device without performing the conversion process, and does not transmit the second meta data (dynamic HDR meta data) to the display device.
- the playback device may control a peak luminance of a video image by taking into account performance of a panel or a signal processing circuit of the display device such as a TV or an influence on a human body.
- the process described below may be performed by the playback device such as a Blu-ray device or may be performed by the display device such as a TV.
- the playback device described below only needs to have a function of playing back video images, and includes the above-described playback device (e.g. Blu-ray device) and the display device (e.g. TV).
- the playback device may control a luminance value of each pixel to play back an HDR signal based on following playback conditions.
- the playback device adjusts a luminance value such that an inter-screen luminance change amount at reference time interval T is threshold P or less.
- Reference time interval T described herein is, for example, an integer multiple of a reciprocal of a video frame rate.
- Threshold P is an absolute value of a luminance or a rate of change of a luminance value. This threshold P is determined based on an influence which a flash of an image has on a human body, or on the follow-up performance of a TV panel with respect to a change of a signal value.
- Conditions may be set such that a rate of pixels whose inter-screen luminance value change amounts exceed threshold P with respect to all pixels in a screen is a predetermined rate or less.
- the screen may be divided into a plurality of areas, and the same or different conditions may be set per area.
- the playback device adjusts a luminance value such that a number of pixels which have luminances of reference luminance S or more or a rate that these pixels occupy in total pixels in a screen is threshold Q or less.
- Reference luminance S and threshold Q are determined based on an influence on a human body or an upper limit value of a voltage which is simultaneously applicable to each pixel of a TV panel.
- a method for controlling a pixel value according to the first method will be described below.
- a peak luminance of a plurality of pixels configuring a frame at time t is assumed to be L1.
- The playback device adjusts a luminance value for each pixel whose absolute value of the difference between L1 and I(i,j), the luminance value of the pixel at position (i, j) in the frame following the frame at time t, exceeds threshold P, such that the difference becomes threshold P or less.
- This process may be performed on an entire screen or may be performed per area by dividing the screen to perform processes in parallel. For example, the playback device divides the screen in a horizontal direction and a vertical direction, respectively, and adjusts a luminance value such that a change amount of a luminance in each area is threshold P or less.
- For example, a frame interval at which images are displayed on a TV panel may be used as reference time interval T.
- Further, a predetermined time constant may be set, and the playback device may determine the luminance value (L1 described above) by weighting the peak luminance of each frame within a range of the set time constant.
- a time constant and a weighting coefficient are set in advance such that a change amount of a luminance value is threshold P or less.
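A minimal sketch of enforcing the first playback condition (limiting the inter-screen change relative to L1) is shown below. It limits only increases relative to L1, on the assumption that the purpose is to suppress sudden flashes; since the text states the condition on the absolute difference, a symmetric clamp could equally be used. NumPy is used purely for brevity.

```python
# Sketch of the first playback condition: cap each pixel of the frame at time t + T
# so that it does not exceed the previous frame's peak luminance L1 by more than P.
# Units (nit) and the one-sided clamp are assumptions for this illustration.

import numpy as np

def limit_interframe_change(next_frame: np.ndarray, peak_l1: float, p: float) -> np.ndarray:
    """next_frame: per-pixel luminance values of the frame at time t + T."""
    return np.minimum(next_frame, peak_l1 + p)
```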
- As a method for adjusting a luminance value, the first method is a method for clipping a luminance value for each pixel whose luminance value exceeds a predetermined value. For example, a luminance value of each pixel whose luminance value exceeds the predetermined value is adjusted to the predetermined value.
- the second method is a method for entirely lowering a luminance value of each pixel in the screen such that a relative luminance value rate between pixels is held as much as possible by, for example, setting Knee point instead of uniformly clipping each luminance value.
- a luminance value of a high luminance portion may be lowered while a luminance value of a low luminance portion is held.
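The two adjustment methods just described could be sketched as follows; the linear compression above the knee point is one possible realization and is not specified in the text.

```python
# Sketch of the two luminance-lowering methods: (1) clip values above a limit,
# (2) keep values below a knee point and compress the range above it so that
# relative luminance levels between pixels are largely preserved.

import numpy as np

def clip_method(luminance: np.ndarray, limit: float) -> np.ndarray:
    return np.minimum(luminance, limit)

def knee_method(luminance: np.ndarray, knee: float, limit: float, peak: float) -> np.ndarray:
    """Keep values <= knee; map [knee, peak] linearly onto [knee, limit] (assumed curve)."""
    scale = (limit - knee) / max(peak - knee, 1e-6)
    return np.where(luminance <= knee, luminance, knee + (luminance - knee) * scale)
```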
- For example, suppose that a luminance value of an HDR signal of content is 400 nit in area A (4 mega pixels), which is half of the screen, and is 1000 nit in area B (4 mega pixels), which is the other half of the screen.
- a luminance value of each pixel in a frame of a video or a still image may be determined such that the conditions of the above first method or second method are satisfied.
- FIG. 32 is a flowchart of a method for controlling a luminance value during playback of an HDR signal.
- the playback device determines whether or not an inter-screen luminance value change amount or an intra-screen luminance value satisfies playback conditions (S 521 ). More specifically, as described above, the playback device determines whether or not the inter-screen luminance value change amount is the threshold or less or the intra-screen luminance value is the threshold or less.
- the playback device When the inter-screen luminance value change amount or an intra-screen luminance value satisfies the playback conditions, i.e., when the inter-screen luminance change amount is the threshold or less or the intra-screen luminance value is the threshold or less (Yes in S 521 ), the playback device outputs a signal of the same luminance value as a luminance value of an input HDR signal (S 522 ). That is, the playback device outputs a luminance value of an HDR signal without adjusting the luminance value.
- Meanwhile, when the playback conditions are not satisfied (No in S 521 ), the playback device adjusts a luminance value of each pixel so as to satisfy the playback conditions, and outputs the adjusted luminance value (S 523 ). That is, the playback device adjusts the luminance value of each pixel such that the inter-screen luminance value change amount is the threshold or less or the intra-screen luminance value is the threshold or less.
- a luminance of a video signal is a first luminance value in a first luminance range whose maximum luminance value is defined as a first maximum luminance value exceeding 100 nit. That is, the video signal is an HDR signal.
- the playback device determines whether or not an inter-screen luminance value change amount of a video signal exceeds a predetermined first threshold (S 521 ). For example, the playback device determines whether or not the luminance value change amount at a reference time interval which is an integer multiple of a reciprocal of a frame rate of the video signal exceeds the first threshold.
- the playback device When it is determined that the luminance value change amount exceeds the first threshold (No in S 521 ), the playback device performs an adjustment process of lowering the luminance value of the video signal (S 523 ). More specifically, for a pixel whose luminance value change amount exceeds the first threshold, the playback device adjusts a luminance value of the pixel such that the luminance value change amount of the pixel is the first threshold or less.
- the playback device can generate a video signal which the display device can appropriately display by lowering the luminance value of the video signal. Further, when a large change amount of a luminance value of a video signal is likely to negatively influence viewers, the playback device can reduce the negative influence by lowering the luminance value of the video signal.
- Further, in step S 521 , the playback device may determine whether or not a difference between a peak luminance of a first image included in the video signal and each of luminance values of a plurality of pixels included in a second image subsequent to the first image in the video signal exceeds the first threshold.
- In step S 523 , for a pixel whose difference exceeds the first threshold, the playback device adjusts a luminance value of the pixel such that the difference of the pixel is the first threshold or less.
- Further, in step S 521 , the playback device determines whether or not a rate of pixels whose luminance value change amounts exceed the first threshold with respect to a plurality of pixels included in an image included in the video signal exceeds a second threshold.
- In step S 523 , when the rate exceeds the second threshold, the playback device adjusts luminance values of the plurality of pixels such that the rate is the second threshold or less.
- Further, in step S 521 , for each of a plurality of areas obtained by dividing the screen, the playback device determines whether or not an inter-screen luminance value change amount of the area exceeds the first threshold.
- In step S 523 , the playback device performs an adjustment process of lowering a luminance value of each area for which it is determined that the luminance value change amount exceeds the first threshold.
- the playback device determines whether or not a luminance value of an image included in a video signal exceeds the predetermined first threshold (S 521 ). When it is determined that the luminance value of each pixel exceeds the first threshold (No in S 521 ), the playback device performs an adjustment process of lowering the luminance value of the image (S 523 ).
- the playback device can generate a video signal which the display device can appropriately display by lowering the luminance value of the video signal. Further, when a high luminance value of a video signal is likely to negatively influence viewers, the playback device can reduce the negative influence by lowering the luminance value of the video signal.
- In step S 521, the playback device may determine the number of pixels whose luminance values exceed the first threshold among a plurality of pixels included in an image.
- In step S 523, when the number of pixels whose luminance values exceed the first threshold exceeds a third threshold, the playback device lowers the luminance value of the image such that the number of pixels whose luminance values exceed the first threshold is the third threshold or less.
- Alternatively, in step S 521, the playback device may determine a rate of pixels whose luminance values exceed the first threshold with respect to a plurality of pixels included in the image.
- In step S 523, when the rate exceeds the third threshold, the playback device lowers the luminance value of the image such that the rate is the third threshold or less.
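- The rate-based variant of steps S 521 and S 523 can likewise be sketched as follows; uniform scaling of the whole image is an assumed strategy, since the text only requires that the rate of bright pixels be brought down to the third threshold or less.

```python
import numpy as np

def lower_if_too_many_bright_pixels(luma, first_threshold, third_threshold_rate):
    """Sketch of the rate-based adjustment (not the disclosed algorithm itself).

    luma: 2-D array of pixel luminance values in nit.
    first_threshold: luminance above which a pixel counts as bright (nit, > 0).
    third_threshold_rate: maximum allowed fraction of bright pixels (0..1).
    """
    rate = np.mean(luma > first_threshold)        # S521: fraction of bright pixels
    if rate <= third_threshold_rate:
        return luma                               # nothing to adjust
    # S523: pick the pixel value below which (1 - rate) of the pixels lie and
    # scale the image so that value lands on first_threshold; afterwards at
    # most third_threshold_rate of the pixels remain above the threshold.
    flat = np.sort(luma, axis=None)
    k = int(np.ceil((1.0 - third_threshold_rate) * (flat.size - 1)))
    return luma * (first_threshold / flat[k])

# Example: 5 of 6 pixels exceed 500 nit; limit that rate to 10%.
frame = np.array([[100, 600, 700], [800, 900, 1000]], dtype=float)
adjusted = lower_if_too_many_bright_pixels(frame, 500.0, 0.1)
print(np.mean(adjusted > 500.0))  # 0.0
```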
- the first threshold, the second threshold and the third threshold are values calculated based on an upper limit value of a voltage which is simultaneously applicable to a plurality of pixels in a display device which displays video signals.
- To store static HDR meta data in a video stream by using SEI, the static HDR meta data may be stored in the head access unit, in decoding order, of each random access unit such as a GOP.
- In this case, a NAL unit including the SEI is arranged prior to a NAL unit in which encoded video data is stored, in the decoding order.
- To store the dynamic HDR meta data in both of management information such as a playlist and a video stream, the same meta data is used as these two items of dynamic HDR meta data.
- Dynamic HDR meta data can be switched in units of random access units and is fixed within each random access unit.
- SEI in which dynamic HDR meta data is stored is stored in the head access unit of each random access unit. To start playback from a middle of a stream, decoding starts from the head of a random access unit. Further, during special playback such as high-speed playback in which only I pictures and P pictures are played back, the head access unit of each random access unit is decoded at all times.
- the playback device can obtain HDR meta data at all times.
- Static HDR meta data and dynamic HDR meta data may be stored in different SEI messages. The two SEI messages are identified based on the type of the SEI message or on identification information included in the payload of the SEI message.
- In this case, the playback device can extract only the SEI message including the static HDR meta data, and transmit the meta data included in its payload as it is over HDMI. Consequently, the playback device does not need to perform a process of analyzing the payload of the SEI message and obtaining the static HDR meta data.
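- As a purely hypothetical sketch of the pass-through behavior described above, the playback device could filter demultiplexed SEI messages by payload type and forward the static HDR payload untouched. The payload type numbers and the (payload_type, payload) representation below are placeholders, not values stated in this text.

```python
# Placeholder payload type numbers; the real identification information is
# carried in the SEI message type or payload as described above.
STATIC_HDR_PAYLOAD_TYPE = 137    # assumed, e.g. a mastering-display style SEI
DYNAMIC_HDR_PAYLOAD_TYPE = 999   # assumed

def extract_static_hdr_sei(sei_messages):
    """Return the payload bytes of the first SEI message identified as static
    HDR meta data, without analysing the payload, so it can be forwarded
    as-is (e.g. over HDMI)."""
    for payload_type, payload in sei_messages:
        if payload_type == STATIC_HDR_PAYLOAD_TYPE:
            return payload
    return None

# Example: (payload_type, payload) tuples as they might come from a demuxer.
msgs = [(DYNAMIC_HDR_PAYLOAD_TYPE, b'\x01\x02'),
        (STATIC_HDR_PAYLOAD_TYPE, b'\xaa\xbb')]
print(extract_static_hdr_sei(msgs))  # b'\xaa\xbb'
```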
- FIG. 33 is a view for explaining multiplexed data stored in a dual disk.
- HDR signals and SDR signals are stored as different multiplexed streams in the dual disk.
- In an optical disk such as a Blu-ray disc, items of data of a plurality of media such as a video, an audio, captions and graphics are stored as one multiplexed stream according to an MPEG-2 TS-based multiplexing method called M2TS.
- a reference is made to these multiplexed streams from playback control meta data such as a playlist.
- a player analyzes meta data to select a multiplexed stream to play back or individual language data stored in the multiplexed stream.
- This example is a case where HDR and SDR playlists are individually stored, and the respective playlists refer to HDR signals or SDR signals.
- identification information indicating that both of HDR signals and SDR signals are stored may be additionally indicated.
- Data such as an audio, a caption or graphics needs to be stored for each multiplexed stream, and a data amount increases compared to a case where data is multiplexed into one video stream.
- However, the increase in the data amount can be mitigated by encoding the video by using HEVC (High Efficiency Video Coding), whose compression performance is higher than that of MPEG-4 AVC which is conventionally used for Blu-ray.
- In a dual disk, storing a combination of a 2K HDR and a 2K SDR, or a combination of a 4K SDR and a 2K HDR (i.e., two 2K streams or a combination of 2K and 4K) may be permitted, while storing two 4K streams may be banned, so that only combinations which can be stored in an optical disk are permitted.
- A Blu-ray device which plays back a 4K supporting BD or an HDR supporting BD needs to support four types of TVs: a 2K_SDR supporting TV, a 2K_HDR supporting TV, a 4K_SDR supporting TV and a 4K_HDR supporting TV. More specifically, the Blu-ray device needs to support three pairs of HDMI/HDCP (High-bandwidth Digital Content Protection) standards (HDMI1.4/HDCP1.4, HDMI2.0/HDCP2.1 and HDMI2.1/HDCP2.2).
- When playing back four types of Blu-ray discs (a 2K_SDR supporting BD, a 2K_HDR supporting BD, a 4K_SDR supporting BD and a 4K_HDR supporting BD), the Blu-ray device needs to select an appropriate process and an appropriate HDMI/HDCP pair per BD (content) and per connected display device (TV). Furthermore, when graphics are synthesized with a video, it is also necessary to change the process according to the BD type and the connected display device (TV) type.
- Therefore, graphic streams are limited, and the types of combinations of video streams and graphic streams are reduced.
- That is, by limiting dual-stream disks and graphic streams, the number of combinations of complex processes in a Blu-ray device is substantially reduced.
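- The per-disc, per-TV selection described above can be pictured with a small decision sketch. The capability flags and the pairing of output paths with HDMI/HDCP versions below are assumptions made for illustration; an actual Blu-ray device's decision table is not given in this text.

```python
def select_output(disc, tv):
    """disc, tv: dicts with boolean '4k' and 'hdr' capability flags (assumed)."""
    if disc['4k'] and disc['hdr'] and tv['4k'] and tv['hdr']:
        return 'pass through 4K HDR (e.g. HDMI2.1/HDCP2.2)'
    if disc['hdr'] and not tv['hdr']:
        # HDR disc on an SDR TV: convert before output (e.g. HDR -> pseudo HDR).
        return 'convert HDR before output'
    if disc['4k'] and not tv['4k']:
        return 'down-convert 4K to 2K before output'
    return 'pass through as-is'

print(select_output({'4k': True, 'hdr': True}, {'4k': False, 'hdr': False}))
```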
- Instead of converting an HDR video image into an SDR video image of 100 nit or less, an “HDR→pseudo HDR conversion process” is realized which converts the HDR video image into a pseudo HDR video image that is close to the original HDR and keeps, to some degree, the gradation of the area exceeding 100 nit, by using the fact that the peak luminance of the SDR TV which displays the image exceeds 100 nit (generally, 200 nit or more), thereby enabling the SDR TV to display the HDR video image.
- The converting method of the “HDR→pseudo HDR conversion process” may be switched according to display characteristics (a maximum luminance, input/output characteristics and a display mode) of the SDR TV.
- a method for obtaining display characteristics information includes (1) automatically obtaining the display characteristics information via HDMI (registered trademark) or a network, (2) generating the display characteristics information by having a user input information such as a manufacturer name or a model and (3) obtaining the display characteristics information from a cloud or the like by using information of the manufacturer name or the model.
- A timing at which converting device 100 obtains the display characteristics information includes (1) immediately before pseudo HDR conversion, and (2) when connection with display device 200 (e.g. an SDR TV) is established for the first time.
- the converting method may be switched according to HDR video image luminance information (CAL (Content Average Luminance) and CPL (Content Peak Luminance)).
- a method for obtaining the HDR video image luminance information of converting device 100 includes (1) obtaining the HDR video image luminance information as meta information accompanying an HDR video image, (2) obtaining the HDR video image luminance information by having the user input title information of content and (3) obtaining the HDR video image luminance information by using input information input by the user, from a cloud or the like.
- details of the converting method include (1) performing conversion such that a luminance does not exceed a DPL (Display Peak Luminance), (2) performing conversion such that CPL is DPL, (3) not changing a CAL and a luminance around the CAL, (4) performing conversion by using a natural logarithm and (5) performing a clip process by using the DPL.
- According to the converting method, it is also possible to transmit, to display device 200, display settings such as a display mode and a display parameter which the SDR TV should switch to, in order to enhance the pseudo HDR effect. For example, a message which encourages the user to make such display settings may be displayed on a screen.
- FIG. 34A is a view illustrating an example of a display process of converting an HDR signal in an HDR TV and displaying an HDR.
- FIG. 34B is a view illustrating an example of a display process of displaying an HDR by using an HDR supporting playback device and an SDR TV.
- the SDR TV does not have means for receiving a direct input of such a signal from an outside, and therefore cannot realize the same effect as that of the HDR TV.
- FIG. 34C is a view illustrating an example of a display process of displaying an HDR by using an HDR supporting playback device and an SDR TV which are connected with each other via a standard interface.
- An HDR supporting Blu-ray device generates a signal (pseudo HDR signal) which can cancel the “SDR EOTF conversion” and the “luminance conversion of each mode” that the signal passes through immediately after the input interface of the SDR TV.
- the HDR supporting Blu-ray device can realize in a pseudo manner the same effect as that obtained when a signal obtained immediately after “luminance conversion” is input to the “display device” (a broken line arrow in FIG. 34C ).
- An input signal of a normal SDR TV is defined up to 100 nit, yet the SDR TV has a capability of expressing video images of 200 nit or more according to the viewing environment (a dark room: a cinema mode, and a bright room: a dynamic mode).
- However, the luminance upper limit of an input signal to the SDR TV has been determined as 100 nit, and therefore it has not been possible to directly use this capability.
- When the SDR TV displays HDR video images, the fact that the peak luminance of the SDR TV which displays the HDR video images exceeds 100 nit (generally 200 nit or more) is used.
- The “HDR→pseudo HDR conversion process” is performed to keep, to some degree, a gradation of the luminance range exceeding 100 nit. Consequently, the SDR TV can display pseudo HDR video images close to the original HDR.
- When this “HDR→pseudo HDR conversion process” technique is applied to Blu-ray, as illustrated in FIG. 35 , only HDR signals are stored in an HDR disk.
- When an SDR TV is connected to the Blu-ray device, the Blu-ray device performs the “HDR→pseudo HDR conversion process”, converts the HDR signal into a pseudo HDR signal and outputs the pseudo HDR signal to the SDR TV. Consequently, by converting the received pseudo HDR signal into a luminance value, the SDR TV can display video images having a pseudo HDR effect.
- When an HDR supporting BD and an HDR supporting Blu-ray device are prepared, even an SDR TV can display pseudo HDR video images having higher quality than that of SDR video images.
- Although an HDR supporting TV would otherwise be necessary to view HDR video images, with this technique an existing SDR TV can display pseudo HDR video images which realize an HDR effect. Consequently, it can be expected that HDR-supporting Blu-ray will spread.
- an HDR signal is converted into a pseudo HDR signal. Consequently, an existing SDR TV can display the HDR signal as a pseudo HDR video image.
- FIG. 36A is a view illustrating an example of the EOTF (Electro-Optical Transfer Function) which supports the HDR and the SDR, respectively.
- The EOTF is what is generally called a gamma curve, indicates a correspondence between code values and luminance values, and converts a code value into a luminance value. That is, the EOTF is association information indicating a correspondence relationship between a plurality of code values and luminance values.
- FIG. 36B is a view illustrating an example of an inverse EOTF which supports the HDR and the SDR, respectively.
- The inverse EOTF indicates a correspondence between luminance values and code values, and, contrary to the EOTF, quantizes a luminance value and converts it into a code value. That is, the inverse EOTF is association information indicating a correspondence relationship between luminance values and a plurality of code values.
- For example, when a luminance value of an HDR supporting video image is expressed by a 10-bit gradation code value, luminance values in the HDR luminance range up to 10,000 nit are quantized and mapped onto the 1024 integer values from 0 to 1023.
- That is, the luminance value in the luminance range up to 10,000 nit (the luminance value of the HDR supporting video image) is quantized based on the inverse EOTF and is thereby converted into an HDR signal of 10-bit code values.
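- Since SMPTE 2084 is named below as an example of an HDR EOTF, the following sketch quantizes a nit-domain luminance into a 10-bit code value (0 to 1023) with the ST 2084 curve. Using the published ST 2084 constants and a full-range 10-bit mapping is an assumption; the text itself does not list them.

```python
# ST 2084 (PQ) constants -- assumed from the published standard, not quoted in
# this text.
M1, M2 = 2610 / 16384, 2523 / 4096 * 128
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_inverse_eotf(luminance_nit):
    """Quantize a luminance value (0..10,000 nit) into a 10-bit HDR code value."""
    y = max(0.0, min(luminance_nit, 10000.0)) / 10000.0
    e = ((C1 + C2 * y ** M1) / (1.0 + C3 * y ** M1)) ** M2
    return round(e * 1023)

def pq_eotf(code_value):
    """Convert a 10-bit HDR code value back into a luminance value in nit."""
    p = (code_value / 1023.0) ** (1.0 / M2)
    y = (max(p - C1, 0.0) / (C2 - C3 * p)) ** (1.0 / M1)
    return 10000.0 * y

print(pq_inverse_eotf(100.0))  # about 520 (code value for 100 nit)
print(round(pq_eotf(1023)))    # 10000
```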
- An HDR supporting EOTF (referred to as a “HDR EOTF” below) or an HDR supporting inverse EOTF (referred to as a “HDR inverse EOTF” below) can express a higher luminance value than that of an SDR supporting EOTF (referred to as a “SDR EOTF” below) or an SDR supporting inverse EOTF (referred to as a “SDR inverse EOTF” below).
- In the example of FIGS. 36A and 36B, the maximum luminance value is 10,000 nit. That is, the HDR luminance range includes the entire SDR luminance range, and the HDR peak luminance is higher than the SDR peak luminance.
- An HDR luminance range is a luminance range obtained by expanding a maximum value from 100 nit which is a maximum value of the SDR luminance range to 10,000 nit.
- examples of the HDR EOTF and the HDR inverse EOTF include SMPTE 2084 standardized by Society of Motion Picture & Television Engineers (SMPTE).
- a luminance range from 0 nit to 100 nit which is a peak luminance illustrated in FIGS. 36A and 36B will be described as a first luminance range in some cases.
- a luminance range from 0 nit to 10,000 nit which is a peak luminance illustrated in FIGS. 36A and 36B will be described as a second luminance range in some cases.
- FIG. 37 is a block diagram illustrating a configuration of the converting device and the display device according to the exemplary embodiment.
- FIG. 38 is a flowchart illustrating a converting method and a display method performed by the converting device and the display device according to the exemplary embodiment.
- converting device 100 includes HDR EOTF converter 101 , luminance converter 102 , luminance inverse converter 103 and SDR inverse EOTF converter 104 .
- display device 200 includes display setting unit 201 , SDR EOTF converter 202 , luminance converter 203 and display 204 .
- Each component of converting device 100 and display device 200 will be described in detail during description of the converting method and the display method.
- the converting method performed by converting device 100 will be described with reference to FIG. 38 .
- the converting method includes step S 101 to step S 104 described below.
- HDR EOTF converter 101 of converting device 100 obtains an HDR video image for which HDR inverse EOTF conversion has been performed.
- HDR EOTF converter 101 of converting device 100 performs HDR EOTF conversion on an HDR signal of the obtained HDR video image (S 101 ).
- HDR EOTF converter 101 converts the obtained HDR signal into a linear signal indicating a luminance value.
- the HDR EOTF is, for example, SMPTE 2084.
- luminance converter 102 of converting device 100 performs first luminance conversion of converting the linear signal converted by HDR EOTF converter 101 by using display characteristics information and content luminance information (S 102 ).
- a luminance value corresponding to an HDR luminance range (referred to as a “HDR luminance value” below) is converted into a luminance value corresponding to a display luminance range (referred to as a “display luminance value” below). Details will be described below.
- HDR EOTF converter 101 functions as an obtaining unit which obtains an HDR signal as a first luminance signal indicating a code value obtained by quantizing the luminance value of a video image. Further, HDR EOTF converter 101 and luminance converter 102 function as converters which convert the code value indicated by the HDR signal obtained by the obtaining unit into a display luminance value corresponding to the display luminance range, based on the luminance range of the display (display device 200 ); the maximum value (DPL) of the display luminance range is smaller than the maximum value (HPL) of the HDR luminance range and larger than 100 nit.
- More specifically, by using the obtained HDR signal and the HDR EOTF, HDR EOTF converter 101 determines, for the HDR code value which is the first code value indicated by the obtained HDR signal, the HDR luminance value associated with that HDR code value by the HDR EOTF.
- Here, the HDR signal indicates HDR code values obtained by quantizing the luminance values of the video (content) by using the HDR inverse EOTF, which associates luminance values of the HDR luminance range with a plurality of HDR code values.
- In step S 102, luminance converter 102 determines, for the HDR luminance value determined in step S 101, a display luminance value which is associated in advance with the HDR luminance value and corresponds to the display luminance range. That is, luminance converter 102 performs first luminance conversion of converting the HDR luminance value corresponding to the HDR luminance range into the display luminance value corresponding to the display luminance range.
- Converting device 100 obtains content luminance information including at least one of a maximum luminance value (CPL: Content Peak Luminance) of the video image (content) and an average luminance value (CAL: Content Average Luminance) of the video image, as information related to the HDR signal.
- The CPL (first maximum luminance value) is, for example, a maximum value among luminance values of a plurality of images configuring the HDR video image.
- The CAL is, for example, an average luminance value which is an average of luminance values of a plurality of images configuring the HDR video image.
- converting device 100 obtains display characteristics information of display device 200 from display device 200 .
- The display characteristics information is information indicating display characteristics of display device 200, such as a maximum value (DPL) of a luminance which can be displayed by display device 200, a display mode (described below) of display device 200, and input/output characteristics (the EOTF supported by the display device).
- converting device 100 may transmit recommended display setting information (which will be described below and also referred to as “setting information” below) to display device 200 .
- luminance inverse converter 103 of converting device 100 performs luminance inverse conversion matching a display mode of display device 200 . Consequently, luminance inverse converter 103 performs second luminance conversion of converting a luminance value corresponding to the display luminance range into a luminance value corresponding to an SDR luminance range (0 to 100 [nit]) (S 103 ). Details will be described below.
- More specifically, luminance inverse converter 103 determines, for the display luminance value obtained in step S 102, an SDR supporting luminance value (referred to as an “SDR luminance value” below) which is associated in advance with the display luminance value, as a third luminance value corresponding to the SDR luminance range whose maximum value is 100 nit. Luminance inverse converter 103 thereby performs second luminance conversion of converting the display luminance value corresponding to the display luminance range into the SDR luminance value corresponding to the SDR luminance range.
- SDR inverse EOTF converter 104 of converting device 100 generates a pseudo HDR video image by performing SDR inverse EOTF conversion (S 104 ). That is, SDR inverse EOTF converter 104 quantizes the determined SDR luminance value by using an inverse EOTF (Electro-Optical Transfer Function) of the SDR (Standard Dynamic Range), which is third association information associating luminance values of an HDR luminance range with a plurality of third code values, determines the third code value obtained by the quantization, and converts the SDR luminance value corresponding to the SDR luminance range into an SDR signal as the third luminance signal indicating the third code value, thereby generating a pseudo HDR signal.
- the third code value is a code value supporting the SDR, and will be referred to as a “SDR code value” below. That is, an SDR signal is expressed by an SDR code value obtained by quantizing a luminance value of a video image by using an SDR inverse EOTF of associating luminance values of an SDR luminance range and a plurality of SDR code values. Further, converting device 100 outputs a pseudo HDR signal (SDR signal) generated in step S 104 to display device 200 .
- Converting device 100 generates an SDR luminance value corresponding to the pseudo HDR by performing the first luminance conversion and the second luminance conversion on the HDR luminance value obtained by inversely quantizing the HDR signal, and generates an SDR signal corresponding to the pseudo HDR by quantizing the SDR luminance value by using the SDR inverse EOTF.
- the SDR luminance value is a numerical value in a luminance range of 0 to 100 nit corresponding to the SDR.
- the SDR luminance value is converted based on the display luminance range, and therefore takes a numerical value which is obtained by performing luminance conversion on an HDR luminance value by using an HDR EOTF and an SDR EOTF and which is different from the luminance value in the luminance range of 0 to 100 nit corresponding to the SDR.
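- The chain of steps S 101 to S 104 can be summarized structurally as below. The four stage callables are stand-ins for the conversions defined in this description; the placeholder lambdas in the example (linear scalings, a simple clip at an assumed DPL of 750 nit, a gamma-2.4 style quantization) are not the actual curves.

```python
def convert_to_pseudo_hdr(hdr_code, hdr_eotf, first_conv, second_conv,
                          sdr_inverse_eotf):
    """Chain S101 -> S102 -> S103 -> S104 for a single sample."""
    hdr_nit = hdr_eotf(hdr_code)        # S101: HDR code value -> HDR luminance
    display_nit = first_conv(hdr_nit)   # S102: HDR luminance -> display luminance
    sdr_nit = second_conv(display_nit)  # S103: display luminance -> SDR luminance
    return sdr_inverse_eotf(sdr_nit)    # S104: SDR luminance -> pseudo HDR code

# Example with trivial placeholder stages (not the real conversion curves).
out = convert_to_pseudo_hdr(
    512,
    hdr_eotf=lambda c: c / 1023 * 10000,       # placeholder linear "EOTF"
    first_conv=lambda y: min(y, 750.0),         # placeholder clip at DPL = 750 nit
    second_conv=lambda y: y * 100.0 / 750.0,    # placeholder normal-mode inverse
    sdr_inverse_eotf=lambda y: round((y / 100.0) ** (1 / 2.4) * 255))
print(out)  # a pseudo HDR code value for this toy configuration
```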
- the display method performed by display device 200 will be described with reference to FIG. 38 .
- the display method includes step S 105 to step S 108 described below.
- display setting unit 201 of display device 200 sets display settings of display device 200 by using setting information obtained from converting device 100 (S 105 ).
- display device 200 is an SDR TV.
- The setting information is information indicating display settings which are recommended for the display device, i.e., information indicating how the EOTF should be applied to the pseudo HDR video image and what settings allow beautiful video images to be displayed (in other words, information for switching the display settings of display device 200 to optimal display settings).
- the setting information includes, for example, gamma curve characteristics during an output of the display device, a display mode such as a living mode (normal mode) or a dynamic mode, or a numerical value of a backlight (brightness).
- Display device 200 is also referred to as an “SDR display” below.
- In step S 105, display device 200 obtains an SDR signal (pseudo HDR signal), and setting information indicating display settings which are recommended for display device 200 to display video images.
- Display device 200 only needs to obtain the SDR signal (pseudo HDR signal) before step S 106, and may obtain the SDR signal after step S 105.
- SDR EOTF converter 202 of display device 200 performs SDR EOTF conversion on the obtained pseudo HDR signal (S 106 ). That is, SDR EOTF converter 202 inversely quantizes the SDR signal (pseudo HDR signal) by using an SDR EOTF. Thus, SDR EOTF converter 202 converts an SDR code value indicated by an SDR signal into the SDR luminance value.
- luminance converter 203 of display device 200 performs luminance conversion according to the display mode set to display device 200 . Consequently, luminance converter 203 performs third luminance conversion of converting an SDR luminance value corresponding to an SDR luminance range (0 to 100 [nit]) into a display luminance value corresponding to the display luminance range (0 to DPL [nit]) (S 107 ). Details will be described below.
- In step S 106 and step S 107, display device 200 converts the third code value indicated by the obtained SDR signal (pseudo HDR signal) into a display luminance value corresponding to the display luminance range (0 to DPL [nit]) by using the setting information obtained in step S 105.
- More specifically, in step S 106, the SDR code value indicated by the obtained SDR signal is converted into an SDR luminance value by determining, by using the SDR EOTF which associates luminance values in the SDR luminance range with a plurality of third code values, the SDR luminance value associated with that SDR code value.
- Then, in step S 107, the SDR signal is converted into a display luminance value by determining a display luminance value which is associated in advance with the determined SDR luminance value and corresponds to the display luminance range, that is, by performing third luminance conversion of converting the SDR luminance value corresponding to the SDR luminance range into the display luminance value corresponding to the display luminance range.
- display 204 of display device 200 displays pseudo HDR video images on display device 200 based on the converted display luminance value (S 108 ).
- FIG. 39A is a view for explaining an example of the first luminance conversion.
- Luminance converter 102 of converting device 100 performs the first luminance conversion of converting the linear signal (HDR luminance value) obtained in step S 101 by using display characteristics information and content luminance information of HDR video images.
- In the first luminance conversion, the HDR luminance value is the input luminance value, and the display luminance value is the output luminance value.
- the DPL is determined by using a maximum luminance and a display mode of an SDR display which are the display characteristics information.
- the display mode is, for example, mode information such as a theater mode of displaying video images darkly on the SDR display, and a dynamic mode of displaying video images brightly.
- In this example, the DPL is 750 nit.
- the DPL (second maximum luminance value) is a luminance maximum value which can be displayed by a display mode currently set to the SDR display. That is, according to the first luminance conversion, the DPL which is the second maximum luminance value is determined by using display characteristics information which is information indicating display characteristics of the SDR display.
- In the first luminance conversion, each luminance value equal to or less than the CAL is regarded as the same before and after conversion, and only each luminance value equal to or more than the CPL is changed. That is, as illustrated in FIG. 39A , when the HDR luminance value is the CAL or less, the HDR luminance value is not converted and is determined as the display luminance value as is, and when the HDR luminance value is the CPL or more, the DPL which is the second maximum luminance value is determined as the display luminance value.
- That is, in the first luminance conversion, the peak luminance (CPL) in the luminance information of the HDR video image is used, and the DPL is determined as the display luminance value when the HDR luminance value is the CPL.
- Alternatively, the linear signal (HDR luminance value) obtained in step S 101 may be converted so as to be clipped to a value which does not exceed the DPL.
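- A minimal sketch of the first luminance conversion described above: values at or below the CAL are preserved, the CPL and anything above it map to the DPL, and the output is clipped so it never exceeds the DPL. Linear interpolation between the CAL and the CPL is an assumption; the text does not specify the exact curve between those two points.

```python
def first_luminance_conversion(hdr_nit, cal, cpl, dpl):
    """Map an HDR luminance value (nit) to a display luminance value (nit)."""
    if hdr_nit <= cal:
        return hdr_nit               # luminances up to CAL are left unchanged
    if hdr_nit >= cpl:
        return dpl                   # CPL and above are mapped to DPL (clip)
    # Assumed linear segment between the CAL and CPL knee points.
    t = (hdr_nit - cal) / (cpl - cal)
    return cal + t * (dpl - cal)

# Example with CAL = 300 nit, CPL = 2,000 nit, DPL = 750 nit.
for y in (100, 300, 1000, 2000, 5000):
    print(y, '->', round(first_luminance_conversion(y, 300, 2000, 750), 1))
```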
- FIG. 40 is a view for explaining the second luminance conversion.
- Luminance inverse converter 103 of converting device 100 performs luminance inverse conversion corresponding to a display mode, on the display luminance value of the display luminance range (0 to DPL [nit]) converted by the first luminance conversion in step S 102 .
- the luminance inverse conversion is a process of making it possible to obtain the display luminance value of the display luminance range (0 to DPL [nit]) after the process in step S 102 when the SDR display performs luminance conversion process (step S 107 ) corresponding to the display mode. That is, the second luminance conversion is luminance inverse conversion of the third luminance conversion.
- In the second luminance conversion, the display luminance value (input luminance value) of the display luminance range is converted into an SDR luminance value (output luminance value) of the SDR luminance range.
- a converting method is switched according to a display mode of the SDR display.
- For example, when the display mode of the SDR display is the normal mode, a luminance is converted into a value which is directly proportional to the display luminance value.
- When the display mode is the dynamic mode, an inverse function of the corresponding third luminance conversion (described below) is used to convert the SDR luminance value of each low luminance pixel into a value higher than the directly proportional value, and to convert the SDR luminance value of each high luminance pixel into a value lower than the directly proportional value.
- a luminance value associated with the display luminance value is determined as an SDR luminance value by using luminance association information corresponding to display characteristics information which is information indicating display characteristics of the SDR display, and luminance conversion process is switched according to the display characteristics information.
- the luminance association information corresponding to the display characteristics information is information which is defined per display parameter (display mode) of the SDR display as illustrated in, for example, FIG. 40 and which associates a display luminance value (input luminance value) and an SDR luminance value (output luminance value).
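- The mode dependence and the inverse relationship between the second and third luminance conversions can be sketched as follows. Modelling the dynamic-mode response as a smoothstep curve is an assumption made only to show a pair of conversions that are exact inverses of each other; the real per-mode curves are given by the luminance association information.

```python
import math

def _s_curve(x):
    """Smoothstep on [0, 1]: lower values become darker and higher values
    brighter than a directly proportional mapping (assumed dynamic-mode shape)."""
    return 3 * x * x - 2 * x ** 3

def _s_curve_inverse(y):
    """Closed-form inverse of the smoothstep on [0, 1]."""
    return 0.5 - math.sin(math.asin(1.0 - 2.0 * y) / 3.0)

def third_luminance_conversion(sdr_nit, dpl, mode):
    """Display side (step S107): SDR luminance (0..100 nit) -> display luminance."""
    x = sdr_nit / 100.0
    y = x if mode == 'normal' else _s_curve(x)
    return y * dpl

def second_luminance_conversion(display_nit, dpl, mode):
    """Converting device side (step S103): display luminance -> SDR luminance,
    the exact inverse of third_luminance_conversion for the same mode."""
    y = display_nit / dpl
    x = y if mode == 'normal' else _s_curve_inverse(y)
    return x * 100.0

# Round trip: the converting device's output followed by the display's own
# conversion reproduces the intended display luminance value.
sdr = second_luminance_conversion(600.0, 750.0, 'dynamic')
print(round(third_luminance_conversion(sdr, 750.0, 'dynamic'), 6))  # 600.0
```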
- FIG. 41 is a view for explaining the third luminance conversion.
- Luminance converter 203 of display device 200 converts an SDR luminance value of the SDR luminance range (0 to 100 [nit]) into a display luminance value of the display luminance range (0 to DPL [nit]) according to the display mode set in step S 105. This process is performed so as to realize an inverse function of the luminance inverse conversion of each mode in step S 103.
- a converting method is switched according to a display mode of the SDR display.
- For example, when the display mode of the SDR display is the normal mode (i.e., the set display parameter is a parameter supporting the normal mode), luminance conversion is performed such that the display luminance value is directly proportional to the SDR luminance value.
- In a case where the display mode of the SDR display is the dynamic mode, which makes high luminance pixels brighter and low luminance pixels darker than those of the normal mode, luminance conversion is performed to convert the SDR luminance value of each low luminance pixel into a display luminance value lower than the directly proportional value, and to convert the SDR luminance value of each high luminance pixel into a display luminance value higher than the directly proportional value.
- a luminance value associated in advance with the SDR luminance value is determined as a display luminance value by using luminance association information corresponding to a display parameter which indicates display settings of the SDR display, and luminance conversion process is switched according to the display parameter.
- the luminance association information corresponding to the display parameter is information which is defined per display parameter (display mode) of the SDR display as illustrated in, for example, FIG. 41 and which associates an SDR luminance value (input luminance value) and a display luminance value (output luminance value).
- FIG. 42 is a flowchart illustrating detailed process of the display settings.
- In step S 105, display setting unit 201 of the SDR display performs the following process of step S 201 to step S 208.
- display setting unit 201 determines whether or not an EOTF (SDR display EOTF) set to the SDR display matches an EOTF assumed during generation of a pseudo HDR video image (SDR signal) by using setting information (S 201 ).
- display setting unit 201 determines whether or not a system side can switch the SDR display EOTF (S 202 ).
- display setting unit 201 switches the SDR display EOTF to an appropriate EOTF (S 203 ).
- Thus, in step S 201 to step S 203, while the display settings are set (S 105 ), the EOTF set to the SDR display is set to a recommended EOTF corresponding to the obtained setting information. Consequently, in step S 106 performed after step S 105, it is possible to determine the SDR luminance value by using the recommended EOTF.
- When it is not possible to switch the SDR display EOTF, display setting unit 201 displays on the screen a message which encourages the user to change the EOTF by a manual operation (S 204 ). For example, display setting unit 201 displays on the screen a message such as “Please set a display gamma to 2.4”. That is, while the display settings are set (S 105 ), when it is not possible to switch the EOTF (SDR display EOTF) set to the SDR display, display setting unit 201 displays on the SDR display a message which encourages the user to switch the EOTF set to the SDR display to a recommended EOTF.
- Next, before displaying the pseudo HDR video image (SDR signal), the SDR display uses the setting information to determine whether or not a display parameter of the SDR display matches the setting information (S 205 ).
- display setting unit 201 determines whether or not it is possible to switch the display parameter of the SDR display (S 206 ).
- display setting unit 201 switches the display parameter of the SDR display according to the setting information (S 207 ).
- Thus, in step S 204 to step S 207, while the display settings are set (S 105 ), the display parameter set to the SDR display is set to a recommended display parameter corresponding to the obtained setting information.
- When it is not possible to switch the display parameter, display setting unit 201 displays on the screen a message which encourages the user to change the display parameter set to the SDR display by a manual operation (S 208 ). For example, display setting unit 201 displays on the screen a message such as “Please set a display mode to a dynamic mode and maximize a backlight”. That is, during the setting (S 105 ), when it is not possible to switch the display parameter set to the SDR display, display setting unit 201 displays on the SDR display a message which encourages the user to switch the display parameter set to the SDR display to a recommended display parameter.
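- The flow of steps S 201 to S 208 can be sketched as below. The display object, its capability flags and the setting-information keys are hypothetical; the sketch only mirrors the decisions described above (switch automatically when possible, otherwise show a message asking for a manual change).

```python
def apply_display_settings(display, setting_info):
    """Mirror of FIG. 42: adjust EOTF and display parameters, or prompt the user."""
    # S201-S204: EOTF check.
    if display['eotf'] != setting_info['recommended_eotf']:          # S201
        if display['eotf_switchable']:                                # S202
            display['eotf'] = setting_info['recommended_eotf']        # S203
        else:
            print('Please set a display gamma to 2.4')                # S204
    # S205-S208: display parameter (mode, backlight, ...) check.
    if display['params'] != setting_info['recommended_params']:      # S205
        if display['params_switchable']:                              # S206
            display['params'] = setting_info['recommended_params']    # S207
        else:
            print('Please set a display mode to a dynamic mode '
                  'and maximize a backlight')                         # S208
    return display

tv = {'eotf': 'gamma2.2', 'eotf_switchable': True,
      'params': {'mode': 'living'}, 'params_switchable': False}
rec = {'recommended_eotf': 'gamma2.4',
       'recommended_params': {'mode': 'dynamic', 'backlight': 'max'}}
print(apply_display_settings(tv, rec))
```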
- the exemplary embodiment has been described as an exemplary technique disclosed in this application.
- the technique according to the present disclosure is not limited to this, and is applicable to a first exemplary embodiment, too, for which changes, replacement, addition and omission have been optionally carried out.
- An HDR video image is, for example, a video image from a Blu-ray Disc, a DVD, a video distribution website on the Internet, a broadcast or an HDD.
- Converting device 100 may be provided in a disk player, a disk recorder, a set-top box, a television, a personal computer or a smartphone. Converting device 100 may be provided in a server device on the Internet.
- Display device 200 is, for example, a television, a personal computer or a smartphone.
- Display characteristics information obtained by converting device 100 may be obtained from display device 200 via a HDMI (registered trademark) cable or a LAN cable by using HDMI (registered trademark) or another communication protocol.
- Alternatively, as the display characteristics information obtained by converting device 100, display characteristics information included in model information of display device 200 may be obtained via the Internet. Further, the user may perform a manual operation to set the display characteristics information in converting device 100.
- the display characteristics information may be obtained by converting device 100 immediately before generation of a pseudo HDR video image (steps S 101 to S 104 ) or at a timing at which default settings of a device are made or at which a display is connected.
- the display characteristics information may be obtained immediately before conversion into a display luminance value or at a timing at which converting device 100 is connected to display device 200 for the first time by a HDMI (registered trademark) cable.
- a CPL and a CAL of an HDR video image may be provided per content or may be provided per scene. That is, according to the converting method, luminance information (a CPL and a CAL) which corresponds to each of a plurality of scenes of a video image, and which includes per scene at least one of a first maximum luminance value which is a maximum value among luminance values for a plurality of images configuring each scene, and an average luminance value which is an average of luminance values for a plurality of images configuring each scene may be obtained. According to first luminance conversion, for each of a plurality of scenes, a display luminance value may be determined according to luminance information corresponding to each scene.
- the CPL and the CAL may be packaged in the same medium (a Blu-ray Disc, a DVD or the like) as that of HDR video images or may be obtained from a location different from HDR video images, for example, by converting device 100 from the Internet. That is, luminance information including at least one of the CPL and the CAL may be obtained as meta information of a video image, or may be obtained via a network.
- a fixed value may be used without using a CPL, a CAL and a display peak luminance (DPL). Furthermore, this fixed value may be changed from an outside. Still further, a CPL, a CAL and a DPL may be switched between a plurality of types. For example, the DPL may be switched between only three types of 200 nit, 400 nit and 800 nit, or may take a value which is the closest to display characteristics information.
- an HDR EOTF may not be SMPTE 2084, and another type of an HDR EOTF may be used.
- a maximum luminance (HPL) of an HDR video image may not be 10,000 nit and may be, for example, 4,000 nit or 1,000 nit.
- bit widths of code values may be, for example, 16, 14, 12, 10 and 8 bits.
- SDR inverse EOTF conversion is determined based on the display characteristics information; however, a fixed conversion function (which can also be changed from an outside) may be used.
- a function defined by, for example, Rec. ITU-R BT.1886 may be used.
- types of SDR inverse EOTF conversion may be narrowed down to several types, and a type which is the closest to input/output characteristics of display device 200 may be selected and used.
- a fixed mode may be used for a display mode, and the display mode may not be included in display characteristics information.
- converting device 100 may not transmit setting information, and display device 200 may adopt fixed display settings and may not change the display settings.
- display setting unit 201 is unnecessary.
- The setting information may be flag information indicating a pseudo HDR video image or a non-pseudo HDR video image; in case of a pseudo HDR video image, the display settings may be changed to, for example, settings for displaying the pseudo HDR video image most brightly. That is, while the display settings are set (S 105 ), when the obtained setting information indicates a signal of a pseudo HDR video image converted by using a DPL, brightness settings of display device 200 may be switched to settings for displaying the pseudo HDR video image most brightly.
- In the first luminance conversion, the display luminance value corresponding to an HDR luminance value may also be determined by using a natural logarithm. In that case, L represents a luminance value normalized to 0 to 1, V represents a converted luminance value normalized to 0 to 1, and S1, S2, a, b and M are values set based on a CAL, a CPL and a DPL.
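- The logarithmic formula itself does not appear in this text, so the following is only a hypothetical logarithmic mapping consistent with the stated normalization (L and V in 0 to 1) and with the idea that the CPL maps to the DPL; the parameter choices below are assumptions, and the S1, S2, a, b and M parameters of the text are not reproduced.

```python
import math

def log_tone_curve(L, cpl, dpl, hpl=10000.0):
    """L: HDR luminance normalized to 0..1 (1.0 corresponds to HPL nit).
    Returns V, a converted luminance normalized to 0..1 (1.0 corresponds to DPL).
    Hypothetical parameter choices: the steepness is derived from HPL/DPL and
    the curve is normalized so that the content peak (CPL) maps exactly to DPL."""
    a = hpl / dpl                  # assumed steepness parameter
    l_peak = cpl / hpl             # normalized content peak luminance
    V = math.log(1.0 + a * L) / math.log(1.0 + a * l_peak)
    return min(V, 1.0)             # clip so the output never exceeds the DPL

print(round(log_tone_curve(0.1, cpl=2000.0, dpl=750.0), 3))  # 1,000 nit input
```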
- By converting each HDR video image by using information such as a content peak luminance or a content average luminance of the HDR video image, it is possible to change the converting method according to the content and convert the HDR video image while keeping the HDR gradation as much as possible. Further, it is possible to suppress the negative influence of the HDR video image becoming too dark or too bright. More specifically, by mapping the content peak luminance of the HDR video image onto the display peak luminance, the gradation is kept as much as possible. Further, each pixel value equal to or less than the average luminance is not changed, to prevent the overall brightness from changing.
- Further, by using the peak luminance value and the display mode of the SDR display, it is possible to change the converting method according to the display environment of the SDR display, and to display video images (pseudo HDR video images) having HDR quality at the same gradation or brightness as the original HDR video image according to the capability of the SDR display. More specifically, the display peak luminance is determined based on the maximum luminance and the display mode of the SDR display, and each HDR video image is converted so as not to exceed that peak luminance value. Consequently, the HDR video image is displayed without substantially decreasing its gradation up to the brightness which the SDR display can display, and luminance values of brightnesses which cannot be displayed are lowered to a brightness which can be displayed.
- For example, each HDR video image is converted into a pseudo HDR video image whose peak luminance is suppressed to 1,000 nit so as to keep the overall brightness, and the luminance value is changed according to the display mode of the display.
- Therefore, the luminance converting method is changed according to the display mode of the display. If a luminance higher than the peak luminance of the display were permitted in a pseudo HDR video image, that high luminance would in some cases be replaced with the peak luminance of the display side when displayed.
- In that case, the pseudo HDR video image becomes entirely darker than the original HDR video image.
- Conversely, if the conversion assumes a peak luminance lower than the peak luminance of the display side, this low luminance is replaced with the peak luminance of the display side, and therefore the pseudo HDR video image becomes entirely brighter than the original HDR video image.
- Further, since the luminance is lower than the peak luminance of the display side, the capability of the display related to gradation is not used to the maximum.
- the display side can display each pseudo HDR video image better by switching display settings by using setting information.
- For example, when the brightness of the display is set dark, a high luminance cannot be displayed, and therefore the HDR quality is undermined.
- In such a case, the display settings are changed, or a message which encourages a change of the display settings is displayed, so that the display capability is exhibited and high gradation video images are displayed.
- the playback method and the playback device according to one or a plurality of aspects of the present disclosure have been described based on the exemplary embodiment.
- the present disclosure is not limited to this exemplary embodiment.
- The scope of one or a plurality of exemplary embodiments of the present disclosure may include exemplary embodiments obtained by applying, to the present exemplary embodiment, various modifications that one of ordinary skill in the art conceives, and exemplary embodiments obtained by combining components according to different exemplary embodiments, without departing from the spirit of the present disclosure.
- each component may be configured by dedicated hardware such as a circuit or may be realized by executing a software program suitable to each component.
- Each component may be realized by causing a program executing unit such as a CPU or a processor to read a software program recorded on a recording medium such as a hard disk or a semiconductor memory and execute the software program.
- the present disclosure is applicable to content data generating devices, video stream transmitting devices such as Blu-ray devices or video display devices such as televisions.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Databases & Information Systems (AREA)
- Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
- Television Signal Processing For Recording (AREA)
- Controls And Circuits For Display Device (AREA)
Abstract
A transmitting method according to one aspect of the present disclosure is a transmitting method of a playback device, and includes: when a version of a transmission protocol is a first version, transmitting first meta data to a display device without transmitting second meta data to the display device, the transmission protocol being used to transmit a signal between the playback device and the display device, the first meta data including information that is commonly used for a plurality of images included in a continuous playback unit of a first video signal and relates to a luminance range of the first video signal, and the second meta data including information that is commonly used for a unit subdivided compared to the continuous playback unit of the first video signal and relates to the luminance range of the first video signal; and when the version of the transmission protocol is a second version, transmitting the first meta data and the second meta data to the display device.
Description
- The present disclosure relates to a transmitting method, a playback method and a playback device.
- Conventionally, image signal processing devices which improve luminance levels which can be displayed are disclosed (see, for example, Unexamined Japanese Patent Publication No. 2008-167418).
- In one general aspect, the techniques disclosed here feature a method used by a playback device, including: when a version of a transmission protocol is a first version, transmitting first meta data to a display device without transmitting second meta data to the display device, the transmission protocol being used to transmit a signal between the playback device and the display device, the first meta data including information that is commonly used for a plurality of images included in a continuous playback unit of a first video signal and relates to a luminance range of the first video signal, the second meta data including information that is commonly used for a unit subdivided compared to the continuous playback unit of the first video signal and relates to the luminance range of the first video signal; and when the version of the transmission protocol is a second version, transmitting the first meta data and the second meta data to the display device.
- Additional benefits and advantages of the disclosed embodiments will become apparent from the specification and drawings. The benefits and/or advantages may be individually obtained by the various embodiments and features of the specification and drawings, which need not all be provided in order to obtain one or more of such benefits and/or advantages.
- It should be noted that general or specific embodiments may be implemented as a system, a method, an integrated circuit, a computer program, a storage medium, or any selective combination thereof.
- FIG. 1 is a view for explaining development of a video technology;
- FIG. 2 is a view for explaining an HDR (High-dynamic-range imaging) position;
- FIG. 3 is a view illustrating an image example indicating an HDR effect;
- FIG. 4 is a view for explaining a relationship between masters, distribution methods and display devices in case of introduction of the HDR;
- FIG. 5 is an explanatory view of a method for determining a code value of a luminance signal to be stored in content, and a process for restoring a luminance value from a code value during playback;
- FIG. 6 is a view illustrating an example indicating HDR meta data;
- FIG. 7 is a view illustrating a storage example of static HDR meta data;
- FIG. 8 is a view illustrating a storage example of dynamic HDR meta data;
- FIG. 9 is a view illustrating a storage example of dynamic HDR meta data;
- FIG. 10 is a flowchart of a method for transmitting static HDR meta data;
- FIG. 11 is a flowchart of a method for processing HDR meta data;
- FIG. 12 is a block diagram illustrating a configuration of a data output device;
- FIG. 13 is a view illustrating a data structure example of a SEI (Supplemental Enhancement Information) message in which HDR meta data is stored;
- FIG. 14 is a view illustrating a data structure example of a SEI message in which HDR meta data is stored;
- FIG. 15 is a view illustrating a data structure example of a SEI message in which HDR meta data is stored;
- FIG. 16 is a block diagram illustrating a configuration example of a data output device;
- FIG. 17 is a block diagram illustrating a configuration example of a DR (Dynamic Range) converter;
- FIG. 18 is a block diagram illustrating a configuration example of the DR converter;
- FIG. 19 is a view illustrating an example of instruction contents of an HDR meta interpreter;
- FIG. 20 is a view illustrating an example of instruction contents of the HDR meta interpreter;
- FIG. 21 is a view illustrating an example of instruction contents of the HDR meta interpreter;
- FIG. 22 is a block diagram illustrating a configuration example of the data output device;
- FIG. 23 is a view illustrating a combination example of characteristics of a video signal and the display device, and an output signal of the data output device;
- FIG. 24 is a view illustrating an example of an operation model of playing back various signals and outputting the signals to various TVs;
- FIG. 25 is a view illustrating a storage example of static HDR meta data and dynamic HDR meta data;
- FIG. 26 is a view illustrating an example of a method for displaying a user guidance;
- FIG. 27 is a view illustrating an example of the method for displaying the user guidance;
- FIG. 28 is a view illustrating an example of the method for displaying the user guidance;
- FIG. 29 is a view illustrating an example of the method for displaying the user guidance;
- FIG. 30 is a flowchart of a method for transmitting dynamic HDR meta data which depends on a HDMI (High-Definition Multimedia Interface) (registered trademark and the same applies likewise below) version;
- FIG. 31 is a flowchart of a method for transmitting static HDR meta data which depends on a HDMI version;
- FIG. 32 is a flowchart of a method for controlling a luminance value during playback of an HDR signal;
- FIG. 33 is a view for explaining a dual disk playback operation;
- FIG. 34A is a view illustrating an example of a display process of converting an HDR signal in an HDR TV and displaying an HDR;
- FIG. 34B is a view illustrating an example of a display process of displaying an HDR by using an HDR supporting playback device and an SDR (Standard Dynamic Range) TV;
- FIG. 34C is a view illustrating an example of a display process of displaying an HDR by using an HDR supporting playback device and an SDR TV which are connected to each other via a standard interface;
- FIG. 35 is a view for explaining a process of converting an HDR into a pseudo HDR;
- FIG. 36A is a view illustrating an example of an EOTF (Electro-Optical Transfer Function) which supports the HDR and the SDR, respectively;
- FIG. 36B is a view illustrating an example of an inverse EOTF which supports the HDR and the SDR, respectively;
- FIG. 37 is a block diagram illustrating a configuration of a converting device and the display device according to an exemplary embodiment;
- FIG. 38 is a flowchart illustrating a converting method and a display method performed by the converting device and the display device according to the exemplary embodiment;
- FIG. 39A is a view for explaining first luminance conversion;
- FIG. 39B is a view for explaining another example of the first luminance conversion;
- FIG. 40 is a view for explaining second luminance conversion;
- FIG. 41 is a view for explaining third luminance conversion; and
- FIG. 42 is a flowchart illustrating a detailed process of display settings.
- A transmitting method according to one aspect of the present disclosure is a method used by a playback device, and includes: when a version of a transmission protocol is a first version, transmitting first meta data to a display device without transmitting second meta data to the display device, the transmission protocol being used to transmit a signal between the playback device and the display device, the first meta data including information that is commonly used for a plurality of images included in a continuous playback unit of a first video signal and relates to a luminance range of the first video signal, the second meta data including information that is commonly used for a unit subdivided compared to the continuous playback unit of the first video signal and relates to the luminance range of the first video signal; and when the version of the transmission protocol is a second version, transmitting the first meta data and the second meta data to the display device.
- Consequently, according to this transmitting method, it is possible to transmit appropriate meta data of the first meta data and the second meta data, to the display device according to the version of the transmission protocol.
- For example, when the version of the transmission protocol is the first version, a conversion process of converting the luminance range of the first video signal may be performed by using the second meta data to obtain a second video signal, and the second video signal may be transmitted to the display device.
- Consequently, when the second meta data cannot be transmitted to the display device and the display device cannot perform a conversion process, the playback device can perform the conversion process.
- For example, when the version of the transmission protocol is the second version and the display device does not include a function of the conversion process of converting the luminance range of the first video signal by using the second meta data, the conversion processing may be performed to obtain a second video signal, and the second video signal may be transmitted to the display device, and when the version of the transmission protocol is the second version and the display device includes the function of the conversion process, the first video signal may be transmitted to the display device without performing the conversion process.
- Consequently, appropriate one of the playback device and the display device can execute a conversion process.
- For example, when the playback device does not include a function of the conversion process of converting the luminance range of the video signal by using the second meta data, the conversion processing may not be performed, and the second meta data may not be transmitted to the display device.
- For example, a luminance value of the first video signal may be encoded as a code value, and the first meta data may be the information for specifying an EOTF (Electro-Optical Transfer Function) of associating a plurality of luminance values and a plurality of code values.
- For example, the second meta data may indicate mastering characteristics of the first video signal.
- Further, a playback method according to one aspect of the present disclosure is a playback method for playing back a video signal, a luminance of the video signal being a first luminance value in a first luminance range whose maximum luminance value is defined as a first maximum luminance value exceeding 100 nit, the playback method including: determining whether or not an inter-screen change amount of a luminance value of the video signal exceeds a predetermined first threshold; and adjusting the luminance value of the video signal when it is determined that the change amount exceeds the first threshold.
- Consequently, according to the playback method, when a luminance value of a video signal exceeds display capability of the display device, it is possible to generate a video signal which the display device can appropriately display by lowering the luminance value of the video signal. Further, according to the playback method, when a large change amount of a luminance value of a video signal is likely to negatively influence viewers, it is possible to reduce the negative influence by lowering the luminance value of the video signal.
- For example, the adjustment may include adjusting, for a pixel whose change amount exceeds the first threshold, a luminance value of the pixel such that the change amount of the pixel is the first threshold or less.
- For example, the determination may include determining whether or not a difference exceeds the first threshold, the difference being a difference between a peak luminance of a first image included in the video signal, and each of luminance values of a plurality of pixels included in the video signal and included in a second image subsequent to the first image, and the adjustment may include adjusting, for a pixel whose difference exceeds the first threshold, a luminance value of the pixel such that the difference of the pixel is the first threshold or less.
- For example, the determination may include determining whether or not the change amount of the luminance value at a reference time interval exceeds the first threshold, the reference time interval being an integer multiple of a reciprocal of a frame rate of the video signal.
- For example, the determination may include determining whether or not a rate of pixels whose change amounts exceed the first threshold with respect to a plurality of pixels exceeds a second threshold, the plurality of pixels being included in an image included in the video signal, and the adjustment may include adjusting, when the rate exceeds the second threshold, the luminance values of a plurality of pixels such that the rate is the second threshold or less.
- For example, the determination may include determining, for each of a plurality of areas obtained by dividing a screen, whether or not the inter-screen change amount of the luminance value of the area exceeds the first threshold, and the adjustment may include performing an adjustment process of lowering a luminance value of an area for which it is determined that the change amount exceeds the first threshold.
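- As one concrete, non-normative reading of the determination and adjustment described above, the following sketch limits the frame-to-frame (inter-screen) luminance increase of each pixel to a first threshold. The use of NumPy and the example threshold value of 500 nit are assumptions made only for illustration.

```python
import numpy as np

def limit_interframe_change(prev_frame, cur_frame, first_threshold=500.0):
    """Clamp per-pixel luminance so that the inter-screen (frame-to-frame)
    increase does not exceed first_threshold (all values in nit)."""
    change = cur_frame - prev_frame                      # inter-screen change amount
    exceeded = change > first_threshold                  # pixels whose change is too large
    adjusted = cur_frame.copy()
    # Lower only the offending pixels so that their change equals the threshold.
    adjusted[exceeded] = prev_frame[exceeded] + first_threshold
    return adjusted

# Example: a dark frame followed by a very bright flash.
prev = np.full((4, 4), 50.0)      # 50 nit everywhere
cur = np.full((4, 4), 900.0)      # sudden jump to 900 nit
out = limit_interframe_change(prev, cur)
print(out.max())                  # 550.0 -> the change is capped at 500 nit
```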
- Further, a playback method according to one aspect of the present disclosure is a playback method for playing back a video signal, a luminance of the video signal being a first luminance value in a first luminance range whose maximum luminance value is defined as a first maximum luminance value exceeding 100 nit, the playback method including: determining whether or not a luminance value of an image of the video signal exceeds a predetermined first threshold; and adjusting the luminance value of the image when determining that the luminance value exceeds the first threshold.
- Consequently, according to the playback method, when a luminance value of a video signal exceeds display capability of the display device, it is possible to generate a video signal which the display device can appropriately display by lowering the luminance value of the video signal. Further, according to the playback method, when a high luminance value of a video signal is likely to negatively influence viewers, it is possible to reduce the negative influence by lowering the luminance value of the video signal.
- For example, the determination may include determining whether or not a number of pixels whose luminance values exceed the first threshold, among a plurality of pixels included in the image, exceeds a third threshold, and the adjustment may include lowering, when the number of pixels exceeds the third threshold, the luminance value of the image such that the number of pixels is the third threshold or less.
- For example, the determination may include determining a rate of pixels whose luminance values exceed the first threshold with respect to a plurality of pixels included in the image, and the adjustment may include lowering, when the rate exceeds a third threshold, the luminance value of the image such that the rate is the third threshold or less.
- For example, the first threshold may be a value calculated based on an upper limit value of a voltage which is simultaneously applicable to a plurality of pixels in a display device that displays the video signal.
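- A minimal sketch of this second playback method, assuming a rate-based third threshold and a simple global scaling strategy (both of which are assumptions for illustration, not requirements of the disclosure), could look as follows.

```python
import numpy as np

def adjust_bright_image(frame, first_threshold=1000.0, third_threshold_rate=0.1):
    """If the rate of pixels brighter than first_threshold exceeds
    third_threshold_rate, scale the image down until the rate is within the limit."""
    rate = np.mean(frame > first_threshold)
    if rate <= third_threshold_rate:
        return frame                        # nothing to adjust
    # Find the luminance that at most third_threshold_rate of pixels may exceed,
    # and scale the whole image so that this luminance maps to first_threshold.
    allowed_peak = np.quantile(frame, 1.0 - third_threshold_rate)
    return frame * (first_threshold / allowed_peak)

frame = np.linspace(0.0, 2000.0, 100).reshape(10, 10)   # ramp up to 2000 nit
out = adjust_bright_image(frame)
print(np.mean(out > 1000.0))   # approximately 0.1 after adjustment
```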
- Further, a playback device according to one aspect of the present disclosure is a playback device that transmits a video signal to a display device, and includes one or more memories and circuitry which, in operation, transmits, when a version of a transmission protocol is a first version, first meta data to the display device without transmitting second meta data to the display device, the transmission protocol being used to transmit a signal between the playback device and the display device, the first meta data including information that is commonly used for a plurality of images included in a continuous playback unit of a first video signal and relates to a luminance range of the first video signal, the second meta data including information that is commonly used for a unit subdivided compared to the continuous playback unit of the first video signal and relates to the luminance range of the first video signal; and transmits, when the version of the transmission protocol is a second version, the first meta data and the second meta data to the display device.
- Consequently, the playback device can transmit appropriate meta data of the first meta data and the second meta data, to the display device according to the version of the transmission protocol.
- Furthermore, a playback device according to one aspect of the present disclosure is a playback device that plays back a video signal, a luminance of the video signal being a first luminance value in a first luminance range whose maximum luminance value is defined as a first maximum luminance value exceeding 100 nit, the playback device including one or more memories and circuitry which, in operation, determines whether or not an inter-screen change amount of a luminance value of the video signal exceeds a predetermined first threshold; and adjusts the luminance value of the video signal when it is determined that the change amount exceeds the first threshold.
- Consequently, when a luminance value of a video signal exceeds display capability of the display device, the playback device can generate a video signal which the display device can appropriately display by lowering the luminance value of the video signal. Further, when a large change amount of a luminance value of a video signal is likely to negatively influence viewers, the playback device can reduce the negative influence by lowering the luminance value of the video signal.
- Furthermore, a playback device according to one aspect of the present disclosure is a playback device that plays back a video signal, a luminance of the video signal being a first luminance value in a first luminance range whose maximum luminance value is defined as a first maximum luminance value exceeding 100 nit, the playback device including one or more memories and circuitry which, in operation, determines whether or not a luminance value of an image included in the video signal exceeds a predetermined first threshold; and adjusts the luminance value of the image when it is determined that the luminance value exceeds the first threshold.
- Consequently, when a luminance value of a video signal exceeds display capability of the display device, the playback device can generate a video signal which the display device can appropriately display by lowering the luminance value of the video signal. Further, when a high luminance value of a video signal is likely to negatively influence viewers, the playback device can reduce the negative influence by lowering the luminance value of the video signal.
- In addition, these comprehensive or specific aspects may be realized by a system, a method, an integrated circuit, a computer program or a computer-readable recording medium such as a CD-ROM, and may be realized by an arbitrary combination of the system, the method, the integrated circuit, the computer program and the recording medium.
- Further, the above features will be mainly described in [27. HDR Meta Data Transmitting Method] to [28. Adjustment of Luminance Value].
- In addition, the exemplary embodiment described below is a comprehensive or specific example. Numerical values, shapes, materials, components, placement positions and connection modes of the components, steps and a step order described in the following exemplary embodiment are exemplary, and by no means limit the present disclosure. Further, components which are not recited in the independent claims representing the uppermost generic concepts among components in the following exemplary embodiment will be described as arbitrary components.
- (Exemplary Embodiment)
- [1. Background]
- First, a transition of a video technology will be described with reference to
FIG. 1 . FIG. 1 is a view for explaining development of the video technology. - To date, efforts to provide higher-quality video images have focused mainly on increasing the number of display pixels, and video images ranging from 720×480-pixel Standard Definition (SD) to so-called 2K, 1920×1080-pixel High Definition (HD), have become widespread.
- In recent years, the introduction of so-called 4K video images, with 3840×2160 pixels in Ultra High Definition (UHD) or 4096×2160 pixels in 4K, has started in order to provide even higher-quality video images.
- Further, beyond the higher resolution provided by introducing 4K, providing higher-quality video images by extending the dynamic range, expanding the color gamut, or increasing and improving the frame rate has been studied.
- Above all, as a dynamic range technique, the HDR (High Dynamic Range) is gaining attention. The HDR supports a luminance range whose maximum luminance value is expanded so that bright light, such as specular reflection light, which cannot be expressed by existing TV signals can be expressed with a brightness close to its actual brightness, while the dark-part gradation of conventional video images is maintained. More specifically, the luminance range supported by conventional TV signals is called the SDR (Standard Dynamic Range) and has a maximum luminance value of 100 nit, whereas the HDR is assumed to expand the maximum luminance value to 1000 nit or more. The HDR is being standardized by SMPTE (Society of Motion Picture & Television Engineers) and ITU-R (International Telecommunication Union Radiocommunication Sector).
- Similar to HD and UHD, specific HDR application targets include broadcasts, package media (Blu-ray (registered trademark which applies likewise below) Discs), and Internet distribution.
- In addition, a luminance of a video image which supports the HDR takes a luminance value of an HDR luminance range, and a luminance signal obtained by quantizing the luminance value of the video image will be referred to as an HDR signal. A luminance of a video image which supports the SDR takes a luminance value of an SDR luminance range, and a luminance signal obtained by quantizing the luminance value of the video image will be referred to as an SDR signal.
- [2. Object and Task]
- An HDR (High Dynamic Range) signal, which is an image signal with a higher luminance range than that of conventional image signals, is distributed via a package medium such as a Blu-ray disc in which HDR signals are stored, via broadcasting, or via a distribution medium such as an OTT (Over The Top) service. In this regard, OTT refers to Web sites provided on the Internet, to content and services such as moving images and audio, or to providers which provide such content and services. Distributed HDR signals are decoded by a Blu-ray device or the like. The decoded HDR signals are then sent to an HDR supporting display device (a TV, projector, tablet or smartphone), which plays back the HDR video images.
- The HDR technique is still only at an early stage, and it is assumed that, after the HDR technique introduced first is adopted, new HDR methods will be developed. In that case, a new HDR method can be adopted by storing an HDR signal (and meta data) of the newly created HDR method in an HDR distribution medium. What is important in this case is “Forward Compatibility”, which means that an original device (e.g. a Blu-ray device) which does not support the new function can still play back an HDR distribution medium in which signals of the new HDR method are stored. The present disclosure realizes a method and a device which, for a distribution medium in which a new HDR signal format (meta data) is stored, maintain compatibility by guaranteeing HDR playback with the original technique without changing a decoding device (e.g. a Blu-ray device) designed for the original distribution medium, and which enable an HDR decoding device (e.g. a Blu-ray device) that supports the new method to process the new HDR method.
- Further, if new techniques are adopted at random in this way, without selecting an extension method or appropriately determining a registration method, a flood of multiple non-compatible methods is likely to cause confusion in the market. By contrast, introducing a very strict technique selection mechanism delays the determination of new techniques, and such a delay in introducing technical innovation is likely to make the distribution platform of the technique (e.g. Blu-ray) obsolete. There is therefore a risk that the competitiveness of this distribution platform against other platforms (e.g. electronic distribution services such as OTT) cannot be maintained. Hence, an option introduction method which combines the advantages of both approaches is necessary. In the present exemplary embodiment, a hybrid option introduction method which meets these needs is proposed.
-
FIG. 2 is a view illustrating an HDR position (expansion of a luminance). Further,FIG. 3 illustrates an image example indicating an HDR effect. - [3. Relationship Between Masters, Distributing Methods and Display Devices During Introduction of HDR]
-
FIG. 4 is a view illustrating a relationship between a flow of creating SDR and HDR home entertainment masters, distribution media and display devices. - The HDR concept has been proposed, and its effectiveness has been confirmed at the conceptual level. Further, a first method for implementing the HDR has been proposed. However, this does not mean that a large amount of HDR content has already been created by using this method, and the first implementation method has not yet been proven in practice. Hence, when the creation of HDR content is earnestly advanced in the future, the meta data for the existing HDR creation method, the HDR-SDR converting method, or the tone mapping conversion method of a display device is likely to change.
- [4. How to Use EOTF]
-
FIG. 5 is an explanatory view of a method for determining a code value of a luminance signal to be stored in content, and a process for restoring a luminance value from a code value during playback. - A luminance signal indicating a luminance in the present exemplary embodiment is an HDR signal which supports an HDR. A graded image is quantized by an HDR inverse EOTF, and a code value associated with a luminance value of the image is determined. The image is encoded based on this code value, and a video stream is generated. During playback, a stream decoding result is inversely quantized based on the HDR EOTF and is converted into a linear signal, and a luminance value of each pixel is restored. Quantization performed by using an HDR inverse EOTF will be referred to as “HDR inverse EOTF conversion” below. Inverse quantization performed by using an HDR EOTF will be referred to as “HDR EOTF conversion” below. Similarly, quantization performed by using an SDR inverse EOTF will be referred to as “SDR inverse EOTF conversion” below. Inverse quantization performed by using an SDR EOTF will be referred to as “SDR EOTF conversion” below.
- A video conversion processor performs conversion into a luminance value which can be displayed by a video display, by using this luminance value and meta data, and, consequently, the video display can display HDR video images. When, for example, a peak luminance of an original HDR video image is 2000 nit and a peak luminance of the video display is 800 nit, it is possible to perform conversion and lower a luminance.
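- As an illustration of such a down-conversion, the following sketch maps a 2000-nit source luminance onto an 800-nit display using a single knee point. The knee position and the linear compression above it are arbitrary choices made for the example; they are not the specific conversion defined by the meta data described below.

```python
def tone_map(luminance_nit, source_peak=2000.0, display_peak=800.0, knee=400.0):
    """Map a scene luminance (in nit) into the range the display can show.
    Values below the knee are passed through unchanged; values above it are
    compressed linearly so that source_peak lands exactly on display_peak."""
    if luminance_nit <= knee:
        return luminance_nit
    # Compress the range [knee, source_peak] into [knee, display_peak].
    ratio = (display_peak - knee) / (source_peak - knee)
    return knee + (luminance_nit - knee) * ratio

print(tone_map(300.0))    # 300.0  (unchanged, below the knee)
print(tone_map(2000.0))   # 800.0  (the source peak maps to the display peak)
```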
- Thus, an HDR master method is realized by a combination of an EOTF, meta data and an HDR signal. Consequently, it is likely that a more efficient EOTF and meta data will be developed, and a time will come to adopt an HDR method which uses such an EOTF and meta data.
- In this regard, what such a new method will look like is not known at this point in time, but it is conceivable that the EOTF will be changed and that meta data will be added. In that case, HDR signals will also change.
- The present disclosure intends to make the HDR popular by reducing a risk that, even when an HDR transmission format is changed as described above, customers who have bought HDR supporting devices have to buy new devices.
- [5. Meta Data]
-
FIG. 6 is a view illustrating an example of HDR meta data. HDR meta data includes conversion auxiliary information used to change (DR conversion) a luminance range of a video signal, and HDR control information. Each piece of information is either static HDR meta data provided in title units, for example, or dynamic HDR meta data provided in frame units, for example. Further, static HDR meta data is classified as either required meta data (basic data) or selected meta data (extension data), and dynamic HDR meta data is classified as selected meta data. In addition, each piece of information will be described in detail below. - Thus, the basic method can be implemented by using only static HDR meta data. Further, each extension method is designed so as not to influence a playback device (e.g. Blu-ray) of the basic method.
- [6. HDR Meta Data 1]
- Parameters of HDR content indicating characteristics during mastering include static HDR meta data which is fixed per title or per playlist, and dynamic HDR meta data which is variable per scene. In this regard, a title and a playlist are information indicating video signals which are continuously played back. Hereinafter, video signals which are continuously played back will be referred to as continuous playback units.
- For example, static HDR meta data includes at least one of a type of an EOTF function (curve), an 18% Gray value, a Diffuse White value, Knee point and Clip point. The EOTF is information obtained by associating a plurality of luminance values and a plurality of code values, and is information for changing a luminance range of a video signal. Other pieces of information are attribute information related to a luminance of a video signal. Therefore, it may be said that static HDR meta data is information related to a luminance range of a video signal and is information for specifying the luminance range of the video signal.
- More specifically, the 18% Gray value and the Diffuse White value indicate a luminance value (nit) of a video image whose brightness is a predetermined reference; in other words, they indicate a reference brightness of the video image. The 18% Gray value indicates the mastered luminance value (nit) of an object whose brightness is 18 nit before the mastering. The Diffuse White value indicates a luminance value (nit) corresponding to white.
- Further, Knee point and Clip point are parameters of the EOTF function, and indicate points at which EOTF characteristics change. More specifically, Knee point indicates a change point at which an increase of a luminance value (output luminance) mapped as a luminance of a video signal to the EOTF with respect to an increase of an original luminance value (input luminance) during shooting is a value different from 1:1. For example, Knee point is information for specifying a point which is off from a linear change in
FIG. 39A described below. Further, Clip point indicates a point at which clipping starts in the EOTF function. In this regard, clipping refers to converting an input luminance value of a given value or more into an identical output luminance value. For example, Clip point indicates a point at which an output luminance value stops changing in FIG. 39B described below. - Further, types of the EOTF function (curve) include an HDR EOTF and an SDR EOTF illustrated in
FIG. 36A , for example. - Thus, a content data generating method according to the present exemplary embodiment is a content data generating method for generating content data, and includes: a first generating step of generating video signals, and static HDR meta data (first meta data) including information which is commonly used for a plurality of images (video signals configuring continuous playback units) included in the continuous playback units of the video signals, and which relates to a luminance range of the video signals; and a second generating step of generating content data by associating the continuous playback units and the static HDR meta data. For example, the information which relates to the luminance range of the video signals is information for converting the luminance range of the video signals.
- Further, the static HDR meta data includes information for specifying an EOTF of associating a plurality of luminance values and a plurality of code values. Furthermore, a luminance value of the video signal is encoded as a code value.
- Still further, the static HDR meta data further includes information indicating a luminance value of a video signal whose brightness is a predetermined reference or information indicating a point at which EOTF characteristics change. For example, the static HDR meta data includes information (Diffuse White value) indicating a luminance value corresponding to white of a video signal.
- Further, in the first generating step, dynamic HDR meta data (second meta data) which is information commonly used for units subdivided compared to the continuous playback units and which is information related to the luminance range of the video signals is further generated. For example, the information related to the luminance range of the video signals is information for converting the luminance range of the video signals.
- The dynamic HDR meta data is a parameter indicating mastering characteristics which are different per scene. The mastering characteristics described herein indicate a relationship between an original (pre-mastering) luminance and a mastered luminance. For example, the parameter indicating the mastering characteristics is same information as the above static HDR meta data, in other words, at least one of pieces of information included in the static HDR meta data.
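- One possible in-memory representation of the two kinds of meta data described in this section is sketched below. The field names mirror the parameters listed above (EOTF type, 18% Gray value, Diffuse White value, Knee point, Clip point), but the concrete types, units and grouping are assumptions made for illustration and do not describe a defined format.

```python
from dataclasses import dataclass
from typing import Optional, List, Tuple

@dataclass
class StaticHdrMetadata:
    """Fixed per title or per playlist (per continuous playback unit)."""
    eotf_type: str                       # e.g. an HDR or SDR curve identifier
    gray_18_nit: Optional[float] = None  # 18% Gray value, in nit
    diffuse_white_nit: Optional[float] = None
    knee_point: Optional[Tuple[float, float]] = None  # (input nit, output nit)
    clip_point: Optional[float] = None                # input nit where clipping starts

@dataclass
class DynamicHdrMetadata:
    """Variable per scene (a unit subdivided compared to the playback unit)."""
    scene_start_pts: int                 # presentation time stamp of the scene head
    mastering_peak_nit: float            # peak luminance after mastering
    knee_point: Optional[Tuple[float, float]] = None

@dataclass
class PlaybackUnitMetadata:
    """A continuous playback unit carries one static block and many dynamic blocks."""
    static: StaticHdrMetadata
    dynamic: List[DynamicHdrMetadata]
```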
-
FIG. 7 is a view illustrating a storage example of static HDR meta data. This example is an example where static HDR meta data is stored in a playlist in a package medium such as a Blu-ray disc. - The static HDR meta data is stored as one of items of meta data of each stream to which a reference is made from a playlist. In this case, the static HDR meta data is fixed in playlist units. That is, the static HDR meta data is associated with each playlist and stored.
- Further, in the case of OTT distribution, static HDR meta data may be stored in a manifest file to which a reference is made before a stream is obtained. That is, according to the content data generating method according to the present exemplary embodiment, a video signal may be generated as a video stream, and static HDR meta data may be stored in a manifest file to which a reference is made before the video stream is obtained.
- Further, in case of broadcasting, static HDR meta data may be stored in a descriptor indicating a stream attribute. That is, according to the content data generating method according to the present exemplary embodiment, content data may be generated as a video stream, and static HDR meta data may be stored as an identifier indicating a video stream attribute independently from the video stream. For example, the static HDR meta data can be stored as a descriptor according to MPEG2-TS (Moving Picture Experts Group 2-Transport Stream).
- Further, when static HDR meta data is fixed per title, static HDR meta data may be stored as management information indicating a title attribute.
- Furthermore, in this example, static HDR meta data for the HDR is stored by using a mechanism which stores various items of meta data in a playlist in a Blu-ray disc. Hence, the presence of static HDR meta data needs to be defined in a playlist from the viewpoint of application standards or of a device such as a Blu-ray device. Hence, it is necessary to revise the Blu-ray standards to newly define static HDR meta data for the HDR. Further, the capacity is limited, and it is therefore difficult to store an unlimited amount of static HDR meta data for HDR option techniques.
- [7. HDR Meta Data 2]
-
FIG. 8 is a view illustrating an example where dynamic HDR meta data is stored in a video stream. According to MPEG-4 AVC (Advanced Video Coding) or HEVC (High Efficiency Video Coding), information related to stream playback control is stored by using a data structure called SEI (Supplemental Enhancement Information). Hence, for example, dynamic HDR meta data is stored in SEI. - The dynamic HDR meta data is assumed to be updated per scene. A scene head is a head access unit (AU) in random access units such as GOP (Group Of Pictures). Hence, dynamic HDR meta data may be stored in a head access unit in a decoding order in the random access units. The head access unit in the random access units is an IDR (Instantaneous Decoder Refresh) picture or a non-IDR I picture to which a SPS (Sequence Parameter Set) is added. Hence, a receiving-side device can obtain dynamic HDR meta data by detecting a NAL (Network Abstraction Layer) unit configuring a head access unit in the random access units. Alternatively, a unique type may be allocated to SEI in which dynamic HDR meta data is stored.
- In addition, a type of the EOTF function may be stored as stream attribute information of a SPS. That is, according to the content data generating method according to the present exemplary embodiment, content data may be generated as a video stream encoded according to HEVC, and information for specifying the EOTF may be stored in a SPS included in a video stream.
- Further, in this example, a mechanism which stores option data according to MPEG is used, and dynamic HDR meta data is stored in a video elementary stream. Hence, a presence of dynamic HDR meta data is not known from a viewpoint of application standards or a device of Blu-ray or the like. Hence, it is possible to record dynamic HDR meta data by using only the mechanism which stores option data according to MPEG without revising the Blu-ray standards. Further, an area to be used is a SEI area, so that it is also possible to store a plurality of items of option dynamic HDR meta data.
- [8. Method For Storing Dynamic HDR Meta Data]
-
FIG. 9 is a view illustrating an example where dynamic HDR meta data is stored in a TS stream format different from that of a main video image. - Blu-ray has a function of synchronizing and playing back two TS streams. This two TS stream synchronizing/playback function includes a 2TS playback function of synchronizing and playing back two individually managed TS streams in a disk, and a 1TS playback function of interleaving two streams to use as one TS stream.
- By using this two TS stream synchronizing/playback function and storing dynamic HDR meta data in a TS stream format, the playback device can use dynamic HDR meta data in synchronization with main HDR video images. Consequently, a normal HDR player can play back only main HDR video images, and obtain video images of standard HDR quality. Further, an option supporting HDR player can play back high gradation HDR quality video images by using dynamic HDR meta data stored in a TS.
- In this example, dynamic HDR meta data is stored in an auxiliary TS stream by using a mechanism which stores two TS streams of Blu-ray. Hence, a presence of dynamic HDR meta data is recognized as a TS stream from a viewpoint of application standards or a device of Blu-ray or the like. Hence, it is necessary to revise the Blu-ray standards. Further, it is possible to simultaneously store TS streams of two types of options.
- [9. Method For Transmitting Static HDR Meta Data]
-
FIG. 10 is a flowchart illustrating a method for transmitting static HDR meta data; more specifically, it illustrates an example of an operation of transmitting an HDR signal from a playback device such as a BD player (Blu-ray device) or a recorder to a display device according to a transmission protocol such as HDMI. - That static HDR meta data is fixed in title units or playlist units has been described above. Hence, when it is necessary to set static HDR meta data (Yes in S401), the playback device obtains the static HDR meta data from content management information during a start of playback of a title or a playlist, and stores and transmits the obtained static HDR meta data as HDMI control information. That is, prior to a start of transmission of a video signal configuring a title or a playlist, the playback device obtains static HDR meta data corresponding to the title or the playlist, and transmits the obtained static HDR meta data as HDMI control information (S402). More generally, the playback device may transmit static HDR meta data as initialization information when a HDMI initialization process between the playback device and the display device is performed.
- Subsequently, the playback device transmits a video stream corresponding to the static HDR meta data (S403). In addition, transmitted static HDR meta data is effective for this video stream.
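- The flow of FIG. 10 (S401 to S403) can be summarized in the following sketch. The hdmi_link object and its methods are hypothetical stand-ins for the HDMI control path and are not an actual API.

```python
def play_title(title, hdmi_link):
    """Sketch of the flow of FIG. 10: send static HDR meta data as control
    information before the video stream it applies to (S401 to S403)."""
    static_meta = title.content_management_info.get("static_hdr_metadata")
    if static_meta is not None:                       # S401: setting needed?
        # S402: transmit before the video signal of this title/playlist starts.
        hdmi_link.send_control_information(static_meta)
    # S403: transmit the video stream; the static meta data sent above
    # remains effective for this stream.
    for frame in title.video_stream():
        hdmi_link.send_video(frame)
```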
- Thus, a video stream transmitting method according to the present exemplary embodiment is a video stream transmitting method for transmitting a video stream (video stream), and includes: an obtaining step of obtaining content data including video signals, and static HDR meta data (first meta data) which is information which is commonly used for a plurality of images included in the continuous playback units, and which relates to a luminance range of the video signals; and a transmitting step of transmitting a video stream corresponding to the video signals, and static HDR meta data.
- For example, in the transmitting step, the video stream and the static HDR meta data are transmitted according to a HDMI communication protocol.
- Further, the dynamic HDR meta data is transmitted as part of a video stream (SEI).
- In addition, the playback device may transmit dynamic HDR meta data as a HDMI control signal at a timing at which the dynamic HDR meta data becomes effective. In this case, the playback device provides identifiers for the static HDR meta data and the dynamic HDR meta data to be identified from each other, and transmit the static HDR meta data and the dynamic HDR meta data.
- Further, only a data structure of a container for storing dynamic HDR meta data may be defined in a control signal, and a copy of contents of SEI may be enabled as payload data of the container. Consequently, it is possible to support even an update of a syntax of dynamic HDR meta data included in SEI without changing the mounted playback device such as a BD player.
- Similarly, by enabling copy and transmission of static HDR meta data in content management information, it is possible to support a change of a syntax of the static HDR meta data without changing the mounted playback device. That is, a data structure of a container for storing static HDR meta data is defined, so that, in the transmitting step, the static HDR meta data included in content data may be copied to a payload of the container to transmit the container.
- Further, dynamic HDR meta data stored in a TS stream is synthesized with a main HDR video signal by some method, and is transmitted as a new video signal (high gradation HDR video images in an example in
FIG. 9 ) according to HDMI. - [10. Method For Processing HDR Meta Data]
-
FIG. 11 is a flowchart illustrating an example of a method for processing HDR meta data in case where the display device displays an HDR signal. First, the display device obtains static HDR meta data from HDMI control information (S411), and determines a method for displaying an HDR signal based on the obtained static HDR meta data (S412). - In addition, when the control information does not include the static HDR meta data, the display device determines a method for displaying an HDR signal based on a predetermined value of application standards or default settings of the display device. That is, according to the video display method according to the present exemplary embodiment, when static HDR meta data cannot be obtained, a video display method matching a video signal is determined based on the predetermined value or the settings.
- Further, when detecting dynamic HDR meta data in SEI in a video stream (Yes in S413), the display device updates a method for displaying an HDR signal based on the dynamic HDR meta data (S414). That is, according to the video display method according to the present exemplary embodiment, when static HDR meta data is obtained, a display method is determined based on the obtained static HDR meta data to display video images. Further, when dynamic HDR meta data is obtained, the display method determined based on the static HDR meta data is updated to the display method determined based on dynamic HDR meta data to display video images. Alternatively, a display method may be determined based on both of static HDR meta data and dynamic HDR meta data.
- In addition, when the display device does not support obtaining dynamic HDR meta data, the display device may operate based only on static HDR meta data. Further, even when the display device supports obtaining dynamic HDR meta data, the display device cannot update a method for displaying an HDR signal in synchronization with a presentation time stamp (PTS) of an access unit in which meta data is stored in some cases. In this case, after obtaining meta data, the display device may update a display method from an access unit displayed subsequent to the earliest time at which the display method can be updated.
- In addition, it is possible to update and add parameters by allocating version information to HDR meta data. Consequently, the display device can determine whether or not it is possible to interpret HDR meta data based on version information of the HDR meta data. Alternatively, HDR meta data may be configured by a basic portion and an extension portion, and a parameter may be updated or added by changing the extension portion without updating the basic portion. That is, each of static HDR meta data and dynamic HDR meta data may include a plurality of versions, and may include a basic portion which is commonly used between a plurality of versions, and an extension portion which differs per version. By so doing, it is possible to secure backward compatibility of the display device based on HDR meta data of the basic portion.
- Thus, the video display method according to the present exemplary embodiment is a video display method for displaying video images based on video streams, and includes: an obtaining step of obtaining a video stream corresponding to the video signals, and static HDR meta data (first meta data); and a display step of determining a display method for displaying the video images corresponding to the video signals based on the static HDR meta data and displaying the video image.
- Further, a luminance value of the video signal is encoded as a code value. Static HDR meta data includes information for specifying an EOTF of associating a plurality of luminance values and a plurality of code values. In the display step, video images are generated by using the EOTF specified by the static HDR meta data and converting the code value indicated by the video signal into a luminance value.
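- A minimal sketch of the display-side handling described above (S411 to S414) is given below, assuming hypothetical helper methods on the display object for deriving a display method from static and dynamic HDR meta data.

```python
def display_hdr_stream(control_info, video_stream, display):
    """Sketch of FIG. 11: pick a display method from static HDR meta data
    (S411, S412), then refine it whenever dynamic HDR meta data arrives (S413, S414)."""
    static_meta = control_info.get("static_hdr_metadata")
    if static_meta is not None:
        method = display.method_from_static(static_meta)      # S412
    else:
        # No static meta data: fall back to application-standard defaults.
        method = display.default_method()
    for access_unit in video_stream:
        dynamic_meta = access_unit.sei.get("dynamic_hdr_metadata")
        if dynamic_meta is not None:                           # S413
            # S414: update the display method (it may also combine
            # static and dynamic meta data).
            method = display.method_from_dynamic(dynamic_meta)
        display.render(access_unit.picture, method)
```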
- [11. Data Output Device]
-
FIG. 12 is a block diagram illustrating a configuration ofdata output device 400 such as a BD layer which outputs HDR signals. HDR meta data input todata output device 400 includes characteristics data indicating mastering characteristics of an HDR signal, and conversion auxiliary data indicating a tone mapping method for converting an HDR signal into an SDR signal or for converting a dynamic range of the HDR signal. These two types of items of meta data are stored as static HDR meta data or dynamic HDR meta data as described with reference toFIGS. 7 and 8 . Further, the static HDR meta data is stored in at least one of content management information and a video stream. -
Data output device 400 includesvideo decoder 401, externalmeta obtaining unit 402, HDRmeta interpreter 403, HDRcontrol information generator 404,DR converter 405 andHDMI output unit 406. -
Video decoder 401 generates a video signal (first video signal) by decoding a video stream which is a video encoded stream, and outputs the resulting video signal toDR converter 405. Further,video decoder 401 obtains HDR meta data (second meta data) (static HDR meta data or dynamic HDR meta data) in the video stream. More specifically,video decoder 401 outputs to HDRmeta interpreter 403 HDR meta data stored in a SEI message or the like according to MPEG-4 AVC or HEVC. - External
meta obtaining unit 402 obtains static HDR meta data (first meta data) stored in the content management information such as a playlist, and outputs the obtained static HDR meta data to HDRmeta interpreter 403. In this regard, in the content management information, dynamic HDR meta data which can be changed in predetermined units, such as a playitem, which enable a random access may be stored. In this case, externalmeta obtaining unit 402 obtains dynamic HDR meta data from the content management information, and outputs the obtained dynamic HDR meta data to HDRmeta interpreter 403. - HDR
meta interpreter 403 determines a type of HDR meta data output fromvideo decoder 401 or externalmeta obtaining unit 402, outputs characteristics data to HDR controlinformation generator 404 and outputs conversion auxiliary data toDR converter 405. - In addition, when both of
video decoder 401 and externalmeta obtaining unit 402 obtain static HDR meta data, only the static HDR meta data output from externalmeta obtaining unit 402 may be used as effective meta data. That is, there is a case where the first meta data obtained by externalmeta obtaining unit 402 and the second meta data obtained byvideo decoder 401 are static HDR meta data which is commonly used for a plurality of images included in continuous playback units of the first video signal. In this case, when both of the first meta data and the second meta data are obtained, HDRmeta interpreter 403 obtains characteristics data and conversion auxiliary data by analyzing the first meta data. - Alternatively, HDR
meta interpreter 403 may use static HDR meta data as effective meta data when externalmeta obtaining unit 402 obtains the static HDR meta data, and may overwrite static HDR meta data over the effective meta data whenvideo decoder 401 obtains the static HDR meta data. That is, there is a case where the first meta data obtained by externalmeta obtaining unit 402 and the second meta data obtained byvideo decoder 401 are static HDR meta data which is commonly used for a plurality of images included in continuous playback units of the first video signal. In this case, when only the first meta data of the first meta data and the second meta data is obtained, HDRmeta interpreter 403 obtains characteristics data and conversion auxiliary data by analyzing the first meta data. When the second meta data is obtained, HDRmeta interpreter 403 switches meta data to use from the first meta data to the second meta data. - HDR
control information generator 404 generates HDR control information according to HDMI based on the characteristics data, and outputs the generated HDR control information toHDMI output unit 406. In this regard, as for dynamic HDR meta data, an output timing of HDR control information inHDMI output unit 406 is determined such that it is possible to output HDR control information in synchronization with a video signal whose meta data is effective. That is,HDMI output unit 406 outputs HDR control information in synchronization with a video signal (video signal) whose meta data is effective. -
DR converter 405 converts a decoded video signal into an SDR signal and converts a dynamic range based on conversion auxiliary data. In this regard, when the display device connected withdata output device 400 supports an input of HDR signals,DR converter 405 does not need to perform conversion. Hence, by checking whether or not the connection destination display device supports an input of HDR signals by an initialization process according to HDMI,data output device 400 may determine whether or not conversion process is necessary. When it is determined that the conversion process is unnecessary, a first video signal obtained byvideo decoder 401 is input toHDMI output unit 406 withoutDR converter 405. - That is,
HDMI output unit 406 outputs the first video signal and the HDR control information to the display device when the display device connected todata output device 400 supports a video output of a luminance range of the HDR signal (first video signal). Further,HDMI output unit 406 outputs a second video signal obtained by converting an HDR into an SDR, and the HDR control information to the display device when the display device connected todata output device 400 does not support a video output of a luminance range of the HDR signal (first video signal). Furthermore,HDMI output unit 406 determines whether or not the display device supports a video output of a luminance range of an HDR signal (first video signal) by a transmission protocol (e.g. HDMI) initialization process. -
HDMI output unit 406 outputs the video signal output fromDR converter 405 orvideo decoder 401 and the HDR control information according to a HDMI protocol. - In addition,
data output device 400 can use the same configuration even when receiving and outputting broadcast or OTT content. Further, whendata output device 400 and the display device are included in a single device,HDMI output unit 406 is not necessary. - Furthermore, as described above,
data output device 400 includes externalmeta obtaining unit 402 which obtains meta data from control information or the like, andvideo decoder 401 includes a function of obtaining meta data from a video stream. However,data output device 400 may include one of externalmeta obtaining unit 402 and the function. - Further, an example where
data output device 400 outputs data (the video signal and the HDR control information) according to HDMI has been described above. However,data output device 400 may output data according to an arbitrary transmission protocol. - Thus,
data output device 400 includes: a decoder (video decoder 401) which generates a first video signal of a first luminance range (HDR) by decoding a video stream; an obtaining unit (at least one ofvideo decoder 401 and external meta obtaining unit 402) which obtains first meta data related to a luminance range of the first video signal; an interpreter (HDR meta interpreter 403) which obtains characteristics data indicating the luminance range of the first video signal by interpreting the first meta data; - a control information generator (HDR control information generator 404) which converts the characteristics data into HDR control information according to a predetermined transmission protocol (e.g. HMDI); and an output unit (HDMI output unit 406) which outputs the HDR control information according to the predetermined transmission protocol.
- Consequently,
data output device 400 can generate the control information based on the characteristics data included in the meta data. - Further, the interpreter (HDR meta interpreter 403) further obtains conversion auxiliary data for converting a luminance range of a first video signal, by interpreting the first meta data.
Data output device 400 further includes a converter (DR converter 405) which generates a second video signal of a luminance range narrower than the luminance range of the first video signal by converting the luminance range of the first video signal based on the conversion auxiliary data. The output unit (HDMI output unit 406) further outputs at least one of the first video signal and the second video signal according to the predetermined transmission protocol. - Consequently,
data output device 400 can change the luminance range of the first video signal by using the conversion auxiliary data included in the meta data. - Further, the decoder (video decoder 401) obtains the second meta data (HDR meta data) related to the luminance range of the first video signal from the video stream. The interpreter (HDR meta interpreter 403) obtains characteristics data and the conversion auxiliary data by analyzing at least one of the first meta data and the second meta data.
- Further, as illustrated in
FIG. 6 , static HDR meta data includes required meta data and selected meta data, and dynamic HDR meta data includes only selected meta data. That is, static HDR meta data is used at all times, and dynamic HDR meta data is selectively used. Thus, the first meta data obtained by externalmeta obtaining unit 402 or the second meta data obtained byvideo decoder 401 includes static HDR meta data (static meta data) which is commonly used for a plurality of images included in continuous playback units of a video signal and includes characteristics data. HDRcontrol information generator 404 converts the characteristics data included in the static HDR meta data into HDR control information according to the predetermined transmission protocol.HDMI output unit 406 outputs the HDR control information based on the static HDR meta data when outputting the first video signal (HDR signal). - Further, the first meta data obtained by external
meta obtaining unit 402 or the second meta data obtained byvideo decoder 401 further includes dynamic HDR meta data (dynamic meta data) which is commonly used for units subdivided compared to the continuous playback units of the video signal and includes characteristics data. HDRcontrol information generator 404 converts the characteristics data included in the static HDR meta data and the characteristics data included in the dynamic HDR meta data, into HDR control information according to the predetermined transmission protocol.HDMI output unit 406 outputs the HDR control information based on the static HDR meta data and the dynamic HDR meta data when outputting the first video signal (HDR signal). - Further, a data generating method according to the present disclosure is a data generating method performed by a data generating device, and includes: a first generating step of generating meta data related to a luminance range of a video signal; and a second generating step of generating a video stream including a video signal and meta data. Meta data includes characteristics data indicating the luminance range of the video signal, and conversion auxiliary data for converting the luminance range of the video signal.
- [12. Storage Example 1 of HDR Meta Data]
-
FIG. 13 is a view illustrating a data structure example of a SEI message in which HDR meta data is stored. As illustrated inFIG. 13 , an HDR meta data dedicated SEI message may be defined. That is, meta data may be stored in a meta data dedicated message. - Further, HDR meta data may be stored in a general-purpose user data storage SEI message, and information (HDR extension identification information described below) indicating that the HDR meta data is stored in a payload portion of the message may be provided.
- HDR meta data includes static HDR meta data and dynamic HDR meta data. Further, flag information indicating whether or not static HDR meta data is stored, and flag information indicating whether or not dynamic HDR meta data is stored may be provided. Thus, it is possible to use three types of storage methods including a method for storing only static HDR meta data, a method for storing only dynamic HDR meta data and a method for storing both of the static HDR meta data and the dynamic HDR meta data.
- Further, for each meta data, basic data (basic portion) which needs to be interpreted, and extension data (extension portion) which is optionally interpreted (whose interpretation is optional) may be defined. For example, type information indicating a type of meta data (basic data or extension data), and a size are included in header information, and a format of a container in which meta data is stored in a payload is defined. That is, meta data includes a payload, information indicating whether payload data is basic data or extension data, and information indicating a payload data size. In other words, meta data includes type information indicating a type of meta data. For example, basic data is stored in a container whose type value is 0. Further, a value equal to or more than 1 is allocated as a type value to the extension data, and this value indicates a type of the extension data.
- The data output device and the display device refer to the type value, and obtain data of the container which the data output device and the display device can interpret. That is, the data output device (or the display device) determines whether or not the data output device (or the display device) can interpret meta data, by using the type information, and obtains characteristics data and conversion auxiliary data by interpreting the meta data when the data output device (or the display device) can interpret the meta data.
- Further, meta data may be generated such that a maximum size of HDR meta data is set in advance and a total sum of sizes of basic data and extension data is the maximum size or less. That is, a maximum value of a data size of meta data is defined, and, according to the data generating method according to the present disclosure, the meta data is generated such that a total data size of the basic data and the extension data is the maximum value or less.
- The data output device and the display device include memories which support this maximum size and, consequently, can guarantee that all HDR meta data can be stored in the memories. Alternatively, it is also possible to secure a data area corresponding to a fixed size of static HDR meta data or dynamic HDR meta data, and to use an area other than an area in which basic data is stored, for extension in future.
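- The container layout described above (a type value of 0 for basic data, values of 1 or more for extension data, and a fixed maximum total size) might be parsed as in the following sketch. The byte layout, the 1024-byte limit and the field widths are assumptions chosen for illustration, not a defined syntax.

```python
MAX_HDR_METADATA_SIZE = 1024   # assumed upper bound, in bytes

def parse_hdr_metadata_containers(buf: bytes):
    """Walk a sequence of (type, size, payload) containers.
    Type 0 is basic data that must be interpreted; types >= 1 are extension
    data that a device may skip if it does not understand them."""
    if len(buf) > MAX_HDR_METADATA_SIZE:
        raise ValueError("meta data exceeds the agreed maximum size")
    containers, offset = [], 0
    while offset + 3 <= len(buf):
        ctype = buf[offset]                                       # container type
        size = int.from_bytes(buf[offset + 1:offset + 3], "big")  # payload size
        payload = buf[offset + 3:offset + 3 + size]
        containers.append((ctype, payload))
        offset += 3 + size
    return containers

# A device keeps the basic container and any extensions it knows how to read.
known_extension_types = {1}
data = bytes([0, 0, 2, 0xAA, 0xBB,      # basic container with a 2-byte payload
              2, 0, 1, 0xCC])           # unknown extension type 2, skipped below
kept = [c for c in parse_hdr_metadata_containers(data)
        if c[0] == 0 or c[0] in known_extension_types]
print(kept)   # [(0, b'\xaa\xbb')]
```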
- Such a data structure may be used to store HDR meta data in content management information.
- By using a SEI area in this way, it is possible to relatively freely store option information.
- [13. Storage Example 2 of HDR Meta Data]
-
FIG. 14 is a view illustrating an example of a data structure in a case where HDR meta data is stored in a user data storage SEI message. The data structure is the same as the data structure inFIG. 14 except that a message includes HDR extension identification information and an extension type ID. The HDR extension identification information indicates that the message includes HDR meta data. An extension type ID indicates an HDR meta data version or the like. That is, meta data is stored in a SEI message according to HEVC, and the SEI message includes HDR extension identification information indicating whether or not the SEI message includes meta data. - In this case, when the user data storage SEI message including the HDR extension identification information is received, and when the display device connected to the data output device supports an input of an HDR signal and HDR control information, the data output device copies and outputs the received SEI message according to a protocol of an output I/F such as HDMI for the display device. That is, when a SEI message including HDR extension identification information indicating that meta data is included in the SEI message is obtained, and the data output destination display device supports an input of the HDR control information, the data output device outputs the SEI message as is according to a predetermined transmission protocol (e.g. HDMI).
- Consequently, irrespectively of meta data contents, the data output device can output HDR meta data to the display device. According to this configuration, even when a new DR conversion process is developed in future to define new HDR meta data and a display device which supports this new HDR meta data is connected to a data output device which does not support new HDR meta data, it is possible to output new HDR meta data from the data output device to the display device. Further, the display device can perform a DR conversion process matching new HDR meta data.
- [14. Storage Example of a Plurality of Items of HDR Meta Data]
-
FIG. 15 is a view illustrating an example of a data structure in a case where a plurality of items of HDR meta data is stored in one user data storage SEI message. In this SEI message, a plurality of items of HDR meta data for a plurality of conversion modes (methods) related to conversion of a dynamic range (luminance range) is stored. - A field (a number of conversion modes) indicating the number of conversion modes of providing HDR meta data is added to the data structure illustrated in
FIG. 15 compared to the data structure illustrated inFIG. 14 . Further, a plurality of items of HDR meta data corresponding to each conversion mode is stored in order subsequent to the number of conversion modes. - That is, the data generating method according to the present exemplary embodiment is a data generating method performed by the data generating device, and includes: a first generating step of generating one or more items of meta data (HDR meta data) matching one or more conversion modes of converting a luminance range of a video signal; and a second generating step of generating a video stream including a video signal, the one or more items of meta data, and the number of conversion modes indicating the number of one or more conversion modes.
- [15. Configuration of Data Output Device]
-
FIG. 16 is a block diagram illustrating a configuration example ofdata output device 500 according to the present exemplary embodiment. Thisdata output device 500 includesvideo decoder 501, externalmeta obtaining unit 502, HDRmeta interpreter 503, HDRcontrol information generator 504,DR converter 505 andHDMI output unit 506. In addition, operations of HDRmeta interpreter 503 andDR converter 505 are different from those ofdata output device 400 illustrated inFIG. 12 . That is, operations ofvideo decoder 501, externalmeta obtaining unit 502, HDRcontrol information generator 504 andHDMI output unit 506 are the same as operations ofvideo decoder 401, externalmeta obtaining unit 402, HDRcontrol information generator 404 andHDMI output unit 406. - Further,
data output device 500 is connected with display device 510 (display), and outputs generated video signals and HDR control information to displaydevice 510 according to a predetermined transmission protocol such as HDMI. -
DR converter 505 anddisplay device 510 each support a plurality of dynamic range conversion modes (converting methods). In this regard, “support” means having a function of performing a process of each conversion mode. First, HDRmeta interpreter 503 obtains static HDR meta data and dynamic HDR meta data from externalmeta obtaining unit 502 andvideo decoder 501. In content management information or encoded video stream, a plurality of items of HDR meta data for a plurality of conversion modes is stored. HDRmeta interpreter 503 determines a plurality of conversion modes matching a plurality of HDR meta data as a plurality of usable conversion modes. - Further, HDR
meta interpreter 503 obtains information of a conversion mode of an HDR signal supported bydisplay device 510 by communicating withdisplay device 510 or via a network. Furthermore, HDRmeta interpreter 503 determines (1) which one ofdata output device 500 anddisplay device 510 performs a dynamic range conversion process and (2) a conversion mode to use, based on (1) a conversion mode matching HDR meta data, (2) a conversion mode supported byDR converter 505 and (3) a conversion mode supported bydisplay device 510. - When it is determined that
data output device 500 performs the conversion process,DR converter 505 converts an HDR signal into an SDR signal according to the conversion mode instructed by HDRmeta interpreter 503. When it is determined thatdisplay device 510 determines a conversion process,data output device 500 transmits a video signal (HDR signal) todisplay device 510, and transmits HDR meta data which is necessary for conversion as a HDMI control signal (HDR control information) todisplay device 510. - In addition, as described above,
DR converter 505 supports a plurality of conversion modes. However, DR converter 505 only needs to support one or more conversion modes. In this case, data output device 500 only needs to obtain one or more items of HDR meta data matching one or more conversion modes. - Thus, data output device 500 includes: a decoder (video decoder 501) which generates a first video signal by decoding a video stream; an obtaining unit (at least one of video decoder 501 and external meta obtaining unit 502) which obtains one or more items of meta data matching one or more first conversion modes of converting a luminance range of the video signal; an interpreter (HDR meta interpreter 503) which obtains characteristics data indicating a luminance range of the first video signal and conversion auxiliary data for converting the luminance range of the first video signal, by interpreting one of the one or more items of first meta data; a control information generator (HDR control information generator 504) which converts the characteristics data into HDR control information according to a predetermined transmission protocol (e.g. HDMI); a converter (DR converter 505) which supports one or more second conversion modes of converting a luminance range of a video signal, and generates a second video signal of a luminance range narrower than the luminance range of the first video signal by performing a process of converting the luminance range of the first video signal according to one of the one or more second conversion modes based on the conversion auxiliary data; and an output unit (HDMI output unit 506) which outputs the second video signal and the HDR control information to display device 510 according to the predetermined transmission protocol.
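- Read as a pipeline, this output path can be loosely sketched as below. It is not the claimed structure itself; the function and parameter names are hypothetical. The second video signal is produced and output when the conversion is performed in the data output device; otherwise the first video signal is output together with the HDR control information and the meta data the display needs.

```python
from dataclasses import dataclass

@dataclass
class OutputDecision:
    convert_in_player: bool   # True: data output device converts, False: display converts
    mode: int                 # conversion mode to use

def output_video(first_signal, metadata, decision, dr_converter, hdmi_out):
    """Sketch of the output path.

    dr_converter(signal, metadata, mode) -> second (narrower-range) video signal
    hdmi_out(signal, hdr_control_info)   -> transmission over the protocol
    """
    hdr_control_info = {"characteristics": metadata.get("characteristics")}
    if decision.convert_in_player:
        second_signal = dr_converter(first_signal, metadata, decision.mode)
        hdmi_out(second_signal, hdr_control_info)
    else:
        # The display converts: send the first signal plus the meta data it needs.
        hdr_control_info["conversion_auxiliary_data"] = metadata.get("auxiliary")
        hdmi_out(first_signal, hdr_control_info)
```
- The interpreter (HDR meta interpreter 503) further determines which one of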
data output device 500 and display device 510 performs the above conversion process based on the one or more first conversion modes, the one or more second conversion modes and a third conversion mode which is supported by display device 510 and converts a luminance range of a video signal. - According to this,
data output device 500 can determine which one of data output device 500 and display device 510 performs a conversion process based on the first conversion mode matching one or more items of meta data, the second conversion mode supported by data output device 500 and the third conversion mode supported by display device 510. Consequently, data output device 500 can determine a device which appropriately performs a conversion process. - In addition, the one or more second conversion modes supported by
data output device 500 may include at least part of a plurality of first conversion modes matching the one or more items of meta data, or may not include any one of the one or more first conversion modes. Similarly, the third conversion mode supported by display device 510 may include at least part of the one or more first conversion modes, or may not include any one of the one or more first conversion modes. Further, the third conversion mode may include at least part of the one or more second conversion modes, or may not include any one of the one or more second conversion modes. - [16. Configuration of DR Converter]
- A configuration example of
DR converter 505 will be described below. FIG. 17 is a block diagram illustrating the configuration example of DR converter 505. This DR converter 505 includes mode determining unit 511, N mode processors 512 and conversion result output unit 513. N mode processors 512 each support one of N conversion modes (processing methods), and each performs a process of the corresponding conversion mode. Mode determining unit 511 obtains the conversion mode instructed by HDR meta interpreter 503, and determines mode processor 512 which performs a conversion process. That is, mode determining unit 511 selects mode processor 512 which supports the conversion mode instructed by HDR meta interpreter 503. The determined mode processor 512 generates an SDR signal (converted video signal) by performing a process of converting an HDR signal (video signal). Conversion result output unit 513 outputs the converted SDR signal. -
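A minimal sketch of this dispatch is given below, with hypothetical class names: the mode determining unit looks up the mode processor that supports the instructed conversion mode and returns its converted output.

```python
class ModeProcessor:
    """One of the N mode processors: converts an HDR signal in one conversion mode."""
    def __init__(self, mode_id, convert_fn):
        self.mode_id = mode_id
        self.convert_fn = convert_fn

    def process(self, hdr_signal, hdr_metadata):
        return self.convert_fn(hdr_signal, hdr_metadata)


class DRConverter:
    """Sketch of a DR converter: mode determining unit plus N mode processors."""
    def __init__(self, processors):
        self.processors = {p.mode_id: p for p in processors}

    def supported_modes(self):
        return set(self.processors)

    def convert(self, hdr_signal, hdr_metadata, instructed_mode):
        # Mode determining unit: select the processor for the instructed mode.
        processor = self.processors.get(instructed_mode)
        if processor is None:
            raise ValueError(f"conversion mode {instructed_mode} is not supported")
        # Conversion result output unit: return the converted (e.g. SDR) signal.
        return processor.process(hdr_signal, hdr_metadata)


# Usage with two hypothetical modes that narrow the luminance range differently.
converter = DRConverter([
    ModeProcessor(1, lambda sig, meta: [min(v, 100) for v in sig]),             # simple clip
    ModeProcessor(2, lambda sig, meta: [v * 100 / meta["peak"] for v in sig]),  # scale by peak
])
sdr = converter.convert([50, 400, 1000], {"peak": 1000}, instructed_mode=2)
```
-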
FIG. 18 is a block diagram illustrating a configuration example of DR converter 505A which is another example of DR converter 505. This DR converter 505A includes mode determining unit 521, basic processor 522, N extension mode processors 523 and conversion result output unit 524. -
Basic processor 522 performs a default conversion process which is a common process among N conversion modes. N extension mode processors 523 perform a process performed by basic processor 522 and, in addition, an extension process of dynamically controlling parameters of a conversion process by using dynamic HDR meta data. Further, N extension mode processors 523 each support one of N conversion modes, and perform the extension process of the corresponding conversion mode. For example, basic processor 522 operates by using only static HDR meta data, and extension mode processor 523 operates by using static HDR meta data and, in addition, dynamic HDR meta data. - [17. Operation Example of HDR Meta Interpreter]
-
FIGS. 19 and 20 are views illustrating examples of instruction contents of HDR meta interpreter 503 based on the conversion modes for which HDR meta data is provided, whether or not data output device 500 supports each mode, and whether or not display device 510 supports each mode. HDR meta interpreter 503 basically selects an operation which maximizes reproducibility for a master image, from selectable combinations. In this regard, the master image refers to an image output without changing a luminance range. - For example, in an example illustrated in
FIG. 19, data output device 500 supports mode 1 and mode 2, and display device 510 does not support any conversion mode. In addition, between mode 1 and mode 2, mode 2 has higher reproducibility for the master image. Further, HDR meta interpreter 503 learns reproducibility of each mode for a master image in advance. In this case, HDR meta interpreter 503 determines that data output device 500 performs a conversion process, and selects mode 2 of the higher reproducibility between mode 1 and mode 2. - Further, in an example illustrated in
FIG. 20, data output device 500 supports mode 1, and display device 510 supports mode 1 and mode 2. In this case, HDR meta interpreter 503 determines that display device 510 performs a conversion process, and selects mode 2 of the higher reproducibility between mode 1 and mode 2. Further, data output device 500 outputs HDR meta data matching a conversion process of mode 2 as HDMI control information (HDR control information) to display device 510. Display device 510 performs a conversion process of mode 2 by using the control information. - Thus, HDR
meta interpreter 503 further determines as a conversion mode of a conversion process to be performed by data output device 500 a conversion mode which is included in the one or more first conversion modes matching the one or more items of meta data and which is included in the one or more second conversion modes supported by data output device 500. More specifically, HDR meta interpreter 503 further determines as a conversion mode of a conversion process to be performed by data output device 500 or display device 510 a conversion mode which is included in the one or more first conversion modes matching the one or more items of meta data and which is included in at least one of the one or more second conversion modes supported by data output device 500 and the third conversion mode supported by display device 510. - More specifically, HDR
meta interpreter 503 determines as a conversion mode of a conversion process to be performed by data output device 500 or display device 510 a conversion mode of the highest reproducibility for a master image among a plurality of conversion modes included in a plurality of first conversion modes and included in at least one of a plurality of second conversion modes and the third conversion mode. - In other words,
data output device 500 selects a mode of the highest reproducibility among conversion modes supported by data output device 500 and display device 510, and determines that one device of data output device 500 and display device 510 supporting the selected mode performs a conversion process. - More specifically, as illustrated in
FIG. 19, HDR meta interpreter 503 determines that data output device 500 performs a conversion process when the determined conversion mode of the conversion process is included in the second conversion modes and is not included in the third conversion mode. Further, as illustrated in FIG. 20, HDR meta interpreter 503 determines that display device 510 performs a conversion process when the determined conversion mode of the conversion process is included in the third conversion mode and is not included in the second conversion modes. - According to this,
data output device 500 can determine a conversion mode to use based on the first conversion modes matching one or more items of meta data, the second conversion modes supported by the data output device and the third conversion mode supported by the display device. Further, data output device 500 can select the conversion mode of the highest reproducibility for a master image and, consequently, can improve quality of video images to be displayed. -
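The selection rule can be sketched as follows, under the assumption that the reproducibility ranking of each mode is known in advance (all names are hypothetical). The optional parameter check anticipates the case of FIG. 21 described next, in which a mode is skipped when the display parameters it needs cannot be obtained.

```python
def choose_conversion(meta_modes, player_modes, display_modes,
                      reproducibility, obtainable_params=None):
    """Return (device, mode) for the dynamic-range conversion, or None.

    meta_modes        : modes for which HDR meta data is available (first modes)
    player_modes      : modes the data output device supports (second modes)
    display_modes     : modes the display device supports (third modes)
    reproducibility   : dict mode -> score, higher means closer to the master image
    obtainable_params : modes for which display parameters can be obtained
                        (None means no parameter is needed)
    """
    candidates = []
    for mode in meta_modes:
        in_player = mode in player_modes
        in_display = mode in display_modes
        if not (in_player or in_display):
            continue
        # A mode run by the player may need display parameters (e.g. peak luminance).
        if (in_player and not in_display and obtainable_params is not None
                and mode not in obtainable_params):
            continue
        candidates.append(mode)
    if not candidates:
        return None
    mode = max(candidates, key=lambda m: reproducibility[m])
    device = "display" if mode in display_modes else "player"
    return device, mode


# FIG. 19-like case: player supports modes 1 and 2, display supports none.
assert choose_conversion({1, 2}, {1, 2}, set(), {1: 1, 2: 2}) == ("player", 2)
# FIG. 20-like case: player supports mode 1, display supports modes 1 and 2.
assert choose_conversion({1, 2}, {1}, {1, 2}, {1: 1, 2: 2}) == ("display", 2)
# FIG. 21-like case: mode 3 is best but its display parameters cannot be obtained.
assert choose_conversion({1, 2, 3}, {1, 2, 3}, {1}, {1: 1, 2: 2, 3: 3},
                         obtainable_params={1, 2}) == ("player", 2)
```
-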
FIG. 21 is a view illustrating an example where a conversion process is determined according to whether or not data output device 500 can obtain parameters of display device 510. A parameter of display device 510 is a peak luminance of display device 510 (a maximum value of a luminance range which display device 510 can display) or a display mode which display device 510 can display. More specifically, as the display mode, this parameter indicates the display mode that is currently being used for viewing. For example, the display modes include a normal mode, a dynamic mode and a cinema mode. - In an example illustrated in
FIG. 21, data output device 500 supports mode 1, mode 2 and mode 3, and display device 510 supports mode 1. Further, data output device 500 can obtain parameters of display device 510 for mode 1 and mode 2, and cannot obtain a parameter of display device 510 for mode 3. Furthermore, mode 2 has higher reproducibility than that of mode 1, and mode 3 has higher reproducibility than that of mode 2. - In this case, a mode of the highest reproducibility among the modes supported by
data output device 500 and display device 510 is mode 3. However, data output device 500 cannot obtain the parameter of display device 510 for mode 3, and therefore mode 3 is excluded. Further, data output device 500 selects mode 2, whose reproducibility is the second highest to that of mode 3 and whose parameter can be obtained, as a conversion mode to use. Furthermore, data output device 500 obtains parameters which are necessary for mode 2 from display device 510, and performs a conversion process of mode 2 by using the obtained parameters. - Thus, HDR
meta interpreter 503 further determines a conversion mode of a conversion process performed by data output device 500 or display device 510 according to whether or not it is possible to obtain from display device 510 parameters for each of a plurality of first conversion modes matching a plurality of items of meta data. More specifically, HDR meta interpreter 503 determines as a conversion mode of a conversion process to be performed by data output device 500 or display device 510 a conversion mode which is included in a plurality of first conversion modes and included in at least one of a plurality of second conversion modes and the third conversion mode, and which makes it possible to obtain the parameters from display device 510. - That is,
data output device 500 selects a mode of the highest reproducibility among the conversion modes supported by data output device 500 and display device 510, and determines whether or not it is possible to obtain a parameter of display device 510 for the selected mode when only data output device 500 supports the selected mode. When the parameter can be obtained, data output device 500 selects this mode. Meanwhile, when the parameter cannot be obtained, data output device 500 selects another mode (a mode of the second highest reproducibility). - Thus,
data output device 500 determines a conversion mode to use according to whether or not it is possible to obtain the parameter of display device 510 and, consequently, can select a more appropriate conversion mode. - [18. Configuration Example 2 of Data Output Device]
- Another configuration example of the data output device will be described below.
FIG. 22 is a block diagram illustrating a configuration of data output device 500A. This data output device 500A further includes DC 507 compared to data output device 500 illustrated in FIG. 16. DC 507 down-converts a resolution of a video signal obtained by video decoder 501. For example, when a video signal is 4K, DC 507 down-converts a 4K video signal into a 2K video signal. - According to this configuration,
data output device 500A can selectively perform an operation of (1) converting a 4K HDR signal into a 2K HDR signal to output, (2) converting the 4K HDR signal into the 2K HDR signal, and then changing the dynamic range in DR converter 505 to output, and (3) converting the 4K SDR signal into a 2K SDR signal to output, according to a resolution and a dynamic range supported by display device 510. That is, data output device 500A can switch an operation according to a resolution of display device 510 and whether or not display device 510 supports an HDR signal. -
FIG. 23 is a view illustrating an example of combinations of characteristics of a video signal of content (a resolution and a dynamic range (luminance range)), characteristics of display device 510 and an output signal of data output device 500A. Data output device 500A selects a format of an output signal to match a resolution of display device 510 and whether or not display device 510 supports an HDR signal, and controls DC 507 and DR converter 505 to generate an output signal of the selected format. - When, for example, a video signal of content is an HDR signal of a 4K resolution, and
display device 510 does not support displaying the HDR signal of the 4K resolution and supports displaying an HDR signal of a 2K resolution, data output device 500A converts the video signal of the content into an HDR signal of the 2K resolution to output (see the combination example in the second row in FIG. 23). In this case, DC 507 converts a resolution of a video signal. - Further, when a video signal of content is an HDR signal of a 4K resolution, and
display device 510 does not support displaying either the HDR signal of the 4K resolution or an HDR signal of a 2K resolution, and supports displaying a 2K SDR signal, data output device 500A converts the video signal of the content into an SDR signal of the 2K resolution to output (see the combination example in the third row in FIG. 23). In this case, DC 507 converts a resolution of a video signal, and DR converter 505 converts a luminance range.
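- The combinations of FIG. 23 reduce to two independent decisions, sketched below with hypothetical names: down-convert when the display cannot show the content resolution, and convert the luminance range when the display cannot show HDR.

```python
def select_output_format(content_res, content_is_hdr, display_res, display_supports_hdr):
    """Return (output_resolution, output_range, use_dc, use_dr_converter)
    following the combinations of FIG. 23.  Resolutions are given as
    vertical line counts (e.g. 2160 for 4K, 1080 for 2K)."""
    use_dc = content_res > display_res                       # down-converter: lower the resolution
    use_dr = content_is_hdr and not display_supports_hdr     # DR converter: HDR -> SDR
    out_res = display_res if use_dc else content_res
    out_range = "SDR" if (not content_is_hdr or use_dr) else "HDR"
    return out_res, out_range, use_dc, use_dr


# 4K HDR content on a 2K HDR display: down-convert only (second row of FIG. 23).
assert select_output_format(2160, True, 1080, True) == (1080, "HDR", True, False)
# 4K HDR content on a 2K SDR display: down-convert and convert to SDR (third row).
assert select_output_format(2160, True, 1080, False) == (1080, "SDR", True, True)
```
- Consequently,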
display device 510 can more faithfully reproduce video signals of content. In addition, data output device 500A may convert a resolution or display device 510 may operate to convert a dynamic range as described with reference to FIG. 16. - Thus,
data output device 500A includes a down-converter (DC 507) which generates a third video signal by lowering a resolution of the first video signal obtained by video decoder 501. The converter (DR converter 505) further generates a fourth video signal of a luminance range narrower than a luminance range of the third video signal by performing a process of converting the luminance range of the third video signal according to one of a plurality of second conversion modes based on the conversion auxiliary data. The output unit (HDMI output unit 506) further outputs the third video signal or the fourth video signal to display device 510. - Consequently,
data output device 500A can change a resolution of a video signal to, for example, a resolution suitable to display device 510 or the like. - More specifically, when
display device 510 does not support displaying a video image of a resolution of the first video signal, (1) the down-converter (DC 507) generates the third video signal and (2) the output unit (HDMI output unit 506) outputs the third video signal to display device 510. As illustrated in, for example, FIG. 23, when a resolution of a video signal is 4K and a resolution of display device 510 is 2K, a 2K output signal is output. - Further, when
display device 510 does not support displaying a video image of a luminance range (HDR) of the first video signal, (1) the converter (DR converter 505) generates the second video signal of a luminance range (SDR) narrower than the luminance range (HDR) of the first video signal, and (2) the output unit (HDMI output unit 506) outputs the second video signal and HDR control information to display device 510. When, for example, a dynamic range (luminance range) of a video signal is an HDR and display device 510 does not support the HDR (in case of an SDR) as illustrated in FIG. 23, an HDR video signal is converted into an SDR video signal and the SDR video signal (output signal) is output. - Further, when
display device 510 does not support displaying a video image of the resolution of the first video signal, and does not support displaying a video image of the luminance range (HDR) of the first video signal, (1) the down-converter (DC 507) generates a third video signal, (2) the converter (DR converter 505) generates the fourth video signal of a luminance range (SDR) narrower than the luminance range (HDR) of the third video signal, and (3) the output unit (HDMI output unit 506) outputs the fourth video signal to display device 510. When, for example, a resolution of a video signal is 4K, a dynamic range (luminance range) of the video signal is an HDR, the resolution of display device 510 is 2K, and display device 510 does not support the HDR (in case of an SDR) as illustrated in FIG. 23, a 2K SDR output signal is output. - [19. Operation Model of Playing Back HDR Signal and 4K Signal]
-
FIG. 24 is a view illustrating an example of an operation model of playing back a 4K HDR signal, a 2K HDR signal and a 4K SDR signal in a next-generation Blu-ray playback device, and outputting playback signals to an HDR supporting 4K TV, anHDR non-supporting 4K TV and an SDR supporting 2K TV. - The Blu-ray playback device obtains static HDR meta data stored in content management information, and dynamic HDR meta data stored in a video encoded stream. By using these items of HDR meta data, the Blu-ray playback device converts a video HDR signal into an SDR signal to output according to characteristics of an output destination TV connected according to HDMI, or outputs HDR meta data as a HDMI control signal.
- Each process of converting an HDR signal into an SDR signal and a process of converting an HDR signal into a video signal of a luminance range matching a display device can be selected from a plurality of methods and implemented. By storing HDR meta data matching the implemented conversion process, in content management information or a video encoded stream during creation of content, it is possible to enhance an effect of the conversion process. The content management information or the encoded stream can store a plurality of items of HDR meta data per converting method.
- In addition, the Blu-ray playback device may include a plurality of conversion processors such as option conversion module B or option conversion module D, may include only one conversion processor by taking into account a balance between device cost and performance or may not include a conversion processor. Similarly, an HDR supporting TV may include a plurality of conversion processors, may include only one conversion processor or may not include a conversion processor.
- Further, similar to a user data storage SEI message illustrated in
FIG. 14 or 15 , HDR meta data is stored in a predetermined container which defines a format and an operation during an input. Consequently, even when a new conversion process is developed in future, new HDR meta data is defined and a display device which supports this new HDR meta data is connected to a Blu-ray device which does not support the new HDR meta data, it is possible to output new HDR meta data from the Blu-ray playback device to the display device. Further, the display device can perform a conversion process matching new HDR meta data. Consequently, when a new technique is developed, it is possible to support the new technique by a simple process of assigning an ID to new HDR meta data. Consequently, it is possible to enhance competitiveness of package media standards such as Blu-ray against applications such as OTT whose technical development is fast. In addition, the Blu-ray playback device which supports new HDR meta data may perform the new conversion process on video data in the playback device, and output the processed video data to the display device. - Further, which one of the Blu-ray playback device and a TV performs a conversion process is determined based on the methods illustrated in
FIGS. 19 to 21 . In addition, the playback device may down-convert a 4K signal into a 2K signal according to a resolution of the TV to output. - [20.
Method 1 for Storing HDR Meta Data] -
FIG. 25 is a view illustrating an example of a method for storing static HDR meta data and two items of dynamic HDR meta data. As illustrated inFIG. 20 , an extendable HDR method according to the present exemplary embodiment, three items of (a) static HDR meta data, (b) dynamic HDR meta data clip (dynamic HDR meta data) and (c) dynamic HDR meta data are used. - (a) Static HDR meta data is stored in a meta data storage area of each stream (a playlist in case of a BDA (Blu-ray Disc Association)) defined by application standards of the BDA or the like or a distribution system. (b) A dynamic HDR meta data clip (dynamic HDR meta data) is stored in a secondary use TS stream defined by application standards of the BDA or the like or a distribution system. (c) Dynamic HDR meta data is stored as a SEI message included in a video stream such as HEVC.
- By properly using these three items of data, it is possible to change a combination of items of meta data to use when a new HDR technique is introduced. Consequently, it is possible to change conditions to introduce a new HDR technique. For example, when an original HDR technique needs to be introduced as soon as possible without considering compatibility, it is possible to introduce the original HDR technique without influencing application standards or a distribution system by using only meta data (c). By contrast with this, when a new technique needs to be defined by application standards or a distribution system by considering compatibility even though a time is taken more or less, it is possible to realize both of compatibility and timely introduction of a new technique by using the items of meta data (a) and (b).
- [21.
Method 2 For Storing HDR Meta Data] - An example of how to use the three items of meta data (a) to (c) illustrated in
FIG. 25 will be described in detail by using Blu-ray as an example. - First, a case where a proponent of a new HDR technique wishes early implementation will be described. In this case, only meta data (c) is used. (1) The proponent discloses only an outline of the new HDR technique. (2) A test disk in which meta data of the new technique for checking compatibility with an existing HDR (basic portion) Blu-ray playback device is provided. (3) The BDA registers the new technique as a non-official option, does not test non-compatibility and takes no responsibility. (4) The BDA takes no responsibility.
- Next, a case where the proponent of the new HDR technique considers compatibility to widely spread the new technique will be described. In this case, only the items of meta data (a) and (b) are used. (1) The proponent discloses details of the technique. (2) A draft of a specification for Blu-ray to adapt the technique to Blu-ray is submitted. (3) A draft of a test specification for Blu-ray to adapt the technique to Blu-ray is submitted. (4) A test stream is provided. (5) A test disk is provided. (6) A verifier is updated. (7) The BDA registers the new technique as an official option, annexes the new technique to written standards and tests compatibility at minimum. (8) The BDA permits an announcement that the new technique is adopted as an official option by the BDA.
- [22. User Guidance Display Method 1]
-
FIG. 26 is a view illustrating a method for displaying a user guidance in a Blu-ray device which executes an HDR-SDR conversion process. - An algorithm of an HDR-SDR conversion process is not established, and therefore it is difficult to accurately perform HDR-SDR conversion in a current situation. Further, it is also possible to implement a plurality of algorithms of an HDR-SDR conversion process.
- Hence, when a user inserts an HDR supporting disk into an HDR supporting Blu-ray device connected to an HDR non-supporting TV, it is necessary to appropriately guide users.
- When the HDR supporting Blu-ray device connected to the HDR non-supporting TV detects a start of an HDR-SDR conversion process, a guide message such as “The disk is an HDR non-supporting disk. Your TV is HDR non-supporting TV, and SDR video image converted from HDR into SDR by the Blu-ray device is played back instead of HDR video image.” is displayed.
- Thus, when the display device does not support a video output of a luminance range of the first video signal (HDR signal), the data output device (Blu-ray device) outputs the second video signal (SDR signal) converted from a first luminance range into a second luminance range, and HDR control information to the display device, and causes the display device to display something to the effect that the second video signal converted from the first luminance range into the second luminance range is displayed.
- [23. User Guidance Display Method 2]
-
FIG. 27 is a view illustrating a method for displaying a user guidance during execution of a process of converting an HDR stored in a disk into an SDR. - A message (menu) which needs to be displayed by a Blu-ray device when an HDR-SDR conversion process is performed is stored in an HDR disk or a non-volatile memory in the Blu-ray device. Consequently, the Blu-ray device can display a message during execution of an HDR-SDR conversion process. In this case, for example, a message such as “The disk is an HDR supporting disk. Your TV is HDR non-supporting TV, and SDR video image converted from HDR into SDR by the Blu-ray device is played back instead of HDR video image.” is displayed.
- [24. User Guidance Display Method 3]
-
FIG. 28 is a view illustrating a method for displaying a user guidance menu during execution of a process of converting an HDR stored in a disk into an SDR. - By using a Blu-ray menu, the Blu-ray device can display a message such as “The disk is HDR supporting disk. Your TV is HDR non-supporting TV, and SDR video image converted from HDR into SDR by the Blu-ray device is played back instead of HDR video image. Would you like to play back SDR video image?”. When a user pushes “Play back” button, the Blu-ray device starts displaying converted image. Further, when the user selects “Do not play back”, the Blu-ray device stops playback, and displays a message which encourages the user to insert an HDR non-supporting Blu-ray disc.
- Thus, when the display device does not support a video output of a luminance range of the first video signal (HDR signal), the data output device (Blu-ray device) causes the display device to display a message which encourages the user to select whether or not to display the second video signal (SDR signal) converted from the first luminance range into the second luminance range.
- [25. User Guidance Display Method 4]
-
FIG. 29 is a view illustrating a method for displaying a user guidance menu which enables selection of a processing method during execution of a process of converting an HDR stored in a disk into an SDR. - The Blu-ray device displays something to the effect that meta data for an HDR-SDR conversion process is stored in Blu-ray when the meta data is stored in Blu-ray. When the user selects a specified converting method, the Blu-ray device displays a message indicating that more beautiful conversion is possible. That is, according to a Java (registered trademark) command in a disk, what HDR-SDR conversion process is implemented in the Blu-ray device is determined. Consequently, the Blu-ray device can display a selection menu of an HDR-SDR conversion processing method, such as “The disk is HDR supporting disk. Your TV is HDR non-supporting TV, and SDR video image converted from HDR into SDR by the Blu-ray device is played back instead of HDR video image. Which method do you choose? (Play back by process 1), (Play back by process 3) and (Do not play back)”. In addition, in this regard,
process 1 andprocess 3 are different types of HDR-SDR conversion processes. - Thus, when the display device does not support a video output of a luminance range of the first video signal (HDR signal), the data output device (Blu-ray device) causes the display device to display a message which encourages the user to select one of a plurality of converting methods for converting the first luminance range into the second luminance range.
- [26. User Guidance Display Method 5]
- In addition, it is also possible to display the same message by broadcasting, too. For example, a TV or a playback device which does not support an HDR signal displays a message by using a data broadcast application that a broadcast program uses HDR signals and cannot be accurately displayed when the program is viewed. Further, a TV or a playback device which supports an HDR signal may not display this message. Furthermore, a tag value indicating a message attribute indicates that the message is a warning message for HDR signals. The TV or the playback device which supports HDR signals determines that it is not necessary to display a message by referring to a tag value.
- [27. Method For Transmitting HDR Meta Data]
- For example, dynamic HDR meta data or static HDR meta data adopts a data structure which can be transmitted according to HDMI. In this regard, depending on a specification or a version of a transmission protocol such as HDMI, whether or not it is possible to transmit HDR meta data to the display device according to the transmission protocol is determined.
- First, the method for transmitting dynamic HDR meta data will be described.
- For example, according to existing HDMI2.0, it is not possible to transmit dynamic HDR meta data which is variable in frame or scene units. Therefore, it is necessary to extend standards and newly define a packet for transmitting the dynamic HDR meta data. A version of this extension standards is 2.1.
- In this case, when the playback device such as an HDR supporting Blu-ray device or a broadcast receiving device is connected with a display device such as a TV via HDMI2.1, the playback device can transmit dynamic HDR meta data to the display device. However, when the playback device and the display device are connected according to HDMI of an older version than 2.1, the playback device cannot transmit the dynamic HDR meta data to the display device.
- First, the playback device determines whether or not a HDMI version which can establish connection with the display device supports transmission of dynamic HDR meta data. When the version does not support transmission of dynamic HDR meta data, the playback device performs an HDR-SDR conversion process by using dynamic HDR meta data, and then outputs the converted signal to the display device according to HDMI.
- Further, the playback device may operate also based on whether or not the display device supports a conversion process performed by using dynamic HDR meta data. That is, when the display device does not support a conversion process, and even when the playback device can transmit dynamic HDR meta data according to a HDMI version, the playback device may perform a conversion process. Further, when the playback device does not support a conversion process performed by using dynamic HDR meta data, the playback device may not perform a conversion process and may not transmit the dynamic HDR meta data to the display device, either.
-
FIG. 30 is a flowchart illustrating a method of the playback device for transmitting dynamic HDR meta data. First, the playback device determines whether or not the playback device and the display device are connected according to HDMI2.0 or an older version than HDMI2.0 (S501). In other words, the playback device determines whether or not the playback device and the display device can be connected according to HDMI2.1 which supports transmission of dynamic HDR meta data. More specifically, the playback device determines whether or not both of the playback device and the display device support HDMI2.1. - When the playback device and the display device are connected according to HDMI2.0 or an older version than HDMI2.0 (Yes in S501), the playback device performs a conversion process by using dynamic HDR meta data and transmits converted image data to the display device according to HDMI (S502). The conversion process described herein is a process of changing a luminance range of image data, and is a process of converting an HDR into an SDR to match a luminance range supported by the display device or a process of converting an HDR into an HDR signal of a narrower luminance range.
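A compact sketch of this behavior is given below. The helper names are hypothetical, the negotiated link version is assumed to be available as a (major, minor) tuple, and HDMI2.1 is assumed, as above, to be the first version that can carry dynamic HDR meta data.

```python
def send_dynamic_hdr(link_version, video, dynamic_meta,
                     convert, send_video, send_meta):
    """Sketch of the FIG. 30 behavior.

    link_version : HDMI version negotiated between playback and display device,
                   e.g. (2, 0) or (2, 1)
    convert(video, meta) -> video with a narrowed luminance range
    send_video / send_meta : transmission helpers for the protocol
    """
    if link_version < (2, 1):
        # S501 Yes -> S502: the link cannot carry dynamic HDR meta data, so the
        # playback device converts and sends only the converted video.
        send_video(convert(video, dynamic_meta))
    else:
        # S501 No -> S503: send the unconverted video and the dynamic HDR
        # meta data in separate packet types.
        send_video(video)
        send_meta(dynamic_meta)
```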
- Meanwhile, when the playback device and the display device are connected according to HDMI2.1 or a newer version than HDMI2.1 (No in S501), the playback device transmits image data for which a conversion process is not yet performed, and dynamic HDR meta data to the display device according to HDMI by using different types of packets (S503).
- Next, a method for transmitting static HDR meta data will be described.
- It is possible to use Infoframe such as AVI (Auxiliary Video Information) Infoframe to transmit static HDR meta data according to HDMI. However, a maximum data size which can be stored in AVI Infoframe is 27 bytes according to HDMI2.0, and therefore data having a larger size than this maximum data size cannot be processed. Hence, when a static HDR meta data size exceeds an upper limit value at which data can be transmitted according to HDMI, the playback device transmits data for which a conversion process has been performed, to the display device. Alternatively, when a static HDR meta data size which can be transmitted differs depending on a HDMI version, the playback device determines whether to transmit static HDR meta data to the display device based on a HDMI version for connecting the playback device and the display device, and perform a conversion process in the playback device.
- Further, the static HDR meta data may be classified into a required portion and an extension portion, and a size of the required portion may be set to a size or less which can be transmitted according to a specific version of a specific transmission protocol such as existing HDMI2.0. For example, the playback device may transmit only the required portion to the display device when using HDMI2.0, and may transmit the required portion and the extension portion together when using HDMI2.1. Further, identification information indicating that static HDR meta data includes a necessary portion and an extension portion or indicating that at least the required portion can be transmitted according to a specific version such as HDMI2.0 may be stored in a database such as PlayList or PlayItem in a Blu-ray disc.
- Alternatively, more simply, static HDR meta data may be set to a size or less which can be transmitted according to a lowest version such as HDMI2.0 which enables transmission of static HDR meta data. In addition, a syntax of static HDR meta data in a disk stored in management information such as a playlist or video stream SEI, and a syntax of static HDR meta data which is transmitted according to HDMI may be different. When both of the syntaxes are different, the playback device converts the static HDR meta data in the disk into the syntax of the static HDR meta data according to the transmission protocol to output.
- In addition, content of Blu-ray has been described as an example. However, the same applies to meta data which is used for broadcasting or the OTT.
-
FIG. 31 is a flowchart illustrating a method of the playback device for transmitting static HDR meta data. First, the playback device determines whether or not the playback device and the display device are connected according to HDMI2.0 or an older version than HDMI2.0 (S511). - When the playback device and the display device are connected according to HDMI2.0 or an older version than HDMI2.0 (Yes in S511), the playback device transmits only a required portion of static HDR meta data to the display device according to HDMI (S512).
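The corresponding handling of static HDR meta data can be sketched the same way, assuming the meta data has already been split into a required portion and an extension portion as described above (names are hypothetical).

```python
def send_static_hdr(link_version, static_meta, send_infoframe):
    """Sketch of the FIG. 31 behavior: static_meta is a dict holding a
    'required' portion (always transmitted) and an 'extension' portion
    (transmitted only when the link version supports it), both as bytes."""
    if link_version < (2, 1):
        # S511 Yes -> S512: only the required portion fits (e.g. within the
        # 27-byte AVI Infoframe limit of the older version).
        send_infoframe(static_meta["required"])
    else:
        # S511 No -> S513: send required and extension portions together.
        send_infoframe(static_meta["required"] + static_meta["extension"])
```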
- Meanwhile, when the playback device and the display device are connected according to HDMI2.1 or a newer version than HDMI2.1 (No in S511), the playback device transmits both of a required portion and an extension portion of static HDR meta data to the display device according to HDMI (S513).
- Thus, the playback device switches whether or not to transmit dynamic HDR meta data to the display device according to a HDMI version, yet transmits at least a required portion of static HDR meta data to the display device at all times irrespectively of the HDMI version.
- That is, the playback device transmits a video signal to the display device. When a version of a transmission protocol which connects the playback device and the display device is a first version (e.g. HDMI2.0), the playback device transmits, to the display device, first meta data (static HDR meta data) which is information which is commonly used for a plurality of images included in continuous playback units of the video signal and relates to a luminance range of the video signal, without transmitting, to the display device, second meta data (dynamic HDR meta data) which is information which is commonly used for units subdivided compared to the continuous playback units of the video signal and relates to the luminance range of the video signal. Further, when the version of the transmission protocol is the second version (e.g. HDMI2.1), the playback device transmits both of the first meta data (static HDR meta data) and the second meta data (dynamic HDR meta data) to the display device.
- Consequently, the playback device can transmit appropriate meta data to the display device according to the version of the transmission protocol.
- Further, when the version of the transmission protocol is the first version (e.g. HDMI2.0) (Yes in S501), the playback device performs a process of converting a luminance range of a video signal by using the second meta data (dynamic HDR meta data), and transmits the converted video signal to the display device (S502).
- Thus, when dynamic HDR meta data cannot be transmitted to the display device and the display device cannot perform a conversion process, the playback device can perform a conversion process.
- Further, when the version of the transmission protocol is the second version (e.g. HDMI2.1) and the display device does not support a conversion process, the playback device performs a conversion process, transmits the converted video signal to the display device and does not transmit the second meta data to the display device. Furthermore, when the version of the transmission protocol is the second version (e.g. HDMI2.1) and the display device supports a conversion process, the playback device transmits a video signal and the second meta data to the display device without performing a conversion process.
- Consequently, one appropriate device of the playback device and the display device can execute a conversion process.
- Further, when the playback device does not support a conversion process of converting a luminance range of a video signal by using the second meta data (dynamic HDR meta data), the playback device transmits the video signal to the display device without performing the conversion process, and does not transmit the second meta data (dynamic HDR meta data) to the display device.
- [28. Adjustment of Luminance Value]
- How to use HDR meta data to faithfully reproduce an HDR signal and HDR-SDR conversion in the playback device have been described above. However, an HDR signal has a substantially high peak luminance than that of a conventional SDR signal. Therefore, the playback device may control a peak luminance of a video image by taking into account performance of a panel or a signal processing circuit of the display device such as a TV or an influence on a human body. In addition, the process described below (playback method) may be performed by the playback device such as a Blu-ray device or may be performed by the display device such as a TV. In other words, the playback device described below only needs to have a function of playing back video images, and includes the above-described playback device (e.g. Blu-ray device) and the display device (e.g. TV).
- In this regard, even when an upper limit of a luminance value which can be output from each pixel of a TV panel is 1000 nit, an area which can output a luminance of 1000 nit simultaneously is assumed to be limited to 50% of a screen. In this case, even when 70% of an area of a screen is 1000 nit, a signal value of an HDR signal cannot be output as is. Hence, the playback device may control a luminance value of each pixel to play back an HDR signal based on following playback conditions.
- First, a first method will be described. The playback device adjusts a luminance value such that an inter-screen luminance change amount at reference time interval T is threshold P or less. Reference time interval T described herein is, for example, an integer multiple of a reciprocal of a video frame rate.
- Threshold P is an absolute value of a luminance or a rate of a change of a luminance value. This threshold P is determined based on an influence which a flash of an image has on a human body or following performance of a TV panel for a change of a signal value.
- Further, conditions may be set such that a number of pixels whose intra-screen luminance value change amounts exceeds threshold P is a predetermined rate or less. Furthermore, the screen may be divided into a plurality of areas, and the same or different conditions may be set per area.
- Next, a second method will be described. The playback device adjusts a luminance value such that a number of pixels which have luminances of reference luminance S or more or a rate that these pixels occupy in total pixels in a screen is threshold Q or less.
- Reference luminance S and threshold Q are determined based on an influence on a human body or an upper limit value of a voltage which is simultaneously applicable to each pixel of a TV panel.
- Further, when parameters (threshold P, reference luminance S and threshold Q) used for the first method and the second method are set based on performance of a TV panel, values of the parameters can be set per TV.
- A method for controlling a pixel value according to the first method will be described below. For example, a peak luminance of a plurality of pixels configuring a frame at time t is assumed to be L1. When a luminance value of a pixel whose coordinate is (i,j) in a frame at time t+T is I(i,j), the playback device adjusts a luminance value for each pixel whose absolute value of a difference between I(i,j) and L1 exceeds threshold P such that the difference is threshold P or less. This process may be performed on an entire screen or may be performed per area by dividing the screen to perform processes in parallel. For example, the playback device divides the screen in a horizontal direction and a vertical direction, respectively, and adjusts a luminance value such that a change amount of a luminance in each area is threshold P or less.
- Further, a frame interval to display images on a TV panel is assumed to be as reference time interval T. However, there is a case where, when a luminance value is adjusted based only on a luminance value of a last frame, continuity of luminance values between frames is sometimes lost. Hence, a predetermined time constant may be set, and the playback device may determine a luminance value (above L1) by adding a weight to a peak luminance of each frame in a range of the set time constant. In this case, a time constant and a weighting coefficient are set in advance such that a change amount of a luminance value is threshold P or less.
- Next, a method for controlling a pixel value according to the second method will be described. This control method includes following two types of methods. The first method is a method for clipping a luminance value for each pixel whose luminance value exceeds a predetermined value. For example, a luminance value of each pixel whose luminance value exceeds the predetermined value is adjusted to the predetermined value.
- The second method is a method for entirely lowering a luminance value of each pixel in the screen such that a relative luminance value rate between pixels is held as much as possible by, for example, setting Knee point instead of uniformly clipping each luminance value. Alternatively, a luminance value of a high luminance portion may be lowered while a luminance value of a low luminance portion is held.
- For example, it is assumed that the number of pixels of a TV panel is 8 mega pixels, and that a total sum of luminance values of all pixels in a screen is limited to 8 mega×500 nit=4 giga nit or less. In this regard, it is assumed that a luminance value of an HDR signal of content is 400 nit in area A (4 mega pixels) which is half of the screen, and is 1000 nit in area B (4 mega pixels) which is the rest of the half. In this case, when each luminance value is uniformly clipped, all luminance values in area B are clipped to 600 nit. As a result, a total sum of luminance values of all pixels is 4 mega×400+4 mega×600=4 giga nit and satisfy the above limitation.
- In addition, not only to play back an HDR signal but also to generate an HDR signal, a luminance value of each pixel in a frame of a video or a still image may be determined such that the conditions of the above first method or second method are satisfied.
-
FIG. 32 is a flowchart of a method for controlling a luminance value during playback of an HDR signal. First, the playback device determines whether or not an inter-screen luminance value change amount or an intra-screen luminance value satisfies playback conditions (S521). More specifically, as described above, the playback device determines whether or not the inter-screen luminance value change amount is the threshold or less or the intra-screen luminance value is the threshold or less. - When the inter-screen luminance value change amount or an intra-screen luminance value satisfies the playback conditions, i.e., when the inter-screen luminance change amount is the threshold or less or the intra-screen luminance value is the threshold or less (Yes in S521), the playback device outputs a signal of the same luminance value as a luminance value of an input HDR signal (S522). That is, the playback device outputs a luminance value of an HDR signal without adjusting the luminance value.
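A rough sketch of the check and adjustment is given below; the thresholds and the clipping rule stand in for the first and second methods described above, and all names are hypothetical.

```python
def enforce_playback_conditions(frame, prev_peak, threshold_p, reference_s, threshold_q):
    """Sketch of the S521/S523 adjustment.  `frame` is a list of pixel luminances
    in nit and prev_peak is the peak luminance L1 of the previous frame."""
    # First method: limit how far any pixel may rise above the previous peak L1,
    # so the inter-screen change amount stays at threshold P or less.
    adjusted = [min(v, prev_peak + threshold_p) for v in frame]

    # Second method: allow at most threshold_q pixels above reference luminance S;
    # the dimmest offenders are clipped down to S, the brightest are kept.
    over = sorted((i for i, v in enumerate(adjusted) if v > reference_s),
                  key=lambda i: adjusted[i], reverse=True)
    for i in over[threshold_q:]:
        adjusted[i] = reference_s
    return adjusted


# Three pixels exceed 500 nit but only two are allowed: the dimmest of them is clipped.
print(enforce_playback_conditions([400, 900, 1000, 800], prev_peak=600,
                                  threshold_p=300, reference_s=500, threshold_q=2))
```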
- Meanwhile, when the inter-screen luminance value change amount or the intra-screen luminance value does not satisfy the playback conditions, i.e., when the inter-screen luminance change amount exceeds the threshold or the intra-screen luminance value exceeds the threshold (No in S521), the playback device adjusts a luminance value of each pixel and outputs an adjusted luminance value to satisfy the playback conditions (S523). That is, the playback device adjusts the luminance value of each pixel such that the inter-screen luminance value change amount is the threshold or less or the intra-screen luminance value is the threshold or less.
- As described above, the playback device according to the present exemplary embodiment plays back video signals. A luminance of a video signal is a first luminance value in a first luminance range whose maximum luminance value is defined as a first maximum luminance value exceeding 100 nit. That is, the video signal is an HDR signal.
- As described above with reference to the first method, the playback device determines whether or not an inter-screen luminance value change amount of a video signal exceeds a predetermined first threshold (S521). For example, the playback device determines whether or not the luminance value change amount at a reference time interval which is an integer multiple of a reciprocal of a frame rate of the video signal exceeds the first threshold.
- When it is determined that the luminance value change amount exceeds the first threshold (No in S521), the playback device performs an adjustment process of lowering the luminance value of the video signal (S523). More specifically, for a pixel whose luminance value change amount exceeds the first threshold, the playback device adjusts a luminance value of the pixel such that the luminance value change amount of the pixel is the first threshold or less.
- Consequently, when a luminance value of a video signal exceeds display capability of the display device, the playback device can generate a video signal which the display device can appropriately display by lowering the luminance value of the video signal. Further, when a large change amount of a luminance value of a video signal is likely to negatively influence viewers, the playback device can reduce the negative influence by lowering the luminance value of the video signal.
- More specifically, in step S521, the playback device determines whether or not a difference between a peak luminance of a first image included in the video signal, and each of luminance values of a plurality of pixels included in a second image in the video signal subsequent to the first image exceeds the first threshold may be determined. In step S523, for a pixel whose difference exceeds the first threshold, the playback device adjusts a luminance value of the pixel such that the difference of the pixel is the first threshold or less.
- Alternatively, in step S521, the playback device determines whether or not a rate of pixels whose luminance value change amounts exceed the first threshold with respect to a plurality of pixels included in an image included in the video signal exceeds a second threshold. In step S523, when the rate exceeds the second threshold, the playback device adjusts luminance values of the plurality of pixels such that the rate is the second threshold or less.
- Further, in step S521, for each of a plurality of areas obtained by dividing the screen, the playback device determines whether or not an inter-screen luminance value change amount of each area exceeds the first threshold. In step S523, the playback device performs an adjustment process of lowering a luminance value of an area for which it is determined that the luminance value change amount exceeds the first threshold.
- Alternatively, as described above with reference to the second method, the playback device determines whether or not a luminance value of an image included in a video signal exceeds the predetermined first threshold (S521). When it is determined that the luminance value of each pixel exceeds the first threshold (No in S521), the playback device performs an adjustment process of lowering the luminance value of the image (S523).
- Consequently, when a luminance value of a video signal exceeds display capability of the display device, the playback device can generate a video signal which the display device can appropriately display by lowering the luminance value of the video signal. Further, when a high luminance value of a video signal is likely to negatively influence viewers, the playback device can reduce the negative influence by lowering the luminance value of the video signal.
- More specifically, in step S521, the playback device determines the number of pixels whose luminance values exceed the first threshold among a plurality of pixels included in an image. In step S523, when the number of pixels whose luminance values exceed the first threshold exceeds the third threshold, the playback device lowers the luminance value of the image such that the number of pixels whose luminance values exceed the first threshold is a third threshold or less.
- Alternatively, in step S521, the playback device determines a rate of pixels whose luminance values exceed the first threshold with respect to a plurality of pixels included in the image. In step S523, when the rate exceeds the third threshold, the playback device lowers the luminance value of the image such that the rate is the third threshold or less.
- Further, the first threshold, the second threshold and the third threshold are values calculated based on an upper limit value of a voltage which is simultaneously applicable to a plurality of pixels in a display device which displays video signals.
- [29. Method For Arranging Meta Data]
- A method for arranging static HDR meta data and dynamic HDR meta data in a video stream will be described below.
- Static HDR meta data may be stored in a head access unit in a decoding order in random access units such as GOP to store the static HDR meta data in a video stream by using SEI. In this case, a NAL unit including SEI is arranged prior to a NAL unit in which a video encoded data is stored in the decoding order.
- Further, the same meta data is used as these two items of dynamic HDR meta data to store the dynamic HDR meta data in both of management information such as a playlist, and a video stream.
- Furthermore, dynamic HDR meta data can be switched in random access units and is fixed in the random access units. For example, SEI in which dynamic HDR meta data is stored is stored in a head access unit in the random access units. Decoding starts from a head of the random access units to start playback from a middle of a stream. Further, during special playback such as high-speed playback of playing back only picture I and picture P, a head access unit in the random access units is decoded at all times. Hence, by storing HDR meta data in a head access unit in the random access units, the playback device can obtain HDR meta data at all times.
- In a stream according to MPEG-4 AVC or HEVC, only a head access unit in the decoding order in the random access units includes a Sequence Parameter Set (SPS) which is initialization information during decoding. It is possible to use this SPS as information indicating start of the random access units.
- Further, static HDR meta data and dynamic HDR meta data may be stored in different SEI messages. Both of the SEI messages are identified based on identification information included in a type of the SEI message or the payload of the SEI message. When, for example, transmitting only static HDR meta data according to HDMI, the playback device can extract only a SEI message including the static HDR meta data, and transmit meta data included in a payload as it is according to HDMI. Consequently, the playback device does not need to perform a process of analyzing a payload of the SEI message, and obtaining static HDR meta data.
- [30. Dual Disk Playback Operation 1]
- An operation of playing back an HDR disk in which only HDR signals are stored has been described above.
- Next, multiplexed data stored in a dual disk in which both of HDR signals and SDR signals are stored will be described with reference to
FIG. 33. FIG. 33 is a view for explaining multiplexed data stored in a dual disk. - As illustrated in
FIG. 33 , HDR signals and SDR signals are stored as different multiplexed streams in the dual disk. For example, items of data of a plurality of media such as a video, an audio, a caption and graphics are stored as one multiplexed stream in an optical disk such as Blu-ray according to a MPEG-2 TS-based multiplexing method called M2TS. A reference is made to these multiplexed streams from playback control meta data such as a playlist. During playback, a player analyzes meta data to select a multiplexed stream to play back or individual language data stored in the multiplexed stream. This example is a case where HDR and SDR playlists are individually stored, and the respective playlists refer to HDR signals or SDR signals. Further, identification information indicating that both of HDR signals and SDR signals are stored may be additionally indicated. - It is also possible to multiplex both of HDR signals and SDR signals in the same multiplexed stream. However, it is necessary to perform multiplexing to satisfy a buffer model such as T-STD (System Target Decoder) defined according to MPEG-2 TS. Particularly, it is difficult to multiplex two videos of high bit rates within a range of a predetermined data reading rate. Hence, it is desirable to demultiplex multiplexed streams.
- Data such as audio, captions and graphics needs to be stored for each multiplexed stream, so the data amount increases compared to a case where everything is multiplexed together with a single video stream. This increase can be reduced by lowering the video data amount with a highly efficient video encoding method. For example, by changing from MPEG-4 AVC, which is conventionally used for Blu-ray, to HEVC (High Efficiency Video Coding), the compression efficiency is expected to improve by a factor of about 1.6 to 2. Further, a dual disk may be limited to combinations that fit on an optical disk, for example a 2K HDR plus a 2K SDR, or a 4K SDR plus a 2K HDR, i.e., two 2K streams or a combination of 2K and 4K, while a combination of two 4K streams is prohibited.
- [31. Conclusion]
- A Blu-ray device which plays back a 4K supporting BD or an HDR supporting BD needs to support four types of TV: a 2K_SDR supporting TV, a 2K_HDR supporting TV, a 4K_SDR supporting TV and a 4K_HDR supporting TV. More specifically, the Blu-ray device needs to support three pairs of HDMI/HDCP (High-bandwidth Digital Content Protection) standards (HDMI1.4/HDCP1.4, HDMI2.0/HDCP2.1 and HDMI2.1/HDCP2.2).
- Further, when playing back the four types of Blu-ray discs (a 2K_SDR supporting BD, a 2K_HDR supporting BD, a 4K_SDR supporting BD and a 4K_HDR supporting BD), the Blu-ray device needs to select an appropriate process and an appropriate HDMI/HDCP pair per BD (content) and per connected display device (TV). Furthermore, when graphics are synthesized with a video, it is also necessary to change the process according to the BD type and the connected display device (TV) type.
- Hence, an internal process in the Blu-ray device becomes very complex. In the third exemplary embodiment, various methods for relatively simplifying a process in the Blu-ray device have been described.
- [1] It is necessary to convert an HDR into an SDR to display an HDR signal on an HDR non-supporting TV. By contrast with this, in the third exemplary embodiment, a configuration of a BD which is a dual streams disk has been proposed to make this conversion optional in a Blu-ray device.
- [2] Further, in the third exemplary embodiment, a graphic stream is limited, and types of combinations of video streams and graphic streams are reduced.
- [3] In the third exemplary embodiment, the dual streams disk and the graphic streams are restricted so as to substantially reduce the number of complex processing combinations in a Blu-ray device.
- [4] In the third exemplary embodiment, an internal process and an HDMI process have been described that remain consistent with the processing of a dual streams disk even when pseudo HDR conversion is introduced.
- According to a converting method according to the present disclosure, an HDR video image is displayed on an SDR TV by using the fact that the peak luminance the SDR TV can actually display exceeds 100 nit (generally, 200 nit or more). Instead of converting the HDR video image into an SDR video image of 100 nit or less, an "HDR→pseudo HDR conversion process" is realized which keeps, to some degree, the gradation of the area exceeding 100 nit and converts the HDR video image into a pseudo HDR video image that is close to the original HDR and that the SDR TV can display.
- Further, according to the converting method, the converting method of “HDR→pseudo HDR conversion process” may be switched according to display characteristics (a maximum luminance, input/output characteristics and a display mode) of the SDR TV.
- A method for obtaining display characteristics information includes (1) automatically obtaining the display characteristics information via HDMI (registered trademark) or a network, (2) generating the display characteristics information by having a user input information such as a manufacturer name or a model and (3) obtaining the display characteristics information from a cloud or the like by using information of the manufacturer name or the model.
- Further, a timing to obtain the display characteristics information of converting
device 100 includes (1) obtaining the display characteristics information immediately before pseudo HDR conversion, and (2) obtaining when connection with display device 200 (e.g. SDR TV) is established for the first time (when connection is established). - Furthermore, as for the converting method, the converting method may be switched according to HDR video image luminance information (CAL (Content Average Luminance) and CPL (Content Peak Luminance)).
- For example, a method for obtaining the HDR video image luminance information of converting
device 100 includes (1) obtaining the HDR video image luminance information as meta information accompanying an HDR video image, (2) obtaining the HDR video image luminance information by having the user input title information of content and (3) obtaining the HDR video image luminance information by using input information input by the user, from a cloud or the like. - Further, details of the converting method include (1) performing conversion such that a luminance does not exceed a DPL (Display Peak Luminance), (2) performing conversion such that CPL is DPL, (3) not changing a CAL and a luminance around the CAL, (4) performing conversion by using a natural logarithm and (5) performing a clip process by using the DPL.
- Furthermore, according to the converting method, it is also possible to transmit display settings such as a display mode and a display parameter of the SDR TV to display
device 200 to switch to enhance a pseudo HDR effect. For example, a message which encourages the user to make the display settings may be displayed on a screen. - [32.
Necessity 1 of Pseudo HDR] - Next, the necessity of a pseudo HDR will be described with reference to
FIGS. 34A to 34C . -
FIG. 34A is a view illustrating an example of a display process of converting an HDR signal in an HDR TV and displaying an HDR. - As illustrated in
FIG. 34A , when an HDR video image is displayed, and even when the display device is an HDR TV, it is not possible to display a maximum value of an HDR luminance range (peak luminance (HPL (HDR Peak Luminance): e.g. 1500 nit)) in some cases. In this case, luminance conversion for adjusting a linear signal which is inversely quantized by using an HDR EOTF, to a maximum value of a luminance range of the display device (peak luminance (DPL (Display Peak Luminance): e.g. 750 nit)) is performed. Further, by inputting to the display device a video signal obtained by performing the luminance conversion, it is possible to display an HDR video image which has been adjusted to the luminance range of the maximum value which is a limit of the display device. -
FIG. 34B is a view illustrating an example of a display process of displaying an HDR by using an HDR supporting playback device and an SDR TV. - As illustrated in
FIG. 34B , when an HDR video image is displayed, and when the display device is an SDR TV, that a maximum value (peak luminance (DPL: e.g., 300 nit)) of a luminance range of the SDR TV which displays the HDR video image exceeds 100 nit is used. In “HDR→pseudo HDR conversion process” in an HDR supporting playback device (Blu-ray device) inFIG. 34B , “HDR EOTF conversion” performed in an HDR TV and “luminance conversion” performed by using a DPL (e.g.: 300 nit) which is the maximum value of the luminance range of the SDR TV are performed. In this case, when it is possible to directly input a signal obtained by performing “luminance conversion” to the “display device” which is the SDR TV, it is possible to realize the same effect as that of the HDR TV even by using the SDR TV. - However, the SDR TV does not have means for receiving a direct input of such a signal from an outside, and therefore cannot realize the same effect as that of the HDR TV.
-
FIG. 34C is a view illustrating an example of a display process of displaying an HDR by using an HDR supporting playback device and an SDR TV which are connected with each other via a standard interface. - As illustrated in
FIG. 34C , it is generally necessary to input a signal for providing the effect inFIG. 34B , to the SDR TV by using an input interface (HDMI (registered trademark)) of the SDR TV. In the SDR TV, the signal input via the input interface passes in order of “SDR EOTF conversion”, “luminance conversion of each mode” and the “display device”, and displays a video image matching a luminance range of a maximum value of the display device. Hence, an HDR supporting Blu-ray device generates a signal (pseudo HDR signal) which can cancel “SDR EOTF conversion” and “luminance conversion of each mode” which the signal passes immediately after the input interface of the SDR TV. That is, by performing “luminance inverse conversion of each mode” and “SDR inverse EOTF conversion” immediately after “HDR EOTF conversion” and “luminance conversion” performed by using a peak luminance (DPL) of the SDR TV, the HDR supporting Blu-ray device can realize in a pseudo manner the same effect as that obtained when a signal obtained immediately after “luminance conversion” is input to the “display device” (a broken line arrow inFIG. 34C ). - [33.
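- The reason the pseudo HDR signal survives the SDR TV's own processing can be illustrated numerically. In the following Python sketch, the SDR EOTF is modeled as a simple power function with an assumed exponent of 2.4 and the per-mode luminance conversion as a linear scaling to an assumed DPL of 300 nit; the Blu-ray device applies the inverse of each step in reverse order, so the composition of the four operations is (approximately) the identity.

```python
# Conceptual sketch: the Blu-ray device applies the inverses of the processing the SDR TV
# will apply, so the display luminance it intended survives the TV-internal steps.
# The gamma value and DPL are illustrative assumptions, not values fixed by the disclosure.

GAMMA, DPL = 2.4, 300.0   # assumed SDR EOTF exponent and SDR TV peak luminance [nit]

def sdr_eotf(code):                 # TV side: code value (0..1) -> relative luminance (0..1)
    return code ** GAMMA

def sdr_inverse_eotf(lum):          # player side: cancels sdr_eotf
    return lum ** (1.0 / GAMMA)

def mode_conversion(lum):           # TV side: relative luminance -> display luminance [nit]
    return DPL * lum

def inverse_mode_conversion(nit):   # player side: cancels mode_conversion
    return nit / DPL

def pseudo_hdr_code(display_nit):
    """Code value the player outputs so the TV ends up showing `display_nit`."""
    return sdr_inverse_eotf(inverse_mode_conversion(display_nit))

# The TV then computes mode_conversion(sdr_eotf(pseudo_hdr_code(x))) == x (up to rounding).
```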
Necessity 2 of Pseudo HDR] - An input signal of a normal SDR TV is 100 nit yet has capability of expressing video images of 200 nit or more according to viewing environment (a dark room: a cinema mode, and a bright room: a dynamic mode). However, a luminance upper limit of an input signal to the SDR TV is determined as 100 nit, and therefore it has not been possible to directly use this capability.
- When the SDR TV displays HDR video images, the fact that the peak luminance of the SDR TV exceeds 100 nit (generally 200 nit or more) is used. Instead of converting the HDR video images into SDR video images of 100 nit or less, the "HDR→pseudo HDR conversion process" is performed to keep, to some degree, the gradation of the luminance range exceeding 100 nit. Consequently, the SDR TV can display pseudo HDR video images close to the original HDR.
- When this “HDR→pseudo HDR conversion process” technique is applied to Blu-ray, as illustrated in
FIG. 35 , only HDR signals are stored in an HDR disk. When the SDR TV is connected to a Blu-ray device, the Blu-ray device performs “HDR→pseudo HDR conversion process”, converts an HDR signal into a pseudo HDR signal and outputs the pseudo HDR signal to the SDR TV. Consequently, by converting the received pseudo HDR signal into a luminance value, the SDR TV can display video images having a pseudo HDR effect. Thus, even when there is no HDR supporting TV, if an HDR supporting BD and an HDR supporting Blu-ray device are prepared, even an SDR TV can display pseudo HDR video images having higher quality than that of SDR video images. - Hence, it has been thought that an HDR supporting TV is necessary to view HDR video images. However, an existing SDR TV can display pseudo HDR video images which realize an HDR effect. Consequently, it can be expected that HDR-supporting Blu-ray spreads.
- [34. Effect and Others]
- By performing the HDR→pseudo HDR conversion process on an HDR signal delivered by broadcasting, by a package medium such as Blu-ray, or by Internet distribution such as OTT, the HDR signal is converted into a pseudo HDR signal. Consequently, an existing SDR TV can display the HDR signal as a pseudo HDR video image.
- [35. EOTF]
- Hereinafter, the EOTF will be described with reference to
FIGS. 36A and 36B . -
FIG. 36A is a view illustrating an example of the EOTF (Electro-Optical Transfer Function) which supports the HDR and the SDR, respectively. - The EOTF is a generally called gamma curve, indicates each correspondence between a code value and a luminance value and converts the code value into a luminance value. That is, the EOTF is association information indicating a correspondence relationship between a plurality of code values and luminance values.
- Further,
FIG. 36B is a view illustrating an example of an inverse EOTF which supports the HDR and the SDR, respectively. - The inverse EOTF indicates each correspondence between a luminance value and a code value, and quantizes a luminance value contrary to the EOTF and converts the luminance value into a code value. That is, the inverse EOTF is association information indicating a correspondence relationship between luminance value and a plurality of code values. When, for example, a luminance value of an HDR supporting video image is expressed by a 10-bit code value of a gradation, a luminance value of an HDR luminance range up to 10,000 nit is quantized and mapped on 1024 integer values of 0 to 1023. That is, the luminance value of the luminance range up to 10,000 nit (a luminance value of an HDR supporting video image) is quantized based on the inverse EOTF and thereby is converted into an HDR signal of the 10-bit code value. An HDR supporting EOTF (referred to as a “HDR EOTF” below) or an HDR supporting inverse EOTF (referred to as a “HDR inverse EOTF” below) can express a higher luminance value than that of an SDR supporting EOTF (referred to as a “SDR EOTF” below) or an SDR supporting inverse EOTF (referred to as a “SDR inverse EOTF” below). For example, in
FIGS. 36A and 36B , a maximum value of a luminance (peak luminance) is 10,000 nit. That is, an HDR luminance range includes an entire SDR luminance range, and an HDR peak luminance is higher than an SDR peak luminance. An HDR luminance range is a luminance range obtained by expanding a maximum value from 100 nit which is a maximum value of the SDR luminance range to 10,000 nit. - For example, examples of the HDR EOTF and the HDR inverse EOTF include SMPTE 2084 standardized by Society of Motion Picture & Television Engineers (SMPTE).
- In addition, in the following description, a luminance range from 0 nit to 100 nit which is a peak luminance illustrated in
FIGS. 36A and 36B will be described as a first luminance range in some cases. Similarly, a luminance range from 0 nit to 10,000 nit which is a peak luminance illustrated inFIGS. 36A and 36B will be described as a second luminance range in some cases. - [36. Converting Device and Display Device]
-
FIG. 37 is a block diagram illustrating a configuration of the converting device and the display device according to the exemplary embodiment.FIG. 38 is a flowchart illustrating a converting method and a display method performed by the converting device and the display device according to the exemplary embodiment. - As illustrated in
FIG. 37 , convertingdevice 100 includesHDR EOTF converter 101,luminance converter 102,luminance inverse converter 103 and SDRinverse EOTF converter 104. Further,display device 200 includesdisplay setting unit 201,SDR EOTF converter 202,luminance converter 203 anddisplay 204. - Each component of converting
device 100 anddisplay device 200 will be described in detail during description of the converting method and the display method. - [37. Converting Method and Display Method]
- The converting method performed by converting
device 100 will be described with reference toFIG. 38 . In addition, the converting method includes step S101 to step S104 described below. - First,
HDR EOTF converter 101 of convertingdevice 100 obtains an HDR video image for which HDR inverse EOTF conversion has been performed.HDR EOTF converter 101 of convertingdevice 100 performs HDR EOTF conversion on an HDR signal of the obtained HDR video image (S101). Thus,HDR EOTF converter 101 converts the obtained HDR signal into a linear signal indicating a luminance value. The HDR EOTF is, for example, SMPTE 2084. - Next,
luminance converter 102 of convertingdevice 100 performs first luminance conversion of converting the linear signal converted byHDR EOTF converter 101 by using display characteristics information and content luminance information (S102). According to the first luminance conversion, a luminance value corresponding to an HDR luminance range (referred to as a “HDR luminance value” below) is converted into a luminance value corresponding to a display luminance range (referred to as a “display luminance value” below). Details will be described below. - In view of the above,
HDR EOTF converter 101 functions as an obtaining unit which obtains an HDR signal as a first luminance signal indicating a code value obtained by quantizing the luminance value of a video image. Further,HDR EOTF converter 101 andluminance converter 102 function as converters which determine the code value indicated by the HDR signal obtained by the obtaining unit, based on a display (display device 200) luminance range, and converts the code value into a display luminance value corresponding to the display luminance range which is a maximum value (DPL) which is smaller than a maximum value (HPL) of the HDR luminance range and is larger than 100 nit. - More specifically, in step S101,
HDR EOTF converter 101 determines for the HDR code value which is a first code value indicated by the obtained HDR signal an HDR luminance value associated with an HDR code value by the HDR EOTF by using the obtained HDR signal and the HDR EOTF. In addition, the HDR signal indicates the HDR code value obtained by quantizing video (content) luminance value by using the HDR inverse EOTF of associating luminance values of the HDR luminance range and a plurality of HDR code values. - Further, in step S102,
luminance converter 102 determines for the HDR luminance value determined in step S101 a display luminance value which is associated in advance with the HDR luminance value and corresponds to the display luminance range, and performs first luminance conversion of converting the HDR luminance value corresponding to the HDR luminance range into a display luminance value corresponding to the display luminance range. - Furthermore, before step S102, converting
device 100 obtains content luminance information including at least one of a luminance maximum value (CPL: Content Peak luminance) of the video image (content) and average luminance value (CAL: Content Average luminance) of a video image, as information related to an HDR signal. The CPL (first maximum luminance value) is, for example, a maximum value among luminance values of a plurality of images configuring an HDR video image. Further, the CAL is, for example, an average luminance value which is an average of luminance values of a plurality of images configuring an HDR video image. - Furthermore, before step S102, converting
device 100 obtains display characteristics information ofdisplay device 200 fromdisplay device 200. In addition, the display characteristics information is information indicating a maximum value (DPL) of a luminance which can be displayed bydisplay device 200, a display mode (described below) ofdisplay device 200 and display characteristics ofdisplay device 200 such as input/output characteristics (an EOTF supported by the display device). - Further, converting
device 100 may transmit recommended display setting information (which will be described below and also referred to as “setting information” below) todisplay device 200. - Next,
luminance inverse converter 103 of convertingdevice 100 performs luminance inverse conversion matching a display mode ofdisplay device 200. Consequently,luminance inverse converter 103 performs second luminance conversion of converting a luminance value corresponding to the display luminance range into a luminance value corresponding to an SDR luminance range (0 to 100 [nit]) (S103). Details will be described below. That is,luminance inverse converter 103 determines for the display luminance value obtained in step S102 an SDR luminance value which is a luminance value (referred to as a “SDR luminance value” below) associated in advance with the display luminance value and corresponding to an SDR as a third luminance value corresponding to an SDR luminance range whose maximum value is 100 nit, and performs second luminance conversion of converting the display luminance value corresponding to the display luminance range into the SDR luminance value corresponding to the SDR luminance range. - Further, SDR
inverse EOTF converter 104 of convertingdevice 100 generates a pseudo HDR video image by performing SDR inverse EOTF conversion (S104). That is, SDRinverse EOTF converter 104 quantizes the determined SDR luminance value by using an inverse EOTF (Electro-Optical Transfer Function) of an SDR (Standard Dynamic Range) which is third association information which associates luminance values of an HDR luminance range and a plurality of third code values, determines a third code value obtained by the quantization, converts the SDR luminance value corresponding to the SDR luminance range into an SDR signals as the third luminance signal indicating the third code value and thereby generates a pseudo HDR signal. In addition, the third code value is a code value supporting the SDR, and will be referred to as a “SDR code value” below. That is, an SDR signal is expressed by an SDR code value obtained by quantizing a luminance value of a video image by using an SDR inverse EOTF of associating luminance values of an SDR luminance range and a plurality of SDR code values. Further, convertingdevice 100 outputs a pseudo HDR signal (SDR signal) generated in step S104 to displaydevice 200. - Converting
device 100 generates an SDR luminance value corresponding to a pseudo HDR by performing first luminance conversion and second luminance conversion on an HDR luminance value obtained by inversely quantizing an HDR signal, and generates an SDR signal corresponding to the pseudo HDR by quantizing the SDR luminance value by using an EOTF. In addition, the SDR luminance value is a numerical value in a luminance range of 0 to 100 nit corresponding to the SDR. However, the SDR luminance value is converted based on the display luminance range, and therefore takes a numerical value which is obtained by performing luminance conversion on an HDR luminance value by using an HDR EOTF and an SDR EOTF and which is different from the luminance value in the luminance range of 0 to 100 nit corresponding to the SDR. - Next, the display method performed by
display device 200 will be described with reference toFIG. 38 . In addition, the display method includes step S105 to step S108 described below. - First,
display setting unit 201 ofdisplay device 200 sets display settings ofdisplay device 200 by using setting information obtained from converting device 100 (S105). In this regard,display device 200 is an SDR TV. The setting information is information indicating display settings which is recommended for the display device, and is information indicating how an EOTF is performed on a pseudo HDR video image and what settings can display beautiful video images (i.e., information for switching the display settings ofdisplay device 200 to optimal display settings). The setting information includes, for example, gamma curve characteristics during an output of the display device, a display mode such as a living mode (normal mode) or a dynamic mode, or a numerical value of a backlight (brightness). Further, display device 200 (also referred to as a “SDR display” below) may display a message which encourages the user to change the display settings ofdisplay device 200 by a manual operation. Details will be described below. - In addition, before step S105,
display device 200 obtains an SDR signal (pseudo HDR signal), and setting information indicating display settings which are recommended fordisplay device 200 to display video images. - Further,
display device 200 may obtain the SDR signal (pseudo HDR signal) before step S106, and may obtain the SDR signal after step S105. - Next,
SDR EOTF converter 202 ofdisplay device 200 performs SDR EOTF conversion on the obtained pseudo HDR signal (S106). That is,SDR EOTF converter 202 inversely quantizes the SDR signal (pseudo HDR signal) by using an SDR EOTF. Thus,SDR EOTF converter 202 converts an SDR code value indicated by an SDR signal into the SDR luminance value. - Further,
luminance converter 203 ofdisplay device 200 performs luminance conversion according to the display mode set to displaydevice 200. Consequently,luminance converter 203 performs third luminance conversion of converting an SDR luminance value corresponding to an SDR luminance range (0 to 100 [nit]) into a display luminance value corresponding to the display luminance range (0 to DPL [nit]) (S107). Details will be described below. - In view of the above, in step S106 and step S107,
display device 200 converts a third code value indicated by the obtained SDR signal (pseudo HDR signal) into a display luminance value corresponding to the display luminance range (0 to DPL [nit]) by using the setting information obtained in step S105. - More specifically, the SDR signal (pseudo HDR signal) is converted into the display luminance value by, in step S106, determining for the SDR code value indicated by the obtained SDR signal an SDR luminance value associated with the SDR code value by an SDR EOTF by using the EOTF of associating the luminance values in the SDR luminance range and a plurality of third code values.
- Further, in step S107, an SDR signal is converted into a display luminance value by, in step S107, determining a display luminance value which is associated in advance with the determined SDR luminance value and corresponds to the display luminance range, and performing third luminance conversion of converting the SDR luminance value corresponding to the SDR luminance range into a display luminance value corresponding to the display luminance range.
- Finally, display 204 of
display device 200 displays pseudo HDR video images ondisplay device 200 based on the converted display luminance value (S108). - [38. First Luminance Conversion]
- Next, details of the first luminance conversion (HPL→DPL) in step S102 will be described with reference to
FIG. 39A .FIG. 39A is a view for explaining an example of the first luminance conversion. -
Luminance converter 102 of convertingdevice 100 performs the first luminance conversion of converting the linear signal (HDR luminance value) obtained in step S101 by using display characteristics information and content luminance information of HDR video images. According to the first luminance conversion, an HDR luminance value (input luminance value) is converted into a display luminance value (output luminance value) which does not exceed a display peak luminance (DPL). The DPL is determined by using a maximum luminance and a display mode of an SDR display which are the display characteristics information. The display mode is, for example, mode information such as a theater mode of displaying video images darkly on the SDR display, and a dynamic mode of displaying video images brightly. When, for example, the maximum luminance of the SDR display is 1,500 nit and the display mode is a mode providing brightness of 50% of the maximum luminance, the DPL is 750 nit. In this regard, the DPL (second maximum luminance value) is a luminance maximum value which can be displayed by a display mode currently set to the SDR display. That is, according to the first luminance conversion, the DPL which is the second maximum luminance value is determined by using display characteristics information which is information indicating display characteristics of the SDR display. - Further, according to the first luminance conversion, a CAL and a CPL of content luminance information are used, each luminance value equal to or less than the CAL is regarded as the same before and after conversion, and only each luminance value equal to or more than the CPL is changed. That is, as illustrated in
FIG. 39A , according to the first luminance conversion, when the HDR luminance value is the CAL or less, the HDR luminance value is not converted, the HDR luminance value is determined as a display luminance value. Further, when the HDR luminance value is the CPL or more, the DPL which is the second maximum luminance value is determined as a display luminance value. - Furthermore, according to the first luminance conversion, a peak luminance (CPL) of an HDR video image of luminance information is used, and the DPL is determined as a display luminance value when the HDR luminance value is the CPL.
- In addition, according to the first luminance conversion, as illustrated in
FIG. 39B , the linear signal (HDR luminance value) obtained in step S101 may be converted to clip to a value which does not exceed the DPL. By performing such luminance conversion, it is possible to simplify a process in convertingdevice 100, make convertingdevice 100 smaller, reduce power of convertingdevice 100 and increase a processing speed of convertingdevice 100. In addition,FIG. 39B is a view for explaining another example of the first luminance conversion. - [39-1. Second Luminance Conversion]
- Next, details of the second luminance conversion (DPL→100 [nit]) in step S103 will be described with reference to
FIG. 40 .FIG. 40 is a view for explaining the second luminance conversion. -
Luminance inverse converter 103 of convertingdevice 100 performs luminance inverse conversion corresponding to a display mode, on the display luminance value of the display luminance range (0 to DPL [nit]) converted by the first luminance conversion in step S102. The luminance inverse conversion is a process of making it possible to obtain the display luminance value of the display luminance range (0 to DPL [nit]) after the process in step S102 when the SDR display performs luminance conversion process (step S107) corresponding to the display mode. That is, the second luminance conversion is luminance inverse conversion of the third luminance conversion. - As a result of the above process, according to the second luminance conversion, the display luminance value (input luminance value) of the display luminance range is converted into an SDR luminance value (output luminance value of the SDR luminance range.
- According to the second luminance conversion, a converting method is switched according to a display mode of the SDR display. When, for example, the display mode of the SDR display is the normal mode, a luminance is converted into a direct proportional value which is directly proportional to a display luminance value. Further, according to the second luminance conversion, in a case where the display mode of the SDR display is the dynamic mode which makes high luminance pixels brighter and makes low luminance pixels darker than those of the normal mode, an inverse function of the second luminance conversion is used to convert an SDR luminance value of each low luminance pixel into a value higher than the direct proportional value which is directly proportional to the display luminance value, and convert an SDR luminance value of each high luminance pixel into a value lower than the direct proportional value which is directly proportional to the display luminance value. That is, according to the second luminance conversion, for the display luminance value determined in step S102, a luminance value associated with the display luminance value is determined as an SDR luminance value by using luminance association information corresponding to display characteristics information which is information indicating display characteristics of the SDR display, and luminance conversion process is switched according to the display characteristics information. In this regard, the luminance association information corresponding to the display characteristics information is information which is defined per display parameter (display mode) of the SDR display as illustrated in, for example,
FIG. 40 and which associates a display luminance value (input luminance value) and an SDR luminance value (output luminance value). - [39-2. Third Luminance Conversion]
- Next, details of the third luminance conversion (100→DPL [nit]) in step S107 will be described with reference to
FIG. 41 .FIG. 41 is a view for explaining the third luminance conversion. -
Luminance converter 203 ofdisplay device 200 converts an SDR luminance value of an SDR luminance range (0 to 100 [nit]) into (0 to DPL [nit]) according to the display mode set in step S105. This process is performed to realize an inverse function of luminance inverse conversion of each mode in step S103. - According to the third luminance conversion, a converting method is switched according to a display mode of the SDR display. When, for example, the display mode of the SDR display is the normal mode (i.e., a set display parameter is a parameter supporting the normal mode), luminance conversion is performed to convert the display luminance value into a direct proportional value which is directly proportional to the SDR luminance value. Further, according to the third luminance conversion, in a case where the display mode of the SDR display is the dynamic mode which makes high luminance pixels brighter and makes low luminance pixels darker than those of the normal mode, luminance conversion is performed to convert a display luminance value of each low luminance pixel into a value lower than the direct proportional value which is directly proportional to the SDR luminance value, and convert a display luminance value of each high luminance pixel into a value higher than the direct proportional value which is directly proportional to the SDR luminance value. That is, according to the third luminance conversion, for the SDR luminance value determined in step S106, a luminance value associated in advance with the SDR luminance value is determined as a display luminance value by using luminance association information corresponding to a display parameter which indicates display settings of the SDR display, and luminance conversion process is switched according to the display parameter. In this regard, the luminance association information corresponding to the display parameter is information which is defined per display parameter (display mode) of the SDR display as illustrated in, for example,
FIG. 41 and which associates an SDR luminance value (input luminance value) and a display luminance value (output luminance value). - [40. Display Settings]
- Next, details of the display settings in step S105 will be described with reference to
FIG. 42 .FIG. 42 is a flowchart illustrating detailed process of the display settings. -
Display setting unit 201 of the SDR display performs a following process in step S201 to step S208 in step S105. - First,
display setting unit 201 determines whether or not an EOTF (SDR display EOTF) set to the SDR display matches an EOTF assumed during generation of a pseudo HDR video image (SDR signal) by using setting information (S201). - When determining that the EOTF set to the SDR display is different from the EOTF indicated by the setting information (the EOTF matching the pseudo HDR video image) (Yes in S201),
display setting unit 201 determines whether or not a system side can switch the SDR display EOTF (S202). - When determining that the system side can switch the SDR display EOTF,
display setting unit 201 switches the SDR display EOTF to an appropriate EOTF (S203). - In view of step S201 to step S203, while the display settings are set (S105), the EOTF set to the SDR display is set to a recommended EOTF corresponding to the obtained setting information. Further, thus, in step S106 performed after step S105, it is possible to determine an SDR luminance value by using the recommended EOTF.
- When determining that the system side cannot switch the SDR display EOTF (No in S202),
display setting unit 201 displays on the screen a message which encourages the user to change the EOTF by a manual operation (S204). For example,display setting unit 201 displays on a screen a message such as “Please set a display gamma to 2.4”. That is, while the display settings are set (S105), when it is not possible to switch the EOTF set to the SDR display,display setting unit 201 displays on the SDR display a message which encourages the user to switch to a recommended EOTF an EOTF (SDR display EOTF) set to the SDR display. - Next, the SDR display displays a pseudo HDR video image (SDR signal), yet determines whether or not a display parameter of the SDR display matches setting information by using the setting information before displaying the pseudo HDR video image (S205).
- When determining that the display parameter set to the SDR display is different from the setting information (Yes in S205),
display setting unit 201 determines whether or not it is possible to switch the display parameter of the SDR display (S206). - When determining that it is possible to switch the display parameter of the SDR display (Yes in S206),
display setting unit 201 switches the display parameter of the SDR display according to the setting information (S207). - In view of step S204 to step S207, while the display settings are set (S105), the display parameter set to the SDR display is set to a recommended display parameter corresponding to the obtained setting information.
- When determining that the system side cannot switch the display parameter (No in S206),
display setting unit 201 displays on the screen a message which encourages the user to change the display parameter set to the SDR display by a manual operation (S208). For example,display setting unit 201 displays on the screen a message such as “Please set a display mode to a dynamic mode and maximize a backlight”. That is, during the setting (S105), when it is not possible to switch the display parameter set to the SDR display,display setting unit 201 displays on the SDR display a message which encourages the user to switch to a recommended display parameter a display parameter set to the SDR display. - [41. Modified Example 1]
- As described above, the exemplary embodiment has been described as an exemplary technique disclosed in this application. However, the technique according to the present disclosure is not limited to this, and is applicable to a first exemplary embodiment, too, for which changes, replacement, addition and omission have been optionally carried out. Further, it is also possible to provide new exemplary embodiments by combining each component described in the above exemplary embodiment.
- Hereinafter, another exemplary embodiment will be described.
- An HDR video image is, for example, a video image in a Blu-ray Disc, a DVD, a video distribution website on the Internet, a broadcast and a HDD.
- Converting device 100 (HDR→pseudo HDR conversion processor) may be provided in a disk player, a disk recorder, a set-top box, a television, a personal computer or a smartphone. Converting
device 100 may be provided in a server device on the Internet. - Display device 200 (SDR display) is, for example, a television, a personal computer and a smartphone.
- Display characteristics information obtained by converting
device 100 may be obtained fromdisplay device 200 via a HDMI (registered trademark) cable or a LAN cable by using HDMI (registered trademark) or another communication protocol. As the display characteristics information obtained by convertingdevice 100, display characteristics information included in model information ofdisplay device 200 may be obtained via the Internet. Further, the user may perform a manual operation to set the display characteristics information to convertingdevice 100. Furthermore, the display characteristics information may be obtained by convertingdevice 100 immediately before generation of a pseudo HDR video image (steps S101 to S104) or at a timing at which default settings of a device are made or at which a display is connected. For example, the display characteristics information may be obtained immediately before conversion into a display luminance value or at a timing at which convertingdevice 100 is connected to displaydevice 200 for the first time by a HDMI (registered trademark) cable. - Further, a CPL and a CAL of an HDR video image may be provided per content or may be provided per scene. That is, according to the converting method, luminance information (a CPL and a CAL) which corresponds to each of a plurality of scenes of a video image, and which includes per scene at least one of a first maximum luminance value which is a maximum value among luminance values for a plurality of images configuring each scene, and an average luminance value which is an average of luminance values for a plurality of images configuring each scene may be obtained. According to first luminance conversion, for each of a plurality of scenes, a display luminance value may be determined according to luminance information corresponding to each scene.
- Further, the CPL and the CAL may be packaged in the same medium (a Blu-ray Disc, a DVD or the like) as that of HDR video images or may be obtained from a location different from HDR video images, for example, by converting
device 100 from the Internet. That is, luminance information including at least one of the CPL and the CAL may be obtained as meta information of a video image, or may be obtained via a network. - Further, during first luminance conversion (HPL→DPL) of converting
device 100, a fixed value may be used without using a CPL, a CAL and a display peak luminance (DPL). Furthermore, this fixed value may be changed from an outside. Still further, a CPL, a CAL and a DPL may be switched between a plurality of types. For example, the DPL may be switched between only three types of 200 nit, 400 nit and 800 nit, or may take a value which is the closest to display characteristics information. - Further, an HDR EOTF may not be SMPTE 2084, and another type of an HDR EOTF may be used. Furthermore, a maximum luminance (HPL) of an HDR video image may not be 10,000 nit and may be, for example, 4,000 nit or 1,000 nit.
- Still further, bit widths of code values may be, for example, 16, 14, 12, 10 and 8 bits.
- Moreover, SDR inverse EOTF conversion is determined based on display characteristics information yet a fixed conversion function (which can be changed from an outside, too) may be used. For the SDR inverse EOTF conversion, a function defined by, for example, Rec. ITU-R BT.1886 may be used. Further, types of SDR inverse EOTF conversion may be narrowed down to several types, and a type which is the closest to input/output characteristics of
display device 200 may be selected and used. - Furthermore, a fixed mode may be used for a display mode, and the display mode may not be included in display characteristics information.
- Still further, converting
device 100 may not transmit setting information, anddisplay device 200 may adopt fixed display settings and may not change the display settings. In this case,display setting unit 201 is unnecessary. Further, setting information may be flag information indicating a pseudo HDR video image or a non-pseudo HDR video image, and may be, for example, changed to settings for displaying the pseudo HDR video image the most brightly in case of the pseudo HDR video image. That is, while the display settings are set (S105), when the obtained setting information indicates a signal indicating a pseudo HDR video image converted by using a DPL, brightness settings ofdisplay device 200 may be switched to settings for displaying the pseudo HDR video image the most brightly. - [42. Modified Example 2]
- Further, in the first luminance conversion (HPL→DPL) of converting
device 100, conversion is performed according to, for example, a following equation. - In this regard, L represents a luminance value normalized to 0 to 1, and S1, S2, a, b and M are values set based on a CAL, a CPL and a DPL. In represents a natural logarithm. V represents a converted luminance value normalized to 0 to 1. As in an example in
FIG. 39A , when the CAL is 300 nit, the CPL is 2,000 nit, the DPL is 750 nit, conversion is not performed up to CAL+50 nit and conversion is performed in case of 350 nit or more, respective values take, for example, following values. - S1=350/10000
- S2=2000/10000
- M=750/10000
- a=0.023
- b=S1−a*In(S1)=0.112105
- That is, according to the first luminance conversion, when an SDR luminance value is between an average luminance value (CAL) and a first maximum luminance value (CPL), a display luminance value corresponding to an HDR luminance value is determined by using a natural logarithm.
- [40. Effect and Others]
- By converting each HDR video image by using information such as a content peak luminance or a content average luminance of each HDR video image, it is possible to change a converting method according to content and convert each HDR video image while keeping an HDR gradation as much as possible. Further, it is possible to suppress a negative influence that each HDR video image is too dark or too bright. More specifically, by mapping a content peak luminance of each HDR video image on a display peak luminance, a gradation is kept as much as possible. Further, each pixel value equal to or less than an average luminance is not changed to prevent an overall brightness from changing.
- Furthermore, by converting each HDR video image by using a peak luminance value and a display mode of an SDR display, it is possible to change a converting method according to display environment of the SDR display, and display each video image (pseudo HDR video image) having HDR quality at the same gradation or brightness as that of an original HDR video image according to capability of the SDR display. More specifically, a display peak luminance is determined based on a maximum luminance and a display mode of the SDR display, and each HDR video image is converted so as not to exceed the peak luminance value. Consequently, each HDR video image is displayed without substantially decreasing a gradation of each HDR video image up to a brightness which the SDR display can display, and a luminance value of a brightness which cannot be displayed is lowered to a brightness which can be displayed.
- As described above, it is possible to reduce information of a brightness which can be displayed, and display each video image having quality close to an original HDR video image without decreasing a gradation of a brightness which can be displayed. For example, for a display whose peak luminance is 1,000 nit, each HDR video image is converted into a pseudo HDR video image whose peak luminance is suppressed to 1,000 nit to keep an overall brightness, and a luminance value changes according to a display mode of the display. Hence, a luminance converting method is changed according to the display mode of the display. If a luminance higher than the peak luminance of the display is permitted for a pseudo HDR video image, there is a case where the high luminance is replaced with a peak luminance of the display side and is displayed. In this case, the pseudo HDR video image becomes entirely darker than an original HDR video image. By contrast with this, when a luminance lower than the peak luminance of the display is converted as a maximum luminance, this low luminance is replaced with the peak luminance of the display side. Therefore, the pseudo HDR video image becomes entirely brighter than an original HDR video image. Moreover, the luminance is lower than the peak luminance of the display side, and therefore capability related to a display gradation is not used at maximum.
- Further, the display side can display each pseudo HDR video image better by switching display settings by using setting information. When, for example, a brightness is set dark, a high luminance cannot be displayed, and therefore HDR quality is undermined. In this case, the display settings are changed or a message which encourages a change of the display settings is displayed to exhibit display capability and display high gradation video images.
- (Overall Conclusion)
- The playback method and the playback device according to one or a plurality of aspects of the present disclosure have been described based on the exemplary embodiment. However, the present disclosure is not limited to this exemplary embodiment. The scope of one or a plurality of exemplary embodiments of the present disclosure may include exemplary embodiments obtained by applying, to the present exemplary embodiment, various deformations one of ordinary skill in the art conceives, and exemplary embodiments obtained by combining the components according to different exemplary embodiments without departing from the spirit of the present disclosure.
- In the above exemplary embodiment, each component may be configured by dedicated hardware such as a circuit or may be realized by executing a software program suitable to each component. Each component may be realized by causing a program executing unit such as a CPU or a processor to read a software program recorded on a recording medium such as a hard disk or a semiconductor memory and execute the software program.
- The present disclosure is applicable to content data generating devices, video stream transmitting devices such as Blu-ray devices or video display devices such as televisions.
Claims (18)
1-6. (canceled)
7. A playback method for playing back a video signal,
a luminance of the video signal being a first luminance value in a first luminance range whose maximum luminance value is defined as a first maximum luminance value exceeding 100 nit,
the playback method comprising:
determining whether or not an inter-screen change amount of a luminance value of the video signal exceeds a predetermined first threshold; and
adjusting the luminance value of the video signal when it is determined that the change amount exceeds the first threshold.
8. The playback method according to claim 7 , wherein the adjustment includes adjusting, for a pixel whose change amount exceeds the first threshold, a luminance value of the pixel such that the change amount of the pixel is the first threshold or less.
9. The playback method according to claim 7 , wherein
the determination includes determining whether or not a difference exceeds the first threshold, the difference being a difference between a peak luminance of a first image included in the video signal, and each of luminance values of a plurality of pixels included in the video signal and included in a second image subsequent to the first image, and
the adjustment includes adjusting, for a pixel the difference of which exceeds the first threshold, a luminance value of the pixel such that the difference of the pixel is the first threshold or less.
10. The playback method according to claim 7 , wherein
the determination includes determining whether or not the change amount of the luminance value at a reference time interval exceeds the first threshold, the reference time interval being an integer multiple of a reciprocal of a frame rate of the video signal.
11. The playback method according to claim 7 , wherein
the determination includes determining whether or not a rate of pixels the change amounts of which exceed the first threshold with respect to a plurality of pixels exceeds a second threshold, the plurality of pixels being included in an image included in the video signal, and
the adjustment includes adjusting, when the rate exceeds the second threshold, the luminance values of a plurality of pixels such that the rate is the second threshold or less.
12. The playback method according to claim 7 , wherein
the determination includes determining, for each of a plurality of areas obtained by dividing a screen, whether or not the inter-screen change amount of the luminance value of each of a plurality of areas exceeds the first threshold, and
the adjustment includes performing an adjustment process of lowering a luminance value of an area for which it is determined that the change amount exceeds the first threshold.
13. A playback method for playing back a video signal,
a luminance of the video signal being a first luminance value in a first luminance range whose maximum luminance value is defined as a first maximum luminance value exceeding 100 nit,
the playback method comprising:
determining whether or not a luminance value of an image of the video signal exceeds a predetermined first threshold; and
adjusting the luminance value of the image when it is determined that the luminance value exceeds the first threshold.
14. The playback method according to claim 13 , wherein
the determination includes determining a number of pixels whose luminance values exceed the first threshold with respect to a plurality of pixels included in the image, and
the adjustment includes lowering, when the number of pixels exceeds a third threshold, the luminance value of the image such that the number of pixels is the third threshold or less.
15. The playback method according to claim 13 , wherein
the determination includes determining a rate of pixels whose luminance values exceed the first threshold with respect to a plurality of pixels included in the image, and
the adjustment includes lowering, when the rate exceeds a third threshold, the luminance value of the image such that the rate is the third threshold or less.
16. The playback method according to claim 13 , wherein
the first threshold is a value calculated based on an upper limit value of a voltage which is simultaneously applicable to a plurality of pixels in a display device that displays the video signal.
17. (canceled)
18. A playback device that plays back a video signal,
a luminance of the video signal being a first luminance value in a first luminance range whose maximum luminance value is defined as a first maximum luminance value exceeding 100 nit,
the playback device comprising one or more memories and circuitry which, in operation,
determines whether or not an inter-screen change amount of a luminance value of the video signal exceeds a predetermined first threshold; and
adjusts the luminance value of the video signal when it is determined that the change amount exceeds the first threshold.
19. A playback device that plays back a video signal, wherein
a luminance of the video signal is a first luminance value in a first luminance range whose maximum luminance value is defined as a first maximum luminance value exceeding 100 nit, and
the playback device comprises one or more memories and circuitry which, in operation,
the playback device determines whether or not a luminance value of an image included in the video signal exceeds a predetermined first threshold, and
adjusts the luminance value of the image when it is determined that the luminance value exceeds the first threshold.
20. A display method for displaying a video signal,
a luminance of the video signal being a first luminance value in a first luminance range whose maximum luminance value is defined as a first maximum luminance value exceeding 100 nit,
the display method comprising:
determining whether or not a luminance value of an image of the video signal exceeds a predetermined first threshold;
adjusting the luminance value of the image when it is determined that the luminance value exceeds the first threshold; and
displaying a signal adjusted in the adjusting.
21. A display method for displaying a video signal,
a luminance of the video signal being a first luminance value in a first luminance range whose maximum luminance value is defined as a first maximum luminance value exceeding 100 nit,
the display method comprising:
determining whether or not an inter-screen change amount of a luminance value of the video signal exceeds a predetermined first threshold;
adjusting the luminance value of the video signal when it is determined that the change amount exceeds the first threshold; and
displaying a signal adjusted in the adjusting.
22. A display device that displays a video signal, a luminance of the video signal being a first luminance value in a first luminance range whose maximum luminance value is defined as a first maximum luminance value exceeding 100 nit, the display device comprising:
a display; and
circuitry that, in operation, performs operations including:
determining whether or not a luminance value of an image included in the video signal exceeds a predetermined first threshold;
adjusting the luminance value of the image when it is determined that the luminance value exceeds the first threshold; and
displaying a signal on the display adjusted in the adjusting.
23. A playback device that plays back a video signal, a luminance of the video signal being a first luminance value in a first luminance range whose maximum luminance value is defined as a first maximum luminance value exceeding 100 nit, the playback device comprising:
a display; and
circuitry that, in operation, performs operations including:
determining whether or not an inter-screen change amount of a luminance value of the video signal exceeds a predetermined first threshold;
adjusting the luminance value of the video signal when it is determined that the change amount exceeds the first threshold; and
displaying, on the display, a signal adjusted in the adjusting.
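Claims 22 and 23 restate these operations as apparatus comprising a display and circuitry. The class below is a minimal sketch of that structure, not the patented design: the class and method names are hypothetical, the two checks are composed in one device for brevity, and a single threshold value is reused for both even though each claim recites its own predetermined first threshold.

```python
from typing import Callable, Optional

# Minimal sketch (illustrative only) of a display/playback device whose
# circuitry determines, adjusts, and then displays the signal.

class LuminanceLimitingDevice:
    def __init__(self, display: Callable[[float], None],
                 first_threshold_nits: float) -> None:
        self._display = display                 # stands in for "a display"
        self._threshold = first_threshold_nits  # predetermined first threshold
        self._prev_peak: Optional[float] = None # state for the change-amount check

    def present_frame(self, peak_nits: float) -> None:
        # Per-image check (claim 22): clip a value that exceeds the threshold.
        peak_nits = min(peak_nits, self._threshold)
        # Inter-screen check (claim 23): bound the frame-to-frame change amount.
        if self._prev_peak is not None:
            change = peak_nits - self._prev_peak
            if abs(change) > self._threshold:
                peak_nits = self._prev_peak + (
                    self._threshold if change > 0 else -self._threshold)
        self._display(peak_nits)  # display the adjusted signal
        self._prev_peak = peak_nits


# Example: an 1,800 nit frame is shown at the 1,000 nit threshold.
device = LuminanceLimitingDevice(print, 1000.0)
device.present_frame(1800.0)
```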
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US16/371,607 US20190230407A1 (en) | 2014-08-19 | 2019-04-01 | Method for transmitting appropriate meta data to display device according to transmission protocol version |
Applications Claiming Priority (6)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201462038900P | 2014-08-19 | 2014-08-19 | |
JP2015134733 | 2015-07-03 | ||
JP2015-134733 | 2015-07-03 | ||
PCT/JP2015/003876 WO2016027423A1 (en) | 2014-08-19 | 2015-07-31 | Transmission method, reproduction method and reproduction device |
US15/214,507 US10291955B2 (en) | 2014-08-19 | 2016-07-20 | Method for transmitting appropriate meta data to display device according to transmission protocol version |
US16/371,607 US20190230407A1 (en) | 2014-08-19 | 2019-04-01 | Method for transmitting appropriate meta data to display device according to transmission protocol version |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/214,507 Division US10291955B2 (en) | 2014-08-19 | 2016-07-20 | Method for transmitting appropriate meta data to display device according to transmission protocol version |
Publications (1)
Publication Number | Publication Date |
---|---|
US20190230407A1 true US20190230407A1 (en) | 2019-07-25 |
Family
ID=55350392
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/214,507 Active 2036-01-17 US10291955B2 (en) | 2014-08-19 | 2016-07-20 | Method for transmitting appropriate meta data to display device according to transmission protocol version |
US16/371,607 Abandoned US20190230407A1 (en) | 2014-08-19 | 2019-04-01 | Method for transmitting appropriate meta data to display device according to transmission protocol version |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/214,507 Active 2036-01-17 US10291955B2 (en) | 2014-08-19 | 2016-07-20 | Method for transmitting appropriate meta data to display device according to transmission protocol version |
Country Status (6)
Country | Link |
---|---|
US (2) | US10291955B2 (en) |
EP (1) | EP3185572B1 (en) |
JP (1) | JP6566271B2 (en) |
CN (2) | CN105981396B (en) |
MX (1) | MX366637B (en) |
WO (1) | WO2016027423A1 (en) |
Families Citing this family (35)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP5948619B2 (en) * | 2014-06-10 | 2016-07-06 | パナソニックIpマネジメント株式会社 | Display system, display method, and display device |
EP3196880B1 (en) * | 2014-09-12 | 2019-08-28 | Sony Corporation | Playback device, playback method, information processing device, information processing method, program, and recording medium |
JP6863271B2 (en) * | 2015-02-16 | 2021-04-21 | ソニーグループ株式会社 | Information processing equipment, information recording media, information processing methods, and programs |
US10735755B2 (en) | 2015-04-21 | 2020-08-04 | Arris Enterprises Llc | Adaptive perceptual mapping and signaling for video coding |
JP6535560B2 (en) * | 2015-09-18 | 2019-06-26 | 東芝映像ソリューション株式会社 | Electronic device and display method |
EP3185571A1 (en) * | 2015-12-22 | 2017-06-28 | Thomson Licensing | Hierarchical scene segmentation metadata transmission for content enhancement |
KR102438199B1 (en) * | 2015-12-24 | 2022-08-30 | 삼성전자주식회사 | Display device and method for changing settings of display device |
JP2017139678A (en) * | 2016-02-05 | 2017-08-10 | Necプラットフォームズ株式会社 | Image data converter, image data conversion method, image data conversion program, pos terminal, and server |
JP2017151308A (en) * | 2016-02-25 | 2017-08-31 | キヤノン株式会社 | Information processor and information processing method |
JP6253036B2 (en) * | 2016-03-31 | 2017-12-27 | シャープ株式会社 | Content processing apparatus, television receiving apparatus, information processing method and program in content processing apparatus |
EP3477933B1 (en) * | 2016-06-27 | 2022-03-16 | Sony Group Corporation | Signal processing device, signal processing method, camera system, video system and server |
WO2018003667A1 (en) * | 2016-06-28 | 2018-01-04 | シャープ株式会社 | Transmitting device, receiving device, and display device |
JP6786324B2 (en) * | 2016-09-20 | 2020-11-18 | 株式会社東芝 | Multiplexing device and multiplexing method |
JP6855205B2 (en) | 2016-10-06 | 2021-04-07 | 株式会社ソニー・インタラクティブエンタテインメント | Information processing equipment and image processing method |
KR102554379B1 (en) * | 2016-10-31 | 2023-07-11 | 엘지디스플레이 주식회사 | Image processing method and module for high dynamic range (hdr) and display device using the same |
WO2018193715A1 (en) * | 2017-04-21 | 2018-10-25 | パナソニックIpマネジメント株式会社 | Reproduction device, reproduction method, display device, and display method |
WO2018235338A1 (en) * | 2017-06-21 | 2018-12-27 | パナソニックIpマネジメント株式会社 | Image display system and image display method |
US11049225B2 (en) | 2017-07-07 | 2021-06-29 | Panasonic Intellectual Property Management Co., Ltd. | Video processing system and video processing method |
WO2019012729A1 (en) * | 2017-07-14 | 2019-01-17 | パナソニックIpマネジメント株式会社 | Video display device and video display method |
WO2019053917A1 (en) * | 2017-09-13 | 2019-03-21 | パナソニックIpマネジメント株式会社 | Brightness characteristic generation method |
WO2019059022A1 (en) * | 2017-09-21 | 2019-03-28 | ソニー株式会社 | Reproduction device, reproduction method, program, and recording medium |
JP7108880B2 (en) * | 2017-10-06 | 2022-07-29 | パナソニックIpマネジメント株式会社 | Video display device and video display method |
US10972767B2 (en) * | 2017-11-01 | 2021-04-06 | Realtek Semiconductor Corp. | Device and method of handling multiple formats of a video sequence |
KR102413839B1 (en) * | 2017-11-15 | 2022-06-28 | 삼성전자 주식회사 | Apparatus for providing content, method for controlling thereof and recording media thereof |
WO2019098054A1 (en) * | 2017-11-17 | 2019-05-23 | ソニー株式会社 | Information processing device, information processing method, storage medium, reproduction device, reproduction method, and program |
JP6821269B2 (en) * | 2017-12-05 | 2021-01-27 | 株式会社ソニー・インタラクティブエンタテインメント | Image processing device and image processing method |
US10645199B2 (en) * | 2018-01-22 | 2020-05-05 | Lattice Semiconductor Corporation | Multimedia communication bridge |
US10832613B2 (en) | 2018-03-07 | 2020-11-10 | At&T Intellectual Property I, L.P. | Image format conversion using luminance-adaptive dithering |
CN108447083B (en) * | 2018-03-16 | 2020-06-02 | 北京虚拟映画科技有限公司 | Image transmission method and system based on image decomposition and recombination |
TWI822677B (en) * | 2018-04-30 | 2023-11-21 | 圓剛科技股份有限公司 | Video signal conversion device |
TW201946430A (en) | 2018-04-30 | 2019-12-01 | 圓剛科技股份有限公司 | Video signal conversion device and method thereof |
KR102572432B1 (en) * | 2018-07-03 | 2023-08-30 | 삼성전자주식회사 | Display apparatus, control method thereof and recording media |
US10652512B1 (en) * | 2018-11-20 | 2020-05-12 | Qualcomm Incorporated | Enhancement of high dynamic range content |
CN112261334B (en) * | 2020-10-21 | 2023-04-11 | 广东博华超高清创新中心有限公司 | Transmission method and system supporting HDMI2.1 signal single-channel input and multi-channel output |
US20230031245A1 (en) * | 2021-07-23 | 2023-02-02 | Teradici Co. | Encoder changes |
Family Cites Families (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3847006B2 (en) * | 1998-08-26 | 2006-11-15 | 富士通株式会社 | Image display control device and recording medium |
JP2002116732A (en) * | 2000-10-05 | 2002-04-19 | Pioneer Electronic Corp | Luminous panel driving method and device |
JP4812008B2 (en) * | 2006-04-07 | 2011-11-09 | 三菱電機株式会社 | Image display device |
JP4210863B2 (en) * | 2006-07-06 | 2009-01-21 | セイコーエプソン株式会社 | Image processing system, display device, program, and information storage medium |
JP5227502B2 (en) * | 2006-09-15 | 2013-07-03 | 株式会社半導体エネルギー研究所 | Liquid crystal display device driving method, liquid crystal display device, and electronic apparatus |
JP5145017B2 (en) | 2006-12-05 | 2013-02-13 | 日本放送協会 | Image signal processing device |
KR101256030B1 (en) * | 2009-03-10 | 2013-04-23 | 돌비 레버러토리즈 라이쎈싱 코오포레이션 | Extended dynamic range and extended dimensionality image signal conversion |
CN101710955B (en) * | 2009-11-24 | 2014-06-25 | 北京中星微电子有限公司 | Method and equipment for adjusting brightness and contrast |
JP2011172146A (en) * | 2010-02-22 | 2011-09-01 | Sharp Corp | Content-reproducing apparatus, setting method, program and recording medium |
JP5665388B2 (en) * | 2010-06-28 | 2015-02-04 | キヤノン株式会社 | Image processing apparatus and control method thereof |
US8736674B2 (en) * | 2010-09-23 | 2014-05-27 | Dolby Laboratories Licensing Corporation | Method and system for 3D display calibration with feedback determined by a camera device |
JP6053767B2 (en) * | 2011-06-14 | 2016-12-27 | コーニンクレッカ フィリップス エヌ ヴェKoninklijke Philips N.V. | Graphic processing for high dynamic range video |
WO2013039730A2 (en) * | 2011-09-15 | 2013-03-21 | Dolby Laboratories Licensing Corporation | Method and system for backward compatible, extended dynamic range encoding of video |
MX2014003556A (en) * | 2011-09-27 | 2014-05-28 | Koninkl Philips Nv | Apparatus and method for dynamic range transforming of images. |
EP2896198B1 (en) * | 2012-09-12 | 2016-11-09 | Dolby Laboratories Licensing Corporation | Display management for images with enhanced dynamic range |
- 2015
  - 2015-07-31 CN CN201580007932.1A patent/CN105981396B/en active Active
  - 2015-07-31 JP JP2016543801A patent/JP6566271B2/en active Active
  - 2015-07-31 MX MX2017000432A patent/MX366637B/en active IP Right Grant
  - 2015-07-31 WO PCT/JP2015/003876 patent/WO2016027423A1/en active Application Filing
  - 2015-07-31 CN CN201910885200.2A patent/CN110460792B/en active Active
  - 2015-07-31 EP EP15834222.0A patent/EP3185572B1/en active Active
- 2016
  - 2016-07-20 US US15/214,507 patent/US10291955B2/en active Active
- 2019
  - 2019-04-01 US US16/371,607 patent/US20190230407A1/en not_active Abandoned
Patent Citations (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20070296865A1 (en) * | 2004-11-02 | 2007-12-27 | Fujitsu Ten Limited | Video-Signal Processing Method, Video-Signal Processing Apparatus, and Display Apparatus |
US20060215903A1 (en) * | 2005-03-23 | 2006-09-28 | Kabushiki Kaisha Toshiba | Image processing apparatus and method |
US20080118170A1 (en) * | 2006-11-21 | 2008-05-22 | Samsung Electronics Co., Ltd. | Method and system for quantization layer reduction in digital image processing |
US20090324125A1 (en) * | 2008-06-25 | 2009-12-31 | Shintaro Okada | Image Processing Apparatus and Method, and Program |
US20120287093A1 (en) * | 2010-01-08 | 2012-11-15 | Sharp Kabushiki Kaisha | Display device with optical sensors |
US20120026208A1 (en) * | 2010-07-30 | 2012-02-02 | Kabushiki Kaisha Toshiba | Image display apparatus |
US20120026405A1 (en) * | 2010-08-02 | 2012-02-02 | Dolby Laboratories Licensing Corporation | System and Method of Creating or Approving Multiple Video Streams |
US20140093130A1 (en) * | 2011-06-09 | 2014-04-03 | Utah State University Research Foundation | Systems and Methods For Sensing Occupancy |
US10197782B2 (en) * | 2011-06-21 | 2019-02-05 | Hamamatsu Photonics K.K. | Light measurement device, light measurement method, and light measurement program |
US20140293102A1 (en) * | 2011-11-08 | 2014-10-02 | Rambus Inc. | Conditional-reset, temporally oversampled image sensor |
US20150302586A1 (en) * | 2012-02-23 | 2015-10-22 | Nissan Motor Co., Ltd. | Three-dimensional object detection device |
US20150016681A1 (en) * | 2012-03-02 | 2015-01-15 | Nissan Motor Co., Ltd. | Three-dimensional object detection device |
US20140307129A1 (en) * | 2013-04-15 | 2014-10-16 | Htc Corporation | System and method for lens shading compensation |
US20160014433A1 (en) * | 2014-07-09 | 2016-01-14 | Interra Systems, Inc. | Methods and Systems for Detecting Block Errors in a Video |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20210160401A1 (en) * | 2019-11-27 | 2021-05-27 | Samsung Electronics Co., Ltd. | Electronic device and method for controlling the same, and storage medium |
WO2021107496A1 (en) * | 2019-11-27 | 2021-06-03 | Samsung Electronics Co., Ltd. | Electronic device and method for controlling the same, and storage medium |
US11881134B2 (en) * | 2019-11-27 | 2024-01-23 | Samsung Electronics Co., Ltd. | Electronic device and method for controlling the same, and storage medium |
Also Published As
Publication number | Publication date |
---|---|
EP3185572A1 (en) | 2017-06-28 |
MX366637B (en) | 2019-07-17 |
JPWO2016027423A1 (en) | 2017-06-01 |
CN110460792A (en) | 2019-11-15 |
WO2016027423A1 (en) | 2016-02-25 |
EP3185572A4 (en) | 2018-06-06 |
US10291955B2 (en) | 2019-05-14 |
EP3185572B1 (en) | 2023-03-08 |
JP6566271B2 (en) | 2019-08-28 |
CN110460792B (en) | 2022-03-08 |
MX2017000432A (en) | 2017-05-01 |
CN105981396B (en) | 2020-07-14 |
CN105981396A (en) | 2016-09-28 |
US20160330513A1 (en) | 2016-11-10 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10291955B2 (en) | | Method for transmitting appropriate meta data to display device according to transmission protocol version |
JP6671022B2 (en) | | Display device, display method, and computer program |
US20220159264A1 (en) | | Data output apparatus, data output method, and data generation method |
JP6573238B2 (en) | | Display device, conversion device, display method, and computer program |
WO2015198552A1 (en) | | Content data generating method, video stream transmission method and video display method |
JP6928885B2 (en) | | Display device, display method and computer program |
JP7170236B2 (en) | | playback device |
JP2017139511A (en) | | Content data generation method, image stream transmission method, and image display method |
JP6751893B2 (en) | | Reproduction method, reproduction device, display method and display device |
JP6643669B2 (en) | | Display device and display method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
 | STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED |
 | STPP | Information on status: patent application and granting procedure in general | Free format text: FINAL REJECTION MAILED |
 | STPP | Information on status: patent application and granting procedure in general | Free format text: ADVISORY ACTION MAILED |
 | STPP | Information on status: patent application and granting procedure in general | Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
 | STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
 | STPP | Information on status: patent application and granting procedure in general | Free format text: FINAL REJECTION MAILED |
 | STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |