WO2020009365A1 - Display apparatus, control method therefor, and recording medium - Google Patents

Display apparatus, control method therefor, and recording medium

Info

Publication number
WO2020009365A1
Authority
WO
WIPO (PCT)
Prior art keywords
information
image
section
content
tone mapping
Prior art date
Application number
PCT/KR2019/007710
Other languages
English (en)
Korean (ko)
Inventor
오승보
Original Assignee
삼성전자(주)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 삼성전자(주) filed Critical 삼성전자(주)
Publication of WO2020009365A1 publication Critical patent/WO2020009365A1/fr

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/47 End-user applications
    • H04N 21/485 End-user interface for client configuration
    • H04N 21/4854 End-user interface for client configuration for modifying image parameters, e.g. image brightness, contrast
    • G06T 5/92
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/47 End-user applications
    • H04N 21/485 End-user interface for client configuration

Definitions

  • The present invention relates to a display apparatus that processes input content and displays an image, a control method thereof, and a recording medium.
  • A display apparatus generally refers to a device that processes a signal or data of content according to an image processing process and displays the resulting content image on the screen of its display panel.
  • A typical example found in a general household is a TV.
  • Content is provided to the display apparatus from a content reproducing device such as an optical disc player, or from a relay device such as a set-top box, over an interface standard such as HDMI.
  • The content may be high-luminance image content, such as a high dynamic range (HDR) image.
  • When the display apparatus displays HDR content, the maximum luminance that the display apparatus can express may be lower than the maximum luminance of the content, or the luminance range that the display apparatus can express may be narrower than the luminance range of the content.
  • the display device uses a technique called tone mapping (TM) to display such HDR content.
  • The tone mapping technique generates and displays an output image by mapping each luminance value of the content, taken as the input image, onto values within the luminance range that the display apparatus can express. For example, the image information of the content may span a luminance range of 0 to 2000 nits, while the luminance range the display apparatus can express is 0 to 500 nits. If the output image simply reproduced the luminance values of the content as they are, the display apparatus could not express any luminance value above 500 nits. The display apparatus therefore performs tone mapping to convert the input image, which has the luminance range of the HDR content, into an output image whose luminance range the display apparatus can express. Examples of the tone mapping technique include a static tone mapping (STM) technique and a dynamic tone mapping (DTM) technique; a minimal sketch of such a luminance mapping follows.
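  • The sketch below illustrates the mapping described above, assuming a content range of 0 to 2000 nits and a display range of 0 to 500 nits. The knee-shaped curve and its parameters are assumptions chosen for illustration only and are not the tone mapping algorithm of this document.

```python
# Minimal sketch (not the patented algorithm): mapping a 0..2000 nit content
# range onto a 0..500 nit display range with an assumed soft-knee curve.

def clip_only(l_in: float, display_max: float = 500.0) -> float:
    """Naive approach: everything above the display's maximum is simply lost."""
    return min(l_in, display_max)

def tone_map(l_in: float, content_max: float = 2000.0,
             display_max: float = 500.0, knee: float = 0.75) -> float:
    """Map 0..content_max nits onto 0..display_max nits with a soft knee."""
    knee_in = knee * display_max            # below this, pass luminance through
    if l_in <= knee_in:
        return l_in
    # compress the remaining content range into the headroom above the knee
    t = (l_in - knee_in) / (content_max - knee_in)
    return knee_in + t * (display_max - knee_in)

for nits in (100, 500, 1000, 2000):
    print(nits, clip_only(nits), round(tone_map(nits), 1))
# 500/1000/2000-nit highlights all clip to 500 with clip_only,
# but keep distinct gradation (384.6 / 423.1 / 500.0) with tone_map
```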
  • A conventional display apparatus, whether it supports only the STM technique or both the STM and DTM techniques, reproduces content according to only one of the two techniques. Such a conventional display apparatus therefore may not properly play back content in which a playback section intended for the STM technique and a playback section intended for the DTM technique are mixed.
  • A display apparatus that supports only the STM technique can play back the STM playback section normally, but because it does not support the DTM scheme, the DTM-compatible playback section is also played back according to the STM technique. Since the display apparatus reproduces the entire content according to the STM technique, the content provider cannot deliver an image of the intended quality in the DTM-compatible playback section.
  • Conversely, a display apparatus supporting both the STM and DTM techniques can play back the DTM playback section normally, but may not play back the STM-compatible playback section normally, because that section is also played back according to the DTM technique. Since the display apparatus reproduces the entire content according to the DTM technique, the content provider cannot deliver an image of the intended quality in the STM-compatible playback section.
  • A display apparatus according to an embodiment of the present invention includes: a display unit; a receiving unit for obtaining content data including a plurality of image frames; and a processor configured to identify, during reproduction of the plurality of image frames of the content data, whether the picture quality adjustment scheme corresponds to a first section or a second section that differ from each other and, according to the identified result, to perform either a first operation of adjusting the image quality of the first section based on common image quality adjustment information or a second operation of adjusting the image quality of the second section based on image quality adjustment information corresponding to each image frame.
  • The image quality adjustment information for the first section includes tone mapping information generated by the processor based on brightness information of the content data obtained from additional data of the content data, and the image quality adjustment information for the second section may include tone mapping information obtained from the additional data.
  • The content data may include high dynamic range (HDR) content, the quality adjustment information for the first section may be based on a static tone mapping method, and the quality adjustment information for the second section may be based on a dynamic tone mapping method.
  • the additional data of the video frame of the first section may not include the tone mapping information, and the additional data of the video frame of the second section may include the tone mapping information.
  • the additional data of the content data may include information indicating a picture quality adjustment method in image frame units, and the processor may identify a picture quality adjustment method based on the information for each picture frame of the content data.
  • The image quality adjustment information may include information indicating the brightness values at which an image is displayed on the display unit, corresponding respectively to the gray values of the image frame.
  • the processor may identify the first section and the second section of which the image quality adjustment method is different based on the video usability information (VUI) or the supplemental enhancement information (SEI) of the additional data.
  • A control method of a display apparatus according to an embodiment of the present invention includes: obtaining content data including a plurality of image frames; identifying, during reproduction of the plurality of image frames of the content data, whether the picture quality adjustment scheme corresponds to a first section or a second section that differ from each other; and, according to the identified result, performing either a first operation of adjusting the image quality of the first section based on common image quality adjustment information or a second operation of adjusting the image quality of the second section based on image quality adjustment information corresponding to each image frame.
  • The image quality adjustment information for the first section may include tone mapping information generated by the processor based on the brightness information of the content data obtained from the additional data of the content data, and the image quality adjustment information for the second section may include tone mapping information obtained from the additional data.
  • The content data may include HDR content, the quality adjustment information for the first section may be based on a static tone mapping scheme, and the quality adjustment information for the second section may be based on a dynamic tone mapping scheme.
  • the additional data of the video frame of the first section may not include the tone mapping information, and the additional data of the video frame of the second section may include the tone mapping information.
  • the additional data of the content data may include information indicating a picture quality adjustment method in image frame units, and the picture quality adjustment method may be identified based on the information for each video frame of the content data.
  • The image quality adjustment information may include information indicating the brightness values at which an image is displayed on the display unit, corresponding respectively to the gray values of the image frame.
  • the first section and the second section may be identified based on the VUI or SEI of the additional data.
  • FIG. 1 is an exemplary view of a display device according to an embodiment of the present invention.
  • FIG. 2 is a block diagram illustrating a display apparatus according to an exemplary embodiment of the present invention.
  • FIG. 3 is a flowchart illustrating a method of processing content data by a display apparatus according to an exemplary embodiment of the present invention.
  • FIG. 4 is an exemplary view illustrating a principle of identifying a picture quality adjusting method of an image frame from content data by the display apparatus according to an exemplary embodiment of the present invention.
  • FIG. 5 is an exemplary view illustrating a difference of a tone mapping method between an STM method and a DTM method in a display apparatus according to an exemplary embodiment of the present invention.
  • FIG. 6 is a block diagram illustrating an operation of each processing module in the display device according to an exemplary embodiment of the present invention.
  • FIG. 7 is an exemplary view showing the appearance of a content providing device according to an embodiment of the present invention.
  • FIG. 8 is a block diagram illustrating a content providing apparatus according to an embodiment of the present invention.
  • FIG. 9 is a block diagram showing the operation of each processing module in the content providing apparatus according to the embodiment of the present invention.
  • In the present disclosure, an expression referring to at least one of a plurality of components denotes not only all of the plurality of components, but also each one of them excluding the rest, or any combination thereof.
  • FIG. 1 is an exemplary view of a display device according to an embodiment of the present invention.
  • When the display apparatus 100 according to the present exemplary embodiment receives content data from the outside, by wire or wirelessly, it may process the content data and display the content image.
  • the display apparatus 100 includes various devices capable of displaying an image including a TV, a monitor, a large format display (LFD), a micro LED display apparatus, an LED display apparatus, an OLED display apparatus, a projector, a wall mounted display apparatus, and the like.
  • The apparatus or method by which the content data is provided to the display apparatus 100 may take various forms and is not limited to any one of them.
  • the display apparatus 100 is communicatively connected to the server 110 through an internet network, and receives a packet of content data streamed from the server 110.
  • the display apparatus 100 is connected to an optical disc player 120 for playing an optical disc such as a Blu-ray disc or a DVD through an HDMI cable, and receives TMDS-type content data from the optical disc player 120.
  • the display apparatus 100 is connected to an external storage device 130 such as a USB memory and obtains content data from the external storage device 130.
  • the display apparatus 100 displays the content image on the display unit by processing the obtained content data according to an image processing process.
  • Image processing processes include, for example, demultiplexing, descrambling, decoding, detail enhancement, scaling, tone mapping, and the like.
  • Content is produced on the premise that it is displayed on a predetermined reference monitor or mastering monitor, and is made in consideration of the gamut and brightness of that mastering monitor.
  • When the content is implemented as high dynamic range (HDR) video content, it has a wider brightness range than low dynamic range (LDR) video content and can express images more precisely.
  • the display apparatus 100 converts an input image, which is HDR image content, into an output image, which is an LDR image, by using a tone mapping technique.
  • Examples of the tone mapping technique include a static tone mapping (STM) technique based on the attributes of the display apparatus, and a dynamic tone mapping (DTM) technique based on the attributes of the content.
  • The display apparatus 100 may generate an output image by applying either STM or DTM.
  • In the STM technique, tone mapping is determined according to the attributes of the display apparatus, and as long as the additional data in the content data is the same, the same tone mapping is applied on a given display apparatus. In other words, the STM scheme applies the same tone mapping process to the plurality of image frames.
  • In the DTM technique, on the other hand, tone mapping is determined per scene in the content.
  • Here, a "scene" is a set of image frames having similar image quality characteristics, divided either manually by the content creator or automatically by an authoring tool when dynamic HDR content is generated; a scene may thus be regarded as a set of one or more image frames.
  • the DTM technique performs image quality adjustment for each scene in one content or individually adjusts the quality of a plurality of image frames in one content.
  • the DTM technique can typically better reflect the intent of the content creator than the STM technique.
  • In some content data the entire playback section corresponds to either STM or DTM, but content data may also be a mixture of a static HDR content section and a dynamic HDR content section, that is, an STM-compatible playback section and a DTM-compatible playback section.
  • a description will be given of a method for the display apparatus 100 according to the present embodiment to reproduce content data in which an STM corresponding playback section and a DTM corresponding playback section are mixed.
  • FIG. 2 is a block diagram illustrating a display apparatus according to an exemplary embodiment of the present invention.
  • The display apparatus 200 includes a communication unit 210 for communicating with an external device, a signal input/output unit 220 for inputting and outputting predetermined data, a display unit 230 for displaying an image, a user input unit 240 for receiving user input, a storage unit 250 for storing data, and a processor 260 for processing data.
  • the communication unit 210 is a bidirectional communication circuit including at least one or more of components such as a communication module and a communication chip corresponding to various types of wired and wireless communication protocols.
  • For example, the communication unit 210 may be implemented as a wireless communication module that performs wireless communication with an access point according to a Wi-Fi scheme, or as a LAN card connected by wire to a router or a gateway.
  • the signal input / output unit 220 is wired to a predetermined external device in a one-to-one or one-to-many manner, thereby receiving data or outputting data to the external device.
  • the signal input / output unit 220 includes a connector or a port according to a preset transmission standard, for example, an HDMI port, a DisplayPort, a USB port, or the like.
  • the display unit 230 includes a display panel capable of displaying an image on the screen.
  • the display panel is provided with a light receiving structure such as a liquid crystal type or a light emitting structure such as an OLED type.
  • The display unit 230 may further include additional components according to the structure of the display panel. For example, if the display panel is of a liquid crystal type, the display unit 230 further includes a backlight unit that supplies light to the liquid crystal display panel and a panel driving substrate that drives the liquid crystal of the liquid crystal display panel.
  • the user input unit 240 includes various types of input interfaces provided for performing a user's input.
  • The user input unit 240 may take various forms according to the type of the display apparatus 200: for example, a mechanical or electronic button unit of the display apparatus 200, a remote controller separate from the display apparatus 200, a touch pad, or a touch screen installed on the display unit 230.
  • the storage unit 250 is accessed by the processor 260, and operations such as reading, writing, modifying, deleting, and updating data are performed under the control of the processor 260.
  • The storage unit 250 may include nonvolatile memory, such as flash memory, a hard disk drive, or a solid-state drive, which retains data regardless of whether power is supplied, and volatile memory, such as a buffer or RAM, into which data to be processed is loaded.
  • The processor 260 processes content data received through the communication unit 210 or the signal input/output unit 220. If the content data has the properties of image content, the processor 260 displays an image based on the content data on the display unit 230. Before displaying the image, the processor 260 may process the image data by applying various techniques; for example, it may apply a tone mapping technique such as STM or DTM to the content data, and it refers to metadata obtained from the content data to decide which tone mapping technique to apply.
  • the processor 260 may include one or more hardware processors implemented by a CPU, a chipset, a buffer, a circuit, or the like mounted on a printed circuit board.
  • the processor 260 may be implemented as a system on chip (SOC) according to a design scheme.
  • the processor 260 includes modules corresponding to various processes such as a demultiplexer, a decoder, a scaler, an audio DSP, an amplifier, and the like, and some or all of them may be implemented as SOCs.
  • a module related to image processing such as a demultiplexer, a decoder and a scaler may be implemented as an image processing SOC
  • the audio DSP may be implemented as a chipset separate from the SOC.
  • When the processor 260 according to an embodiment of the present invention receives content data in which image frames of an STM-compatible playback section and image frames of a DTM-compatible playback section are mixed, the processor 260 identifies the tone mapping technique for each image frame in real time while the content data is played back.
  • The processor 260 generates tone mapping information for an image frame identified as static HDR based on content-related information extracted from the content data, and extracts tone mapping information from the content data for an image frame identified as dynamic HDR.
  • the processor 260 processes the corresponding image frame based on the generated or extracted tone mapping information and displays the processed image frame.
  • Accordingly, the processor 260 can process content data in which STM-compatible image frames and DTM-compatible image frames are mixed according to the tone mapping technique appropriate to each, thereby providing the optimal image quality intended by the content provider.
  • FIG. 3 is a flowchart illustrating a method of processing content data by a display apparatus according to an exemplary embodiment of the present invention.
  • The following operations of the display apparatus are executed by the processor of the display apparatus.
  • the display apparatus receives content data including a plurality of image frames.
  • the display apparatus acquires data of an image frame of content data and additional data related to the image frame.
  • the display apparatus identifies whether the image quality adjusting method is the first quality adjusting method or the second quality adjusting method based on the acquired additional data.
  • In step 340, the display apparatus proceeds to step 350 if the identified picture quality adjustment method is the first picture quality adjustment method, and proceeds to step 360 if it is the second picture quality adjustment method rather than the first.
  • If the identified picture quality adjustment method is the first picture quality adjustment method, the display apparatus generates picture quality adjustment information based on the additional data in step 350; if it is the second picture quality adjustment method, the display apparatus extracts picture quality adjustment information from the additional data in step 360.
  • The display apparatus then adjusts the image frame according to the generated or extracted image quality adjustment information and displays it.
  • Finally, the display apparatus identifies whether there is a next image frame to be played back; if so, it returns to step 320, and if not, the process ends.
  • In summary, while reproducing the content data, the display apparatus identifies whether the picture quality adjustment scheme corresponds to a first section or a second section that differ from each other and, according to the identified result, performs either a first operation of adjusting the image quality of the first section based on common picture quality adjustment information or a second operation of adjusting the image quality of the second section based on picture quality adjustment information corresponding to each image frame.
  • the display device can identify the image quality adjustment method for each image frame and perform image quality adjustment according to the identification result. In this way, the display apparatus can normally play content in which playback sections corresponding to different image quality adjustment methods are mixed.
  • The difference between steps 350 and 360 is as follows.
  • In the case of the first picture quality adjustment method (step 350), the additional data does not include the tone mapping information, that is, the information directly used for tone mapping, among the image quality adjustment information for the image frame. In this case, in order to acquire the tone mapping information, the display apparatus must generate it based on the content-related information in the additional data.
  • In the case of the second picture quality adjustment method (step 360), the tone mapping information is included in the additional data, so the display apparatus can obtain it simply by extracting it from the additional data. If necessary, the display apparatus may correct the extracted tone mapping information before using it.
  • The additional data includes, for example, metadata that can be extracted from the content data. If the picture quality adjustment method is a tone mapping method, the first picture quality adjustment method corresponds to the STM method and the second picture quality adjustment method corresponds to the DTM method; the overall per-frame flow is sketched below.
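  • The following is a rough sketch of the per-frame flow of FIG. 3 (steps 320 to 380). Every helper used here (decode_next_frame, identify_method, generate_tm_info, extract_tm_info, apply_tone_mapping) is a hypothetical placeholder; the document defines the steps, not a concrete API.

```python
# Hedged sketch of the FIG. 3 loop; all helper functions are hypothetical.

def play(content_data, display):
    while True:
        frame = decode_next_frame(content_data)           # step 320: frame + additional data
        if frame is None:                                 # no next frame: stop (step 380)
            break
        method = identify_method(frame.additional_data)   # identify the adjustment method
        if method == "first":
            # first method (step 350): no tone mapping info in the additional
            # data, so generate it from the content-related information
            tm_info = generate_tm_info(frame.additional_data, display.max_nits)
        else:
            # second method (step 360): extract the tone mapping info directly
            tm_info = extract_tm_info(frame.additional_data)
        # adjust the frame and display it (step 370), then loop to the next frame
        display.show(apply_tone_mapping(frame.image, tm_info))
```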
  • FIG. 4 is an exemplary view illustrating a principle of identifying a picture quality adjusting method of an image frame from content data by the display apparatus according to an exemplary embodiment of the present invention.
  • When the display apparatus 400 receives content data 410 including a plurality of image frames 420, it extracts the data of each image frame 420 from the content data 410 in order to display the image frames 420 sequentially in time.
  • One image frame 420 has image data 421 and metadata 422, for example.
  • the display apparatus 400 displays the image frame 420 by processing the image data 421 according to an image processing process. In this case, the display apparatus 400 may process each image frame 420 by referring to various pieces of information of the metadata 422 provided separately for each image frame 420.
  • the metadata 422 includes fields of various types of information units and values of corresponding fields.
  • A field is a formatted element of the metadata 422 that represents an item of information, and the value of the field is the data or parameter representing the value of that item.
  • the display apparatus 400 may acquire desired information by searching for a field corresponding to desired information in the metadata 422 and confirming a value of the found field.
  • The display apparatus 400 obtains, from among the information recorded in the metadata 422, the information on the image quality adjustment method specified for the image frame 420. Since such information may take various forms according to the standard of the content data 410, it is not limited to any one example.
  • the metadata 422 includes additional information 423 such as Video Usability Information (VUI) and Supplemental Enhancement Information (SEI).
  • The additional information 423 indicates whether the image frame 420 corresponds to static HDR content or to dynamic HDR content. For the image to be displayed properly, static HDR content must be processed by the STM technique and dynamic HDR content by the DTM technique; the additional information 423 therefore effectively indicates whether the image frame 420 is designated for STM or for DTM.
  • For example, a value of 16 in the transfer_characteristics field of the VUI indicates static HDR compliant with the SMPTE ST 2084 standard. Accordingly, when the value of the transfer_characteristics field is 16, the display apparatus 400 identifies that the corresponding image frame 420 is designated to be processed by the STM method.
  • Dynamic HDR content supports Static HDR by default.
  • Dynamic HDR content is created based on Static HDR content, which is produced so that it can be played on a basic playback device.
  • Content according to the ST 2094-40 dynamic HDR standard is made by dividing the content into scenes, that is, portions having similar quality characteristics within the static HDR content, and adding dynamic metadata, such as tone mapping information and content percentage information, to each scene. If the values of SEI fields such as itu_t_t35_country_code, itu_t_t35_terminal_provider_code, and itu_t_t35_terminal_provider_oriented_code are set to the values defined in ST 2094-40, the playback section or image frame can be identified as corresponding to dynamic HDR content.
  • That is, the display apparatus 400 may identify whether the image frame 420 is static HDR content based on the VUI information of the additional information 423, and may identify whether the image frame 420 is dynamic HDR content based on the SEI information.
  • However, this is only one of various examples of using the metadata 422 to identify the HDR type of the image frame 420, and the identification method is not limited to any one approach; a sketch of this particular check follows.
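  • Below is a hedged sketch of the identification logic described above. The metadata is assumed to be available as plain dictionaries; only the field names, the value 16, and the ST 2094-40 reference come from the text. The registration code values used for the SEI check are the ones commonly documented for ST 2094-40 (HDR10+) messages and should be treated as assumptions to verify against the standard.

```python
# Hedged sketch only: dictionary-based metadata access is an assumption.

def is_static_hdr(vui: dict) -> bool:
    # transfer_characteristics == 16 signals the ST 2084 transfer function,
    # which the text above treats as the static HDR indication.
    return vui.get("transfer_characteristics") == 16

def is_dynamic_hdr(sei_messages: list) -> bool:
    # Registration codes below are commonly documented for ST 2094-40 (HDR10+);
    # they are assumptions here and should be checked against the standard.
    for sei in sei_messages:
        if (sei.get("itu_t_t35_country_code") == 0xB5
                and sei.get("itu_t_t35_terminal_provider_code") == 0x003C
                and sei.get("itu_t_t35_terminal_provider_oriented_code") == 0x0001):
            return True
    return False

def identify_hdr_type(vui: dict, sei_messages: list) -> str:
    # dynamic HDR content also carries static HDR signalling, so check SEI first
    if is_dynamic_hdr(sei_messages):
        return "dynamic"
    if is_static_hdr(vui):
        return "static"
    return "unknown"
```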
  • the display apparatus performs the above process in units of image frames or in units of a predetermined number of image frames.
  • the conventional display apparatus performs an identification process for identifying whether the content data is either the static HDR content or the dynamic HDR content, based on the format data of the metadata, at the initial stage of starting the reproduction of the content data. While data is being reproduced, the identification process for each image frame is not performed. In the conventional display apparatus, if the content data is initially identified as the static HDR content, there is no further identification process for each image frame, and then the processing is performed with only one tone mapping information while the content data is reproduced. In addition, when the content data is initially identified as Dynamic HDR content, the conventional display apparatus performs processing with tone mapping information for each scene while the content data is reproduced, without further identifying each image frame.
  • In contrast, the display apparatus according to the present embodiment performs the above-described identification process for each image frame while the content data is reproduced, so that content in which static HDR sections and dynamic HDR sections are mixed can be reproduced by applying the tone mapping technique corresponding to each playback section.
  • FIG. 5 is an exemplary view illustrating a difference of a tone mapping method between an STM method and a DTM method in a display apparatus according to an exemplary embodiment of the present invention.
  • The display apparatus 500 may display HDR content as an image. For example, if the maximum brightness of an image frame of the HDR content is 1000 nits while the maximum brightness that the display apparatus 500 can display is 500 nits, the display apparatus 500 cannot express brightness higher than 500 nits in that image frame if the brightness of the content is applied as it is. Thus, the display apparatus 500 performs tone mapping using the tone mapping information, so that an image frame whose brightness is compensated to match the characteristics of the display apparatus 500 is displayed on the screen.
  • Specifically, the tone mapping information is matching information that maps the pixel value, gray value, or brightness value of each pixel of an image frame to the brightness value actually displayed on the display unit. That is, the tone mapping information describes the brightness values of the displayed image that correspond, respectively, to the brightness values of the content. Accordingly, the tone mapping information may be represented as a curve on coordinates in which the brightness value of the content is the horizontal axis and the brightness value of the displayed image is the vertical axis, and it is therefore also referred to as a tone mapping curve.
  • Both the STM and DTM methods ultimately convert the brightness value of each pixel of an image frame into a brightness value for display according to the tone mapping information, and display the image frame with the converted brightness values as an image; a minimal sketch of this per-pixel application is given below.
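  • The following illustrative sketch stores a tone mapping curve as (content brightness, display brightness) control points and applies it to each pixel by linear interpolation; the control-point values are invented for the example and carry no meaning beyond illustration.

```python
# Illustrative only: curve values are made up for the example.
import bisect

CURVE = [(0, 0), (100, 90), (500, 300), (1000, 420), (2000, 500)]

def map_luminance(l_in: float, curve=CURVE) -> float:
    xs = [p[0] for p in curve]
    ys = [p[1] for p in curve]
    if l_in <= xs[0]:
        return ys[0]
    if l_in >= xs[-1]:
        return ys[-1]
    i = bisect.bisect_right(xs, l_in)          # first control point above l_in
    x0, x1, y0, y1 = xs[i - 1], xs[i], ys[i - 1], ys[i]
    return y0 + (y1 - y0) * (l_in - x0) / (x1 - x0)

def tone_map_frame(frame_nits):
    """frame_nits: 2-D list of per-pixel luminance values in nits."""
    return [[map_luminance(v) for v in row] for row in frame_nits]

print(tone_map_frame([[50.0, 800.0, 1500.0]]))   # -> [[45.0, 372.0, 460.0]]
```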
  • the STM technique and the DTM technique differ from each other in the process for calculating tone mapping information. This difference will be described below.
  • the display apparatus 500 obtains the metadata 520 from the content data 510.
  • the display apparatus 500 identifies from the metadata 520 whether one image frame is either static HDR content or dynamic HDR content.
  • the display apparatus 500 obtains content related information 530 from the metadata 520.
  • the content related information 530 is reference information for displaying the content, and includes the brightness information of the content, the gamut information of the mastering monitor referred to when the content is produced, and the minimum and maximum brightness information of the mastering monitor.
  • In other words, the content-related information 530 describes the properties of the mastering monitor on which the image frame was mastered. Since the metadata 520 of an image frame that is static HDR content does not include tone mapping information 540 for that image frame, the display apparatus 500 must newly generate the tone mapping information 540.
  • The display apparatus 500 generates the tone mapping information 540 according to a preset algorithm based on the content-related information 530. That is, the display apparatus 500 generates tone mapping information 540 optimized for processing the static HDR content, based on the brightness information of the content specified in the content-related information 530 and the maximum brightness that the display apparatus 500 can display.
  • In the case of dynamic HDR content, on the other hand, the metadata 520 includes tone mapping information 550. Accordingly, when an image frame is identified as dynamic HDR content, the display apparatus 500 obtains the tone mapping information 550 by extracting it from the metadata 520. In this case, the display apparatus 500 may modify the details of the extracted tone mapping information 550 as necessary.
  • the HDR content requires tone mapping processing in order to be normally displayed on the display apparatus 500.
  • That is, while the metadata 520 of static HDR content does not include tone mapping information 540 for tone mapping of the corresponding content, the metadata 520 of dynamic HDR content does include tone mapping information 550 for tone mapping of the corresponding content, and in the latter case the operation performed is the extraction of the tone mapping information 550 from the metadata 520.
  • As described above, the display apparatus 500 acquires tone mapping information through different operations depending on the identified HDR type of the image frame, as sketched below.
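  • The sketch below shows the two acquisition paths of FIG. 5. The metadata layout (dictionary keys) is assumed, and the generated curve is a simple proportional scale purely for illustration, not the optimized curve referred to above.

```python
# Hedged sketch: metadata keys and the proportional curve are assumptions.

def generate_static_curve(content_max_nits: float, display_max_nits: float,
                          points: int = 9):
    """Build curve points from content-related info (static HDR path)."""
    step = content_max_nits / (points - 1)
    return [(i * step, i * step * display_max_nits / content_max_nits)
            for i in range(points)]

def clamp_curve(curve, display_max_nits: float):
    """Optionally correct an extracted curve to the panel's peak brightness."""
    return [(x, min(y, display_max_nits)) for x, y in curve]

def acquire_tone_mapping(frame_metadata: dict, hdr_type: str,
                         display_max_nits: float):
    if hdr_type == "static":
        info = frame_metadata["content_related_info"]      # e.g. content max brightness
        return generate_static_curve(info["max_luminance"], display_max_nits)
    # dynamic HDR: the curve is carried in the metadata itself
    return clamp_curve(frame_metadata["tone_mapping_info"], display_max_nits)

meta_static = {"content_related_info": {"max_luminance": 1000.0}}
meta_dynamic = {"tone_mapping_info": [(0, 0), (500, 320), (1000, 650)]}
print(acquire_tone_mapping(meta_static, "static", 500.0))
print(acquire_tone_mapping(meta_dynamic, "dynamic", 500.0))   # 650 clamped to 500
```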
  • FIG. 6 is a block diagram illustrating an operation of each processing module in the display device according to an exemplary embodiment of the present invention.
  • The display apparatus 600 includes a decoder 610 that decodes content data, a DP (Display Processing) block 620 that performs image quality enhancement and brightness adjustment on each decoded image frame, an image quality processing block 630 that performs tone mapping, a timing controller 640 that transfers image data according to the timing at which the image is displayed, a display panel 650 that displays the image data as an image, a memory 660 that stores the metadata of each decoded image frame, and a DSP (Digital Signal Processing) block 670 that transfers the metadata from the memory 660 to the image quality processing block 630 according to that timing.
  • the timing controller 640 and the display panel 650 may be individual components included in the display unit.
  • Two components of the decoder 610 and the DP block 620, or three components of the decoder 610, the DP block 620, and the image quality processing block 630 may be implemented as SOC together with the CPU.
  • the decoder 610 decodes the content data and divides the content data into image data, audio data, and metadata for each image frame.
  • the image data extracted by the decoder 610 is transferred to the DP block 620, and the metadata is stored in the memory 660.
  • HDR identification information indicating whether the image frame is either static HDR content or dynamic HDR content may be derived from the additional information in the metadata, and the HDR identification information is transmitted to the DP block 620.
  • the DP block 620 performs image processing on image data transferred from the decoder 610 for each image frame.
  • the DP block 620 processes the image data of each image frame and transmits the image data to the image quality processing block 630, and outputs a vertical synchronization signal according to the processing operation of each image frame to the DSP block 670.
  • This allows the DSP block 670 to match the timing of the image frame currently being processed among the plurality of image frames.
  • Metadata of each image frame stored in the memory 660 from the decoder 610 is stored in a different memory address for each image frame.
  • the DSP block 670 reads metadata stored in the memory 660 and generates tone mapping information or extracts tone mapping information from the metadata based on the content related information of the metadata corresponding to the identified HDR type.
  • the DSP block 670 transfers tone mapping information corresponding to the image frame to the image quality processing block 630 according to the timing of each image frame according to the input vertical synchronization signal.
  • the image quality processing block 630 performs tone mapping on the image data of the image frame received from the DP block 620 using tone mapping information of the corresponding image frame received from the DSP block 670.
  • the image quality processing block 630 transmits the image data on which tone mapping is performed to the timing controller 640, and the timing controller 640 outputs the image data to the display panel 650 so that the image is displayed.
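  • The hand-off between the decoder, the memory 660, the DSP block 670, and the image quality processing block 630 can be sketched roughly as follows; the class names and the callback are illustrative stand-ins, not an actual driver interface.

```python
# Hedged sketch: a dict key stands in for a per-frame memory address, and a
# callback stands in for the quality-processing block.

class MetadataStore:
    """Stands in for memory 660: per-frame metadata at per-frame addresses."""
    def __init__(self):
        self._slots = {}

    def put(self, frame_idx: int, metadata: dict):
        self._slots[frame_idx] = metadata

    def pop(self, frame_idx: int) -> dict:
        return self._slots.pop(frame_idx)

class DspStage:
    """Stands in for DSP block 670: keyed by the vertical sync signal."""
    def __init__(self, store: MetadataStore, deliver):
        self.store = store          # populated by the decoder stage
        self.deliver = deliver      # callback into the quality-processing stage

    def on_vsync(self, frame_idx: int):
        meta = self.store.pop(frame_idx)
        # tone mapping info was generated (static) or extracted (dynamic) earlier
        self.deliver(frame_idx, meta["tone_mapping_info"])

store = MetadataStore()
store.put(0, {"tone_mapping_info": [(0, 0), (1000, 500)]})
DspStage(store, deliver=lambda idx, tm: print(f"frame {idx}: {tm}")).on_vsync(0)
```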
  • In the embodiments described above, the display apparatus identifies the HDR type of an image frame based on preset information of the metadata extracted from the content data, for example the VUI or SEI information of metadata according to the H.265 standard.
  • the basis for identifying the type of the HDR content of the image frame is not limited to metadata.
  • the display device can identify the HDR content type without checking the VUI, SEI information, and the like.
  • For example, in the case of static HDR content, a UHD BD disc contains a playlist file.
  • The playlist file indicates whether static HDR content is stored on the disc.
  • This file information can therefore be used to determine whether the disc contains static HDR content or dynamic HDR content.
  • In addition, the display apparatus may directly decode the content indicated in the playlist file to obtain the VUI and SEI information, and then check whether static HDR related information or dynamic HDR related information is present. In this way, it identifies whether the content actually being played is in a static HDR section or a dynamic HDR section.
  • In the case of a UHD BD disc containing dynamic HDR content, there is information called disc info on the UHD BD disc, just as for a UHD BD disc containing static HDR content.
  • The disc info indicates whether the disc contains dynamic HDR content, and the information indicated in the playlist file can likewise be used to determine whether dynamic HDR content is present.
  • The display apparatus may also decode the content indicated in the playlist corresponding to the content, obtain the VUI and SEI information corresponding to the dynamic HDR related information, and then check for the presence or absence of dynamic tone mapping information. Using these methods, the display apparatus can identify whether the section currently being played is a dynamic HDR content section or a static HDR content section; the decision order is sketched below.
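  • The following rough sketch shows only the decision order described above, assuming the disc hints and the decoded metadata have already been read into plain dictionaries; the key names are illustrative and no real UHD BD structures are parsed here.

```python
# Hedged sketch: all dictionary keys are assumptions for illustration.

def identify_playing_section(disc_hints: dict, decoded_meta: dict) -> str:
    # 1) coarse hints from the disc info and the playlist file
    may_be_dynamic = (disc_hints.get("disc_info_dynamic_hdr", False)
                      or disc_hints.get("playlist_dynamic_hdr", False))
    # 2) confirmation from the VUI/SEI of the section actually being decoded
    if decoded_meta.get("has_dynamic_tone_mapping_sei"):
        return "dynamic"
    if decoded_meta.get("vui_transfer_characteristics") == 16:
        return "static"
    return "dynamic" if may_be_dynamic else "static"

print(identify_playing_section(
    {"disc_info_dynamic_hdr": True},
    {"has_dynamic_tone_mapping_sei": False, "vui_transfer_characteristics": 16}))
# -> "static": even on a disc carrying dynamic HDR, the section currently
#    playing is identified as static HDR
```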
  • FIG. 7 is an exemplary view showing the appearance of a content providing device according to an embodiment of the present invention.
  • the content providing apparatus 700 is implemented as an optical disk device.
  • the content providing device 700 is not necessarily limited to the optical disk device, and may be implemented as a set top box, a streaming content source device, or a source device with a built-in broadcast tuner.
  • the content providing device 700 is an electronic device capable of reading data from or writing data to the optical disc 710.
  • The content providing apparatus 700 processes the content data acquired from the optical disc 710 according to an image processing process such as decoding, and outputs the processed content data to the display apparatus 720 connected through an HDMI cable, so that the display apparatus 720 displays an image based on the content data.
  • the optical disc 710 includes various kinds of discs such as a DVD, a BD, a UHD-BD disc, and the like.
  • the optical disc 710 may be in the form of only acquiring already recorded data or in the form of recording new data, depending on the characteristics imparted at the time of manufacture.
  • the optical disc 710 may be in the form of only the disc itself, or may be in the form of a disc housed in a cartridge.
  • A BD, one example of the optical disc 710, is an optical recording method defined by the Blu-ray Disc Association (BDA) to store digital data for high-definition (HD) video.
  • BD uses a laser having a much shorter wavelength (405 nm) than DVD to read recorded data, so that more data can be contained in the same size as a DVD.
  • a 12-cm diameter Blu-ray disc with a single-layer recording surface can now record 25 gigabytes of data, while a dual-layer disc can store twice as much data as 50 gigabytes.
  • the content providing apparatus 700 may be functionally only capable of reading data recorded on the optical disk 710, or may be additionally capable of recording data on the optical disk 710.
  • In this embodiment, the content providing apparatus 700 is shown as a device that cannot display an image by itself; however, depending on the embodiment, the content providing apparatus 700 may also process the image data of the optical disc 710 and display it as an image itself.
  • FIG. 8 is a block diagram illustrating a content providing apparatus according to an embodiment of the present invention.
  • the content providing apparatus 800 includes a signal input / output unit 810, an optical disc reproducing unit 820, a user input unit 830, a storage unit 840, and a processor 850.
  • the signal input / output unit 810 is an interface for outputting content data to the display apparatus 860.
  • the signal input / output unit 810 includes a port or a connector provided according to the HDMI standard so that an HDMI cable is connected.
  • the content data reproduced or read in the content providing device 800 is transmitted to the display device 860 through the signal input / output unit 810 and may be converted according to a transmission standard set before transmission.
  • the optical disc reproducing unit 820 acquires content data from the optical disc and transfers the content data to the processor 850.
  • The optical disc reproducing unit 820 includes components such as a spindle motor that rotates the optical disc, a pickup module that irradiates a laser onto the optical disc and picks up the reflected light, and a laser diode driving unit that drives the laser diode of the pickup module.
  • the processor 850 controls the optical disc reproducing unit 820 in response to the recording operation of the optical disc or the reproducing operation of the optical disc.
  • the processor 850 may include one or more hardware processors implemented by a CPU, a chipset, a buffer, a circuit, or the like mounted on a printed circuit board, and may be implemented as an SOC according to a design scheme.
  • The processor 850 identifies the HDR type of each image frame of the content data while the optical disc is reproduced by the optical disc reproducing unit 820 and, according to the identified result, outputs to the display apparatus 860 either static tone mapping information or dynamic tone mapping information synchronized with the corresponding image frame.
  • the display apparatus 860 may display an image based on the image data of the image frame and the tone mapping information.
  • FIG. 9 is a block diagram showing the operation of each processing module in the content providing apparatus according to the embodiment of the present invention.
  • The content providing apparatus 900 may include a decoder 920 that decodes content data received from the optical disc reproducing unit 910, a DP block 930 that performs image quality enhancement and brightness adjustment on the image data of each decoded image frame, an HDMI transmitter 940 that transmits image data and tone mapping information to the display apparatus 970, a memory 950 that stores the metadata of each decoded image frame, and a DSP block 960 that transfers the metadata from the memory 950 to the HDMI transmitter 940 according to the timing.
  • the decoder 920 decodes the content data and divides the content data into image data, audio data, and metadata for each image frame.
  • the image data extracted by the decoder 920 is transferred to the DP block 930, and the metadata is stored in the memory 950.
  • HDR identification information indicating whether the image frame is either static HDR content or dynamic HDR content may be derived from the additional information in the metadata, and the HDR identification information is transferred to the DP block 930. Since the method of identifying HDR is substantially the same as the previous embodiment, detailed description thereof will be omitted.
  • the DP block 930 performs image processing on image data transferred from the decoder 920 for each image frame. In addition, the DP block 930 processes the image data of each image frame and transmits the image data to the HDMI transmitter 940, and outputs a vertical synchronization signal according to the processing operation of each image frame to the DSP block 960.
  • the metadata of each image frame stored in the memory 950 from the decoder 920 is stored in a different memory address for each image frame.
  • the DSP block 960 reads the metadata stored in the memory 950 and generates tone mapping information based on the content related information of the metadata corresponding to the identified HDR type, or extracts the tone mapping information from the metadata.
  • the DSP block 960 transmits the tone mapping information corresponding to the image frame to the HDMI transmitter 940 according to the timing of each image frame according to the input vertical synchronization signal.
  • the HDMI transmitter 940 transmits and receives a signal to and from the display apparatus 970 according to the HDMI standard.
  • components for transmitting and receiving signals with respect to the display apparatus 970 are not limited to the HDMI standard, and various preset transmission standards may be applied.
  • the HDMI transmitter 940 synchronizes tone mapping information of the video frame received from the DSP block 960 with respect to the video data of the video frame received from the DP block 930.
  • the HDMI transmitter 940 transmits the image data and tone mapping information of the synchronized image frame to the display apparatus 970.
  • When the display apparatus 970 receives the synchronized image data and tone mapping information of an image frame from the content providing apparatus 900, it performs tone mapping on the image data according to the tone mapping information and then displays the tone-mapped image data as an image.
  • In this manner, the content providing apparatus 900 identifies whether each image frame of the content data is static HDR content or dynamic HDR content and, according to the identified result, synchronizes either static tone mapping information or dynamic tone mapping information with the image data of that image frame and provides them to the display apparatus 970, as sketched below.
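  • The source-side pairing of FIG. 9 can be sketched as follows. The frame dictionary layout and the simple two-point generated curve are assumptions made for illustration, and no real HDMI API is used; the result is simply a list of frame/curve pairs handed to the transmitter stage.

```python
# Hedged sketch: frame layout and generated curve are assumptions.
from dataclasses import dataclass
from typing import Any, List, Tuple

@dataclass
class OutFrame:
    index: int
    pixels: Any                                 # image data from the DP-block stage
    tone_mapping: List[Tuple[float, float]]     # curve synchronized with this frame

def prepare_output(frames: list, display_max_nits: float) -> List[OutFrame]:
    out = []
    for i, f in enumerate(frames):
        if f["hdr_type"] == "static":
            # generate a curve from content-related info (simple proportional scale)
            cmax = f["metadata"]["max_content_nits"]
            curve = [(0.0, 0.0), (cmax, display_max_nits)]
        else:
            # dynamic HDR: the curve is carried in the frame's metadata
            curve = f["metadata"]["tone_mapping_info"]
        out.append(OutFrame(i, f["pixels"], curve))
    return out      # handed, frame by frame, to the HDMI transmitter stage

frames = [
    {"hdr_type": "static", "pixels": "...", "metadata": {"max_content_nits": 1000.0}},
    {"hdr_type": "dynamic", "pixels": "...",
     "metadata": {"tone_mapping_info": [(0, 0), (4000, 500)]}},
]
print(prepare_output(frames, 500.0))
```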
  • Artificial intelligence can be applied to various systems using machine learning algorithms.
  • An artificial intelligence system is a computer system that implements human-level or near-human-level intelligence, in which a machine, device, or system learns and makes judgments autonomously and improves its recognition rate and judgment accuracy based on accumulated use experience.
  • Artificial intelligence technology consists of machine learning (deep learning) technology, which uses algorithms that classify and learn the characteristics of input data by themselves, and element technologies that simulate functions of the human brain such as cognition and judgment by using machine learning algorithms.
  • The element technologies include, for example, linguistic understanding technology that recognizes human language and characters, visual understanding technology that recognizes objects in the manner of human vision, inference and prediction technology that logically infers and predicts information, knowledge representation technology that processes human experience information into knowledge data, and motion control technology that controls the autonomous driving of a vehicle or the movement of a robot.
  • the linguistic understanding is a technology for recognizing and applying a human language or a character, and includes natural language processing, machine translation, dialogue system, question and answer, speech recognition and synthesis.
  • Inference prediction is a technique of judging and logically predicting information, and includes knowledge and probability based reasoning, optimization prediction, preference based planning, and recommendation.
  • Knowledge representation is a technology that automates human experience information into knowledge data, and includes knowledge construction such as data generation and classification, knowledge management such as utilization of data, and the like.
  • Methods according to an exemplary embodiment of the present invention may be implemented in the form of program instructions that can be executed by various computer means and recorded in a computer readable medium.
  • Such computer-readable media may include, alone or in combination with the program instructions, data files, data structures, and the like.
  • Such a computer-readable medium may be a volatile or nonvolatile storage device such as a ROM, regardless of whether it is erasable or rewritable, a memory such as a RAM, a memory chip, a device, or an integrated circuit, or a storage medium that is optically or magnetically recordable and machine-readable (for example, by a computer), such as a CD, a DVD, a magnetic disk, or a magnetic tape.
  • a memory that can be included in a mobile terminal is an example of a machine-readable storage medium suitable for storing a program or programs containing instructions for implementing embodiments of the present invention.
  • the program instructions recorded on the storage medium may be those specially designed and constructed for the present invention, or may be known and available to those skilled in the art of computer software.

Abstract

A display apparatus comprises: a display unit; a receiver configured to obtain content data containing a plurality of image frames; and a processor configured to, during reproduction of the plurality of image frames belonging to the content data, identify first and second sections that differ from each other in terms of an image quality adjustment method and, according to the result of the identification, perform, on the first section, a first operation of adjusting the image quality on the basis of common image quality adjustment information or, on the second section, a second operation of adjusting the image quality on the basis of image quality adjustment information corresponding to each image frame.
PCT/KR2019/007710 2018-07-03 2019-06-26 Appareil d'affichage, son procédé de commande et support d'enregistrement WO2020009365A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR10-2018-0077347 2018-07-03
KR1020180077347A KR102572432B1 (ko) 2018-07-03 2018-07-03 디스플레이장치 및 그 제어방법과 기록매체

Publications (1)

Publication Number Publication Date
WO2020009365A1 true WO2020009365A1 (fr) 2020-01-09

Family

ID=69059323

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2019/007710 WO2020009365A1 (fr) 2018-07-03 2019-06-26 Appareil d'affichage, son procédé de commande et support d'enregistrement

Country Status (2)

Country Link
KR (1) KR102572432B1 (fr)
WO (1) WO2020009365A1 (fr)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20220070912A (ko) * 2020-11-23 2022-05-31 삼성전자주식회사 영상을 제공하는 방법 및 이를 지원하는 전자 장치

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20100027315A (ko) * 2008-09-02 2010-03-11 엘지전자 주식회사 디스플레이장치 및 그의 제어 방법
KR20170021384A (ko) * 2013-07-30 2017-02-27 돌비 레버러토리즈 라이쎈싱 코오포레이션 장면 안정 메타데이터를 발생하기 위한 시스템 및 방법들
US20150245043A1 (en) * 2014-02-25 2015-08-27 Apple Inc. Display-side adaptive video processing
WO2016027423A1 (fr) * 2014-08-19 2016-02-25 パナソニックIpマネジメント株式会社 Procédé de transmission, procédé et dispositif de reproduction
KR20170129004A (ko) * 2016-05-16 2017-11-24 엘지전자 주식회사 영상 처리 장치 및 그의 영상 처리 방법

Also Published As

Publication number Publication date
KR20200004210A (ko) 2020-01-13
KR102572432B1 (ko) 2023-08-30

Similar Documents

Publication Publication Date Title
WO2020235800A1 (fr) Appareil électronique et procédé de commande de celui-ci
WO2011005025A2 (fr) Procédé de traitement de signal et appareil correspondant utilisant la taille de l'écran d'un dispositif d'affichage
JP4935632B2 (ja) 画像処理装置、画像処理方法および画像処理プログラム
CN101998094B (zh) 通信设备和控制方法
JP2006287364A (ja) 信号出力装置及び信号出力方法
CN101690216A (zh) 传送装置、信息传送方法、接收装置及信息处理方法
WO2013172636A1 (fr) Appareil d'affichage, et son un procédé de commande
WO2021096091A1 (fr) Appareil électronique et procédé de commande associé
JP2010141410A (ja) 音声出力装置およびその制御方法
WO2018034479A1 (fr) Appareil d'affichage et support d'enregistrement
WO2020009365A1 (fr) Appareil d'affichage, son procédé de commande et support d'enregistrement
WO2019160275A1 (fr) Dispositif électronique, et procédé de génération d'image récapitulative de dispositif électronique
WO2019098619A1 (fr) Dispositif d'affichage, procédé de commande pour celui-ci et support d'enregistrement
JP2019057824A (ja) 映像出力装置、映像出力方法
WO2019177369A1 (fr) Procédé de détection de bande noire présente dans un contenu vidéo, et dispositif électronique associé
WO2018151540A1 (fr) Appareil électronique destiné à lire une publicité de substitution et procédé de commande associé
WO2023136706A1 (fr) Procédé de fonctionnement d'un dispositif d'affichage et appareil associé
WO2015020436A1 (fr) Appareil de reproduction d'image, serveur et procédés de reproduction d'image associés
US10410674B2 (en) Imaging apparatus and control method for combining related video images with different frame rates
WO2023058861A1 (fr) Dispositif électronique et son procédé de commande
WO2022039423A1 (fr) Appareil d'affichage et son procédé de commande
WO2021137580A1 (fr) Dispositif électronique et procédé de commande de celui-ci
JP2007281544A (ja) ビデオ出力装置
WO2021118032A1 (fr) Dispositif électronique et son procédé de commande
CN101606382A (zh) 媒体信号接收器及用于播放其图像的方法

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19830912

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19830912

Country of ref document: EP

Kind code of ref document: A1