WO2021075672A1 - Display device and operating method thereof - Google Patents

Display device and operating method thereof

Info

Publication number
WO2021075672A1
WO2021075672A1 · PCT/KR2020/009375 · WO 2021/075672 A1
Authority
WO
WIPO (PCT)
Prior art keywords
data
profile
bezier curve
metadata
content
Application number
PCT/KR2020/009375
Other languages
English (en)
Korean (ko)
Inventor
허도원
김범준
이민재
Original Assignee
삼성전자 주식회사
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 삼성전자 주식회사 (Samsung Electronics Co., Ltd.)
Publication of WO2021075672A1

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 9/00: Details of colour television systems
    • H04N 9/64: Circuits for processing colour signals
    • H04N 9/643: Hue control means, e.g. flesh tone control
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 5/00: Image enhancement or restoration
    • G06T 5/10: Image enhancement or restoration by non-spatial domain filtering
    • G06T 5/92
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 9/00: Details of colour television systems
    • H04N 9/79: Processing of colour television signals in connection with recording
    • H04N 9/80: Transformation of the television signal for recording, e.g. modulation, frequency changing; inverse transformation for playback
    • H04N 9/82: Transformation of the television signal for recording, the individual colour picture signal components being recorded simultaneously only
    • H04N 9/8205: Transformation of the television signal for recording, involving the multiplexing of an additional signal and the colour video signal

Definitions

  • Various embodiments relate to a display device and a method of operating the same, and more particularly, to a display device that reproduces HDR content and a method of operating the same.
  • The human-visible brightness range is approximately 10⁻⁶ nits to 10⁸ nits, and the range of brightness encountered in real life (e.g., 10,000 nits for a light bulb and 0.005 nits or less for the night sky) is considerably wide.
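The dynamic range cited above can be quantified. The following short sketch (not part of the patent; purely illustrative) computes the orders of magnitude and the equivalent photographic stops spanned by the 10⁻⁶ to 10⁸ nit range:

```python
import math

# Human-visible brightness range cited in the text (in nits)
low, high = 1e-6, 1e8

orders = math.log10(high / low)  # orders of magnitude spanned
stops = math.log2(high / low)    # equivalent photographic stops

print(round(orders))  # 14
print(round(stops))   # 47
```

Fourteen orders of magnitude is far beyond what a conventional (SDR) display can express, which is the gap HDR tone mapping is meant to bridge.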
  • HDR (High Dynamic Range)
  • For HDR content, image processing may be performed for each scene based on dynamic metadata reflecting the characteristics of each scene of the HDR content, thereby providing sharper image quality.
  • Various embodiments may provide a display device capable of preventing a sudden change in brightness when graphic content is provided during reproduction of HDR content, and a method of operating the same.
  • the display device may prevent a sudden change in brightness of a screen when switching from dynamic tone mapping to static tone mapping or switching from static tone mapping to dynamic tone mapping.
  • According to an embodiment, by detecting whether graphic content is activated, the display device may prevent distortion of the image characteristics of video content during image processing or image quality processing of the video content.
  • FIG. 1 is a diagram illustrating an HDR content reproduction system according to an exemplary embodiment.
  • FIG. 2 is a diagram referenced to explain a method of performing tone mapping by using static metadata and dynamic metadata corresponding to HDR content by a display device according to an exemplary embodiment.
  • FIG. 3 is a diagram illustrating a case in which graphic content is output while an HDR content is output by a display device according to an exemplary embodiment.
  • FIG. 4 is a block diagram illustrating a configuration of a display device according to an exemplary embodiment.
  • FIG. 5 is a diagram illustrating a change in dynamic metadata when graphic content is activated, according to an exemplary embodiment.
  • FIG. 6 is a block diagram illustrating in detail the meta data determination unit of FIG. 4.
  • FIGS. 7A and 7B are diagrams referenced to explain an operation method of the IIR filter application unit of FIG. 4.
  • FIG. 8 is a flowchart illustrating a method of operating a display device according to an exemplary embodiment.
  • FIG. 9 is a flowchart illustrating in detail step 830 (S830) of FIG. 8.
  • FIG. 10 is a block diagram illustrating a configuration of a display device according to an exemplary embodiment.
  • FIG. 11 is a block diagram illustrating a configuration of a display apparatus 1000 according to another exemplary embodiment.
  • According to an embodiment, a display device includes a memory storing one or more instructions and a processor executing the one or more instructions stored in the memory. The processor receives video content, dynamic metadata corresponding to the video content, and static metadata; determines, based on the dynamic metadata, a section in which graphic content is activated during reproduction of the video content; and determines the metadata applied to the section in which the graphic content is activated, based on profile A data and profile B data included in the dynamic metadata and on the static metadata. Based on a difference between the profile B data and conversion profile B data generated from the profile A data, the metadata may be determined to be gradually converted from the profile B data to the static metadata.
  • The processor according to an embodiment may determine a section in which the graphic content is activated based on whether the Bezier curve information of the profile B data changes from 'n' to '0'.
  • The processor according to an embodiment may determine a section in which the graphic content is activated based on whether the percentile information of the profile A data gradually changes to a value defined in the static metadata.
  • The processor according to an embodiment may generate the conversion profile B data by converting the percentile information of the profile A data of a section in which the graphic content is activated into Bezier curve information.
  • The processor according to an embodiment may determine that the metadata is gradually converted from the profile B data to the static metadata, based on a difference between a coefficient of a first Bezier curve of the profile B data and a coefficient of a second Bezier curve of the conversion profile B data.
  • The processor according to an embodiment may determine the metadata by applying an IIR filter to the first Bezier curve or the second Bezier curve, based on a difference between the coefficient of the first Bezier curve of the profile B data and the coefficient of the second Bezier curve of the conversion profile B data.
  • When the coefficient of the first Bezier curve is greater than the coefficient of the second Bezier curve, the processor according to an embodiment applies an IIR filter to the first Bezier curve and gradually transforms the first Bezier curve until its coefficient is equal to the coefficient of the second Bezier curve; when the two coefficients become the same, the second Bezier curve may be determined as the metadata.
  • When the coefficient of the first Bezier curve is smaller than the coefficient of the second Bezier curve, the processor according to an embodiment determines the first Bezier curve as the metadata until the coefficient of the first Bezier curve is equal to the coefficient of the second Bezier curve; when the two coefficients become the same, the processor applies an IIR filter to the second Bezier curve and may determine the metadata by gradually transforming the second Bezier curve.
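The two coefficient-comparison branches above can be summarized as a small decision routine. This is an illustrative sketch only; the function name and string labels are invented for clarity and do not appear in the patent:

```python
def select_iir_strategy(c1: float, c2: float) -> str:
    """Decide which Bezier curve the IIR filter is applied to,
    following the coefficient comparison described above.
    c1: coefficient of the first Bezier curve (profile B data)
    c2: coefficient of the second Bezier curve (conversion profile B data)
    """
    if c1 > c2:
        # Smooth the first curve down toward the second, then switch.
        return "filter first curve"
    if c1 < c2:
        # Hold the first curve until the coefficients match, then
        # smooth the second curve onward.
        return "hold first, then filter second curve"
    return "use second curve directly"

print(select_iir_strategy(0.8, 0.2))  # filter first curve
```

Either way, the transition avoids an abrupt jump between the two tone mapping curves.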
  • The display device according to an embodiment further includes a display, and the processor performs tone mapping of the video content and the graphic content by applying the metadata to the section in which the graphic content is activated, and controls the display to display the tone-mapped video content and graphic content.
  • the display device may further include an input/output unit for receiving the video content, the dynamic metadata, and the static metadata.
  • According to an embodiment, a method of operating a display device includes receiving video content, dynamic metadata corresponding to the video content, and static metadata; determining, based on the dynamic metadata, a section in which graphic content is activated during playback of the video content; and determining metadata applied to the section in which the graphic content is activated, based on profile A data and profile B data included in the dynamic metadata and on the static metadata. The metadata applied to the section in which the graphic content is activated may be determined, based on a difference between the profile B data and conversion profile B data generated from the profile A data, to be gradually converted from the profile B data to the static metadata.
  • The term "user" refers to a viewer who views an image displayed on an electronic device or a person who controls a function or operation of the electronic device, and may include an administrator or an installer.
  • FIG. 1 is a diagram illustrating an HDR content reproduction system according to an exemplary embodiment.
  • the content creator may provide encoded content based on the brightness and color of an image intended by the creator.
  • The high dynamic range (HDR) content 10 provides a more vivid image by improving the contrast ratio of the screen.
  • encoding information corresponding to the encoding method may be provided together.
  • The HDR content 10, encoded based on the brightness and color of an image intended by the creator, may be provided together with related metadata.
  • An HDR content playback system according to an embodiment may include a playback device 200 and a display device 100. The HDR content 10 may be stored in a data storage medium 20 including a magnetic medium such as a hard disk, a floppy disk, or a magnetic tape, an optical recording medium such as a CD-ROM or a DVD, a magneto-optical medium such as a floptical disk, and the like. The HDR content 10 stored in the storage medium may be reproduced through the playback device 200 and displayed through the display device 100.
  • the playback device 200 may be a Blu-ray player, a Digital Versatile Disc (DVD) player, or the like, but is not limited thereto, and may be implemented as various types of playback devices.
  • The display device 100 may be implemented as various electronic devices such as a TV, a large format display (LFD), a digital signage, a video wall, a mobile phone, a speaker, a tablet PC, a digital camera, a camcorder, a laptop computer, a desktop computer, an e-book terminal, a digital broadcasting terminal, a PDA (Personal Digital Assistant), a PMP (Portable Multimedia Player), a navigation device, an MP3 player, an IPTV (Internet Protocol Television), a DTV (Digital Television), and a wearable device.
  • the display device 100 may be a fixed electronic device disposed at a fixed position or a mobile electronic device having a portable form, and may be a digital broadcast receiver capable of receiving digital broadcasts.
  • When the playback device 200 reproduces the storage medium 20 on which the HDR content is recorded, it provides the HDR content and metadata corresponding to the HDR content to the display device 100 through an input/output unit.
  • the HDR content 10 may be transmitted to the display device 100 through the network 30.
  • a server (not shown) according to an embodiment may transmit HDR content to the display device 100 together with metadata through the network 30.
  • The display device 100 according to an embodiment may include a separate device connected to a TV, for example, a set-top box. That is, the display apparatus 100 may include not only a device that displays an image by itself, but also a device that transmits related data to a connected display so that an image can be displayed on that display.
  • metadata corresponding to HDR content may include static metadata or dynamic metadata.
  • the static metadata is metadata reflecting the characteristics of the entire HDR content, and means metadata that is fixedly applied to the HDR content.
  • Static metadata applies to all scenes regardless of scene changes.
  • Here, a scene refers to a section having similar image-quality characteristics, and may be distinguished from a scene defined by a spatial change in a scenario, as classified by a content producer for a conventional movie. That is, even within the same space of a scenario divided by a content creator, different scenes may be classified according to the brightness and color of the image. However, it is not limited thereto.
  • the dynamic metadata is metadata reflecting the characteristics of each scene of the HDR content, and means metadata dynamically applied to each scene of the HDR content.
  • FIG. 2 is a diagram referenced to explain a method of performing tone mapping by using static metadata and dynamic metadata corresponding to HDR content by a display device according to an exemplary embodiment.
  • HDR content may be provided together with static metadata or dynamic metadata.
  • the HDR content may be provided together with static metadata and dynamic metadata.
  • Static HDR content may be, for example, 'HDR10', and dynamic HDR content may be, for example, 'HDR10+', but they are not limited thereto.
  • Dynamic metadata may include profile A data and profile B data.
  • Profile A data may include luminance distribution information (percentile information) of an image
  • profile B data may include Bezier Curve information of an image. However, it is not limited thereto.
  • the display device may be implemented to support an HDR function.
  • the HDR function may mean a function of performing image quality conversion (or image quality processing) and tone mapping on the HDR content based on metadata provided together with the HDR content and displaying it.
  • Tone mapping is a method of expressing the original tone of the HDR content according to the dynamic range of the display device based on the received metadata. For example, the maximum luminance of the HDR content based on the metadata may be mapped to the maximum luminance that can be expressed by the display of the display device.
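The maximum-luminance mapping described above can be sketched minimally. The linear scaling below is a deliberate simplification for illustration (the patent's actual mapping uses Bezier-curve-based tone mapping graphs, and the function name is invented):

```python
def linear_tone_map(pixel_nits: float, content_max: float, display_max: float) -> float:
    """Map content luminance so the content's maximum luminance (from
    metadata) lands on the display's maximum expressible luminance.
    A naive linear stand-in for the Bezier-curve mapping in the text."""
    return pixel_nits * (display_max / content_max)

# 4000-nit-mastered HDR content shown on a 1000-nit display:
print(linear_tone_map(4000.0, 4000.0, 1000.0))  # 1000.0: content peak -> display peak
print(linear_tone_map(2000.0, 4000.0, 1000.0))  # 500.0: mid-tones scale proportionally
```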
  • The display device according to an embodiment may perform static tone mapping in a static manner. For example, when HDR content and static metadata corresponding to the HDR content are provided, the display device may perform tone mapping by applying the same tone mapping graph, based on the static metadata, to all frames included in the HDR content. As shown in (a) of FIG. 2, the display device 100 may perform static tone mapping by applying the same tone mapping graph 215 to the first to third frames 210 included in the HDR content.
  • The display apparatus 100 according to an embodiment may perform dynamic tone mapping in a dynamic manner. For example, when HDR content and dynamic metadata corresponding to each scene of the HDR content are provided, the display device may perform tone mapping of the HDR content by applying a different tone mapping graph, based on the dynamic metadata, for each scene. As shown in (b) of FIG. 2, the display apparatus 100 may perform dynamic tone mapping by applying a first tone mapping graph 225 to the first frame 211, a second tone mapping graph 235 to the second frame 212, and a third tone mapping graph 245 to the third frame 213.
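The static/dynamic distinction reduces to which tone mapping graph is selected per frame. The sketch below is illustrative only; the dict-based lookup is an assumption, with labels following the reference numerals of FIG. 2:

```python
# Static tone mapping applies one graph to every frame; dynamic tone
# mapping selects a per-scene graph from the dynamic metadata.
static_graph = "graph 215"
scene_graphs = {"scene 1": "graph 225", "scene 2": "graph 235", "scene 3": "graph 245"}

def graph_for(scene: str, dynamic: bool) -> str:
    """Return the tone mapping graph used for a frame of `scene`."""
    return scene_graphs[scene] if dynamic else static_graph

print(graph_for("scene 2", dynamic=True))   # graph 235
print(graph_for("scene 2", dynamic=False))  # graph 215
```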
  • FIG. 3 is a diagram illustrating a case in which graphic content is output while an HDR content is output by a display device according to an exemplary embodiment.
  • A display device according to an embodiment receives HDR content and dynamic metadata corresponding to the HDR content, performs tone mapping of the HDR content based on the dynamic metadata, and may output the tone-mapped HDR content.
  • the graphic content 320 may include at least one of an interactive graphic (IG), a presentation graphic (PG), and a graphical user interface (GUI).
  • IG refers to graphic content that can be selected or controlled by the user, such as a main menu graphic provided at a specific point in time (for example, at the start of the content).
  • PG refers to graphic content that is only displayed unilaterally to the user, such as subtitles and performer information.
  • the GUI refers to a UI provided according to a user command, such as a playback control menu.
  • Subtitles, if provided throughout the content, may not be included in the graphic content according to an embodiment of the present invention.
  • When the graphic content 320 is output, the display device 100 must perform tone mapping by switching from the dynamic tone mapping mode to the static tone mapping mode.
  • an equation to be used for static tone mapping is defined only for profile A data, and profile B data cannot be used. Therefore, when static tone mapping needs to be performed while dynamic tone mapping is performed using Bezier curve information of profile B data, it is necessary to convert from profile B data to profile A data.
  • In this case, the brightness of the output screen may change suddenly, which may cause viewer discomfort.
  • The display device 100 needs to perform image quality processing suitable for the graphic content 320; however, when it cannot be determined whether the graphic content 320 is output, the graphic content is judged to be a video content image. In this case, during image processing or image quality processing of the video content, the characteristics of the actual video content image may be distorted.
  • the display apparatus 100 may not detect the letter box 315. That is, the display apparatus 100 may determine that the letter box 315 does not exist even though the letter box 315 is included in the video content. Accordingly, during image processing or image quality processing of video content, characteristics of an actual video content image may be distorted.
  • Accordingly, the display device 100 may determine whether the graphic content 320 is activated while the HDR content 310 is being output, and may determine the meta data for tone mapping applied to the section in which the graphic content is activated.
  • The display apparatus 100 according to an embodiment may include a mode change detection unit 410 and a meta data determination unit 420, and the meta data determination unit 420 may include a meta data conversion unit 421, a metadata comparison unit 423, and an IIR filter application unit 425.
  • the display apparatus 100 may perform tone mapping by using dynamic metadata corresponding to HDR content in a section in which graphic content is not output (dynamic tone mapping mode).
  • the display apparatus 100 may perform tone mapping using static metadata corresponding to HDR content in a section in which graphic content is output (static tone mapping mode).
  • the display device 100 needs to switch from the dynamic tone mapping mode to the static tone mapping mode.
  • the mode change detector 410 may determine whether graphic content is output based on dynamic metadata corresponding to the HDR content.
  • Dynamic metadata may include profile A data and profile B data.
  • Profile A data may include luminance distribution information (percentile information) of an image
  • profile B data may include Bezier Curve information of an image. However, it is not limited thereto.
  • FIG. 5 is a diagram illustrating a change in dynamic metadata when graphic content is activated, according to an exemplary embodiment.
  • When graphic content is activated, the percentile information of the profile A data gradually changes to a value defined in the static metadata. For example, if the value defined in the static metadata is '9', the percentile information gradually changes from '1' to '9'.
  • the Bezier curve information of the profile B data changes from 'n' to '0'.
  • the Bezier curve information may mean the number of points (eg, a knee point or an anchor point) of the Bezier curve, but is not limited thereto.
  • Bezier curve information of profile B data changes from '8' to '0'.
  • When the graphic content is deactivated, the changes occur in reverse: the percentile information of the profile A data gradually changes from '9' to '1', and the Bezier curve information of the profile B data changes from '0' to '8'.
  • The mode change detection unit 410 may determine a section in which graphic content is activated based on changes in the percentile information of the profile A data and the Bezier curve information of the profile B data. For example, when the percentile information of the profile A data gradually changes to a value defined in the static metadata, and the Bezier curve information of the profile B data changes to 0, it may be determined that the graphic content is activated.
  • the mode change detection unit 410 may determine the first time point t1 as a start time point of activation of the graphic content.
  • The mode change detection unit 410 may determine a graphic content activation end time based on changes in the percentile information of the profile A data and the Bezier curve information of the profile B data, and may thereby determine the section 510 in which the graphic content is activated.
  • Otherwise, the mode change detection unit 410 may determine that the graphic content is not activated and that a change has occurred in the actual image of the HDR content (video content).
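The mode-change check described above can be sketched as a small predicate. This is illustrative only (the patent gives no code, and the function and parameter names are invented); graphic content is treated as activated once the profile A percentile information has reached the value defined in the static metadata and the profile B Bezier curve information has dropped to 0:

```python
def graphics_activated(percentile: int, bezier_points: int, static_value: int) -> bool:
    """Return True when the dynamic-metadata signature of an activated
    graphics section is present: percentile information has reached
    the static-metadata value and the Bezier point count is 0."""
    return percentile == static_value and bezier_points == 0

# Values from the example in the text: percentile ramps 1 -> 9 while
# the Bezier point count drops 8 -> 0.
print(graphics_activated(1, 8, 9))  # False: ordinary video playback
print(graphics_activated(9, 0, 9))  # True: graphics section detected
```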
  • the meta data determiner 420 may determine the meta data applied to the section 510 in which the graphic content is activated.
  • FIG. 6 is a block diagram illustrating in detail the meta data determination unit of FIG. 4, and FIGS. 7A and 7B are diagrams referenced to explain an operation method of the IIR filter application unit of FIG. 4.
  • the meta data converter 421 may generate profile B data (conversion profile B data) based on profile A data.
  • For example, the metadata conversion unit 421 may generate Bezier curve information of profile B data (second Bezier curve information) by using the percentile information of the profile A data.
  • Also, the meta data conversion unit 421 may generate Bezier curve information of profile B data (third Bezier curve information) by using the static meta data.
  • The metadata comparison unit 423 compares the Bezier curve information of the profile B data corresponding to the HDR content (first Bezier curve information, Profile B) with the Bezier curve information generated from the percentile information of the profile A data (second Bezier curve information, B_conv), and the Infinite Impulse Response (IIR) filter application unit 425 may determine, based on the comparison result, how to apply the IIR filter in determining the metadata applied to the section 510 in which the graphic content is activated.
  • the meta data comparison unit 423 may compare the coefficients of the first Bezier curve and the coefficients of the second Bezier curve.
  • the metadata comparison unit 423 may compare the coefficients of the first Bezier curve, the coefficient of the second Bezier curve, and the coefficient of the third Bezier curve.
  • The IIR filter application unit 425 may perform tone mapping using the first Bezier curve without applying the IIR filter.
  • By applying an IIR filter to the first Bezier curve, the IIR filter application unit 425 may gradually transform the first Bezier curve until the coefficient of the first Bezier curve becomes the same as the coefficient of the second Bezier curve.
  • the meta data determiner 420 may determine meta data corresponding to a section in which graphic content is activated (section T1) by using the first Bezier curve to which the IIR filter is applied.
  • The IIR filter application unit 425 may apply the IIR filter to the coefficient of the second Bezier curve, gradually transforming the second Bezier curve until its coefficient is equal to the coefficient of the third Bezier curve.
  • the meta data determiner 420 may determine meta data corresponding to a section in which graphic content is activated (section T2) by using the second Bezier curve to which the IIR filter is applied.
  • The metadata determination unit 420 may determine the metadata corresponding to the section in which the graphic content is activated using the third Bezier curve.
  • Based on the first Bezier curve to which the IIR filter is applied, the second Bezier curve to which the IIR filter is applied, and the third Bezier curve, the display device 100 may perform tone mapping of the HDR content (video content) and the graphic content.
  • Accordingly, a sudden change from the first Bezier curve to the second Bezier curve does not occur (for example, the dotted line graph shown in FIG. 7A), and thus a rapid change in screen brightness can be prevented.
  • Until the coefficient of the first Bezier curve becomes equal to the coefficient of the second Bezier curve, the meta data determination unit 420 may determine the meta data corresponding to the section in which the graphic content is activated (section T3) using the first Bezier curve.
  • The IIR filter application unit 425 may apply the IIR filter to the second Bezier curve, gradually transforming the second Bezier curve until its coefficient becomes the same as the coefficient of the third Bezier curve.
  • the meta data determiner 420 may determine the converted second Bezier curve as meta data corresponding to the section T4 in which the graphic content is activated.
  • The metadata determination unit 420 may determine the metadata corresponding to the section in which the graphic content is activated using the third Bezier curve.
  • Based on the first Bezier curve, the second Bezier curve to which the IIR filter is applied, and the third Bezier curve, the display device 100 may perform tone mapping of the HDR content (video content) and the graphic content in the section in which the graphic content is activated.
  • Accordingly, a sudden change from the first Bezier curve to the second Bezier curve does not occur (for example, the dotted line graph shown in FIG. 7B), so that a sudden change in screen brightness can be prevented.
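The gradual transformation described above is the behavior of a first-order IIR filter (exponential smoothing). The sketch below illustrates the idea; the filter coefficient `alpha` and the function name are assumptions, since the patent describes the smoothing only qualitatively:

```python
def iir_step(current: float, target: float, alpha: float = 0.9) -> float:
    """One step of a first-order IIR filter: move `current` a fixed
    fraction of the remaining distance toward `target`."""
    return alpha * current + (1.0 - alpha) * target

# Gradually transform a first-Bezier-curve coefficient toward the
# second curve's coefficient so screen brightness changes smoothly.
c_first, c_second = 0.8, 0.2
for _ in range(50):
    c_first = iir_step(c_first, c_second)
print(abs(c_first - c_second) < 1e-2)  # True: the coefficients have converged
```

Because each step moves only a fraction of the remaining gap, the tone mapping curve (and hence screen brightness) never jumps, which is exactly the property FIGS. 7A and 7B illustrate.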
  • FIG. 8 is a flowchart illustrating a method of operating a display device according to an exemplary embodiment.
  • the display apparatus 100 may acquire video content, dynamic metadata corresponding to the video content, and static metadata (S810).
  • Video content according to an embodiment may be HDR content.
  • the display device 100 may receive HDR content, dynamic metadata corresponding to the HDR content, and static metadata through an input/output unit.
  • the display device 100 may receive HDR content, dynamic metadata corresponding to the HDR content, and static metadata from an external server through a network.
  • the display apparatus 100 may determine a section in which the graphic content is activated based on dynamic metadata corresponding to the HDR content (S820).
  • Dynamic metadata may include profile A data and profile B data.
  • Profile A data may include luminance distribution information (percentile information) of an image
  • profile B data may include Bezier Curve information of an image. However, it is not limited thereto.
  • The display apparatus 100 according to an embodiment may determine a section in which graphic content is activated based on changes in the percentile information of the profile A data and the Bezier curve information of the profile B data.
  • the display apparatus 100 may determine metadata applied to a section in which graphic content is activated (S830).
  • Step 830 (S830) will be described in detail with reference to FIG. 9.
  • FIG. 9 is a flowchart illustrating in detail step 830 (S830) of FIG. 8.
  • The display apparatus 100 according to an embodiment may convert the percentile information of profile A data into second Bezier curve information (S910). Also, the display device 100 may convert static metadata into third Bezier curve information.
  • the display apparatus 100 may compare the first Bezier curve information and the second Bezier curve information of the profile B data.
  • The display apparatus 100 may compare the coefficients of the first Bezier curve and the coefficients of the second Bezier curve (S920).
  • the display apparatus 100 may perform tone mapping using the first Bezier curve without applying an IIR filter.
  • The display apparatus 100 may apply an IIR filter to the first Bezier curve, gradually transforming the first Bezier curve until the coefficient of the first Bezier curve becomes equal to the coefficient of the second Bezier curve (S930).
  • the display apparatus 100 may determine metadata corresponding to a section in which graphic content is activated using the first Bezier curve to which the IIR filter is applied.
  • Until the coefficient of the first Bezier curve becomes the same as the coefficient of the second Bezier curve, the display device 100 may determine meta data corresponding to a section in which graphic content is activated using the first Bezier curve (S940).
  • The display apparatus 100 may apply an IIR filter to the coefficient of the second Bezier curve, gradually transforming the second Bezier curve until the coefficient of the second Bezier curve becomes the same as the coefficient of the third Bezier curve (S950).
  • the display apparatus 100 may determine metadata corresponding to a section in which graphic content is activated using the second Bezier curve to which the IIR filter is applied.
  • the display apparatus 100 may perform tone mapping of HDR content (video content) and graphic content in a section in which graphic content is activated (S840).
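The two-stage transition of steps S930 through S950 can be sketched end to end. This is an illustrative assumption-laden sketch, not the patent's implementation: the function name, the exponential-smoothing form, and the `alpha`/`tol` values are all invented for clarity:

```python
def transition_sequence(c_first: float, c_second: float, c_third: float,
                        alpha: float = 0.9, tol: float = 1e-3) -> list[float]:
    """Smooth the first Bezier coefficient toward the second
    (S930/S940), then the second toward the third (S950), returning
    the per-frame coefficient sequence for the graphics section."""
    seq = []
    c = c_first
    while abs(c - c_second) > tol:           # first -> second (S930/S940)
        c = alpha * c + (1 - alpha) * c_second
        seq.append(c)
    c = c_second
    while abs(c - c_third) > tol:            # second -> third (S950)
        c = alpha * c + (1 - alpha) * c_third
        seq.append(c)
    return seq

seq = transition_sequence(0.9, 0.5, 0.1)
print(seq[0] > seq[-1])  # True: the coefficient descends smoothly, with no jump
```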
  • FIG. 10 is a block diagram illustrating a configuration of a display device according to an exemplary embodiment.
  • the display apparatus 100 may include an input/output unit 110, a processor 120, and a memory 130.
  • Under the control of the processor 120, the input/output unit 110 may receive video (e.g., a moving image), audio (e.g., voice, music, etc.), and additional information (e.g., an EPG, etc.) from outside the display device 100. Alternatively, the input/output unit 110 may transmit video, audio, and additional information to an external device under the control of the processor 120.
  • the input/output unit 110 may include one of an HDMI port (High-Definition Multimedia Interface port), a component jack, a PC port, and a USB port.
  • the input/output unit 110 may include a combination of an HDMI port, a component jack, a PC port, and a USB port.
  • when HDR content according to an embodiment is stored in a data storage medium and played back by an external playback device, the input/output unit 110 may receive, from the external playback device, the HDR content, dynamic metadata corresponding to the HDR content, and static metadata.
  • the processor 120 may control the overall operation of the display device 100.
  • the processor 120 may control other components included in the display apparatus 100 to perform a predetermined operation.
  • the processor 120 may execute one or more programs stored in the memory 130.
  • the processor 120 may include a single core, a dual core, a triple core, a quad core, or a multiple thereof.
  • the processor 120 may include a plurality of processors.
  • the memory 130 may store various data, programs, or applications for driving and controlling the electronic device 100.
  • a program stored in the memory 130 may include one or more instructions.
  • a program (one or more instructions) or an application stored in the memory 130 may be executed by the processor 120.
  • the processor 120 may perform, or control to be performed, at least one of the operations of the mode change detection unit 410, the metadata conversion unit 421, the metadata comparison unit 423, and the IIR filter application unit 425 illustrated and described with reference to FIG. 4. For example, while outputting HDR content, the processor 120 may determine whether graphic content is output based on the dynamic metadata. For example, a section in which graphic content is activated may be determined depending on whether the percentage information of the profile A data included in the dynamic metadata gradually changes to a preset value, and whether the Bezier curve information of the profile B data included in the dynamic metadata changes from 'n' to '0'.
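As a rough sketch of the activation check just described: the field names (`percentage`, `bezier_order`), the preset value, and the assumption that the percentage increases monotonically toward the preset value are all hypothetical — the embodiment only states that the profile A percentage information gradually reaches a preset value while the profile B Bezier curve order drops from 'n' to 0.

```python
# Hypothetical sketch of detecting a section in which graphic content is
# activated from a run of dynamic-metadata frames. Field names, the preset
# value, and the monotonicity assumption are illustrative, not from the patent.

PRESET_PERCENTAGE = 100  # assumed preset value for the profile A percentage

def graphics_activated(frames_metadata):
    """Return True if profile A percentage gradually reaches the preset value
    while profile B Bezier curve order changes from n (> 0) to 0."""
    percentages = [m["profile_a"]["percentage"] for m in frames_metadata]
    orders = [m["profile_b"]["bezier_order"] for m in frames_metadata]
    reaches_preset = percentages[-1] == PRESET_PERCENTAGE
    gradual = all(b >= a for a, b in zip(percentages, percentages[1:]))
    order_dropped = orders[0] > 0 and orders[-1] == 0
    return reaches_preset and gradual and order_dropped

# Sample metadata for three consecutive frames (illustrative values).
frames = [
    {"profile_a": {"percentage": 80}, "profile_b": {"bezier_order": 9}},
    {"profile_a": {"percentage": 90}, "profile_b": {"bezier_order": 4}},
    {"profile_a": {"percentage": 100}, "profile_b": {"bezier_order": 0}},
]
```

With the sample `frames` above, `graphics_activated(frames)` returns True; if the Bezier curve order never dropped to 0, it would return False.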
  • the processor 120 may determine metadata applied to a section in which the graphic content is activated.
  • the processor 120 may generate Bezier curve information (second Bezier curve information) of the profile B data by using the percentage information of the profile A data.
  • the processor 120 may generate Bezier curve information (third Bezier curve information) of profile B data by using static metadata.
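The two conversions above can be sketched as follows. The patent does not specify either mapping, so both helpers are illustrative placeholders: the identity/compressed anchor shapes, the 0–100 percentage range, and the use of the mastering display peak luminance are all assumptions. They only show that profile A percentage information and static metadata can each be reduced to a set of Bezier anchor coefficients comparable with the profile B data.

```python
# Hypothetical sketches of generating second and third Bezier curve information.

def bezier_from_percentage(percentage, num_anchors=3):
    """Second Bezier curve info: blend identity anchors toward a compressed
    tone curve in proportion to the profile A percentage (assumed 0-100)."""
    identity = [(k + 1) / (num_anchors + 1) for k in range(num_anchors)]
    compressed = [min(1.0, 2 * a) for a in identity]  # placeholder target shape
    t = percentage / 100.0
    return [(1 - t) * i + t * c for i, c in zip(identity, compressed)]

def bezier_from_static_metadata(max_display_luminance, num_anchors=3):
    """Third Bezier curve info: a single fixed curve derived from static
    metadata (here, only the mastering display peak luminance is used)."""
    gain = min(1.0, 1000.0 / max_display_luminance)  # assumed normalization
    identity = [(k + 1) / (num_anchors + 1) for k in range(num_anchors)]
    return [min(1.0, a * (1 + gain)) for a in identity]
```

A percentage of 0 leaves the identity anchors untouched, while larger percentages pull the anchors toward the compressed shape, giving the processor a profile-B-style curve to compare against.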
  • the processor 120 may determine the metadata based on the difference between the Bezier curve information (first Bezier curve information) of the profile B data corresponding to the HDR content and the Bezier curve information (second Bezier curve information) generated from the percentage information of the profile A data.
  • the processor 120 may perform tone mapping of the HDR content (video content) and the graphic content in a section in which the graphic content is activated based on the determined metadata. In addition, the processor 120 may control the HDR content and graphic content for which the tone mapping has been performed to be displayed on the display.
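For illustration, an Nth-order Bezier tone-mapping curve of the kind carried in the profile B data can be evaluated with the standard Bernstein-polynomial form. This is a generic sketch: the fixed endpoints P0 = 0 and PN = 1 and the sample anchor values are assumptions, and real HDR10+-style metadata may parameterize the curve differently (e.g., with a knee point).

```python
# Sketch: evaluate an Nth-order Bezier tone-mapping curve for a normalized
# luminance sample x in [0, 1], with interior control points given by anchors.
from math import comb

def bezier_tone_map(x, anchors):
    """Evaluate a Bezier curve with control points P0=0, P1..P(N-1)=anchors, PN=1."""
    points = [0.0] + list(anchors) + [1.0]
    n = len(points) - 1
    return sum(comb(n, k) * (1 - x) ** (n - k) * x ** k * p
               for k, p in enumerate(points))

# Endpoints are preserved: 0 maps to 0 and 1 maps to 1.
assert bezier_tone_map(0.0, [0.4, 0.7]) == 0.0
assert bezier_tone_map(1.0, [0.4, 0.7]) == 1.0
```

Anchors above the identity line (as in the example) brighten mid-tones, which is the typical effect of tone-mapping HDR content down to a less capable display.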
  • the processor 120 may transmit, to other components in the display device 100, information on whether the graphic content is output or on the section in which the graphic content is activated, so that when the other components perform image processing or image-quality processing, the characteristics of the actual video content image can be prevented from being distorted by the graphic content.
  • FIG. 11 is a block diagram illustrating a configuration of a display apparatus 1000 according to another exemplary embodiment.
  • the display device 1000 of FIG. 11 may be an embodiment of the display device 100 described with reference to FIGS. 1 to 10.
  • a display device 1000 includes a tuner unit 1040, a processor 1010, a display unit 1020, a communication unit 1050, a sensing unit 1030, an input/output unit 1070, a video processing unit 1080, an audio processing unit 1085, an audio output unit 1060, a memory 1090, and a power supply unit 1095.
  • since the input/output unit 1070 of FIG. 11 corresponds to the input/output unit 110 of FIG. 10, the processor 1010 of FIG. 11 corresponds to the processor 120 of FIG. 10, and the memory 1090 of FIG. 11 corresponds to the memory 130 of FIG. 10, redundant descriptions thereof will be omitted.
  • the tuner unit 1040 may select only the frequency of a channel intended to be received by the display device 1000 from among many radio wave components, by tuning the frequency through amplification, mixing, resonance, or the like of a broadcast signal received by wire or wirelessly.
  • the broadcast signal includes audio, video, and additional information (eg, Electronic Program Guide (EPG)).
  • the tuner unit 1040 may receive broadcast signals from various sources such as terrestrial broadcasting, cable broadcasting, satellite broadcasting, and Internet broadcasting.
  • the tuner unit 1040 may receive a broadcast signal from a source such as analog broadcast or digital broadcast.
  • the communication unit 1050 may transmit and receive data or signals with an external device or a server under the control of the processor 1010.
  • the processor 1010 may transmit/receive content to/from an external device connected through the communication unit 1050, download an application from the external device, or browse the web.
  • the communication unit 1050 may transmit and receive data or signals using at least one of a wireless LAN 1051 (for example, Wi-Fi), Bluetooth 1052, wired Ethernet 1053, infrared (IR), Bluetooth Low Energy (BLE), ultrasound, and Zigbee, corresponding to the performance and structure of the display device 1000.
  • the communication unit 1050 may receive HDR content, metadata corresponding to the HDR content, and graphic content from an external device.
  • the video processing unit 1080 processes video data received by the display apparatus 1000.
  • the video processing unit 1080 may perform various image processing such as decoding, scaling, noise filtering, frame rate conversion, and resolution conversion on video data.
  • the sensing unit 1030 detects a user's voice, a user's image, or a user's interaction, and may include a microphone 1031, a camera unit 1032, and a light receiving unit 1033.
  • the microphone 1031 receives a user's uttered voice.
  • the microphone 1031 may convert the received voice into an electrical signal and output it to the processor 1010.
  • the user voice may include, for example, a voice corresponding to a menu or function of the electronic device 1000.
  • the camera unit 1032 may receive an image (eg, a continuous frame) corresponding to a user's motion including a gesture in the camera recognition range.
  • the processor 1010 may select a menu displayed on the electronic device 1000 using the received motion recognition result or may perform a control corresponding to the motion recognition result.
  • the light receiving unit 1033 receives an optical signal (including a control signal) received from an external control device through a light window (not shown) of the bezel of the display unit 1020.
  • the light receiving unit 1033 may receive an optical signal corresponding to a user input (eg, a touch, a push, a touch gesture, a voice, or a motion) from the control device.
  • a control signal may be extracted from the received optical signal under control of the processor 1010.
  • the input/output unit 1070 may receive video (for example, moving images), audio (for example, voice, music, etc.), and additional information (for example, an EPG, etc.) from outside the display device 1000 under the control of the processor 1010. Alternatively, the input/output unit 1070 may transmit video, audio, and additional information to an external device under the control of the processor 1010.
  • the input/output unit 1070 may include one of an HDMI port (High-Definition Multimedia Interface port) 1071, a component jack 1072, a PC port 1073, and a USB port 1074.
  • the processor 1010 controls the overall operation of the display device 1000 and signal flows between the internal components of the display device 1000, and processes data.
  • the processor 1010 may execute an OS (Operating System) and various applications stored in the memory 1090 when there is a user input or when a preset and stored condition is satisfied.
  • the processor 1010 may include a RAM that stores signals or data input from the outside of the display device 1000 or is used as a storage area corresponding to various tasks performed by the display device 1000, a ROM in which a control program for controlling the display device 1000 is stored, and a processor.
  • the processor 1010 may include a graphic processing unit (not shown).
  • the graphic processing unit (not shown) generates a screen including various objects such as icons, images, texts, etc. using an operation unit (not shown) and a rendering unit (not shown).
  • the operation unit calculates attribute values, such as coordinate values, shapes, sizes, and colors, with which each object is to be displayed according to the layout of the screen, using the user input sensed through the sensing unit 1030. The rendering unit generates screens of various layouts including the objects based on the attribute values calculated by the operation unit. The screen generated by the rendering unit is displayed in the display area of the display unit 1020.
  • the display unit 1020 converts an image signal, a data signal, an OSD signal, a control signal, and the like processed by the processor 1010 to generate a driving signal.
  • the display unit 1020 may be implemented as a PDP, LCD, OLED, flexible display, or the like, and may also be implemented as a 3D display.
  • the display unit 1020 may be configured as a touch screen and thus used as an input device in addition to an output device.
  • the audio processing unit 1085 processes audio data.
  • the audio processing unit 1085 may perform various processing such as decoding, amplification, noise filtering, and the like for audio data. Meanwhile, the audio processing unit 1085 may include a plurality of audio processing modules to process audio corresponding to a plurality of contents.
  • the audio output unit 1060 outputs audio included in a broadcast signal received through the tuner unit 1040 under the control of the processor 1010.
  • the audio output unit 1060 may output audio (e.g., voice, sound) input through the communication unit 1050 or the input/output unit 1070.
  • the audio output unit 1060 may output audio stored in the memory 1090 under the control of the processor 1010.
  • the audio output unit 1060 may include at least one of a speaker 1061, a headphone output terminal 1062, or an S/PDIF (Sony/Philips Digital Interface) output terminal 1063.
  • the audio output unit 1060 may include a combination of the speaker 1061, the headphone output terminal 1062, and the S/PDIF output terminal 1063.
  • the power supply unit 1095 supplies power input from an external power source to components inside the display apparatus 1000 under the control of the processor 1010. Also, the power supply unit 1095 may supply power output from one or more batteries (not shown) located inside the display apparatus 1000 under the control of the processor 1010 to internal components.
  • the memory 1090 may store various data, programs, or applications for driving and controlling the display apparatus 1000 under the control of the processor 1010.
  • the memory 1090 may include a broadcast reception module, a channel control module, a volume control module, a communication control module, a voice recognition module, a motion recognition module, a light reception module, a display control module, an audio control module, an external input control module, a power control module, a power control module of an external device connected wirelessly (e.g., via Bluetooth), a voice database (DB), or a motion database (DB).
  • modules and databases not shown in the memory 1090 may be implemented in the form of software to perform a broadcast reception control function, a channel control function, a volume control function, a communication control function, a voice recognition function, a motion recognition function, a light reception control function, a display control function, an audio control function, an external input control function, a power control function, or a power control function of an external device connected wirelessly (e.g., via Bluetooth).
  • the processor 1010 may perform each of these functions using the software stored in the memory 1090.
  • FIGS. 10 and 11 are block diagrams for an exemplary embodiment.
  • Each component of the block diagram may be integrated, added, or omitted according to the specifications of the display devices 100 and 1000 that are actually implemented. That is, if necessary, two or more components may be combined into one component, or one component may be subdivided into two or more components to be configured.
  • the functions performed by each block are for describing the embodiments, and specific operations or devices thereof do not limit the scope of the present invention.
  • the method of operating a display device may be implemented in the form of program instructions that can be executed through various computer means and recorded in a computer-readable medium.
  • the computer-readable medium may include program instructions, data files, data structures, and the like alone or in combination.
  • the program instructions recorded in the medium may be specially designed and configured for the present invention, or may be known and usable to those skilled in computer software.
  • examples of computer-readable recording media include magnetic media such as hard disks, floppy disks, and magnetic tapes; optical media such as CD-ROMs and DVDs; magneto-optical media such as floptical disks; and hardware devices specially configured to store and execute program instructions, such as ROM, RAM, and flash memory.
  • Examples of program instructions include not only machine language codes such as those produced by a compiler, but also high-level language codes that can be executed by a computer using an interpreter or the like.
  • a method of operating a display device may be provided by being included in a computer program product.
  • Computer program products can be traded between sellers and buyers as commodities.
  • the computer program product may include a S/W program and a computer-readable storage medium in which the S/W program is stored.
  • a computer program product may include a product in the form of a S/W program (e.g., a downloadable app) that is electronically distributed through a manufacturer of a broadcast receiving device or through an electronic market (e.g., Google Play Store, App Store).
  • the storage medium may be a server of a manufacturer, a server of an electronic market, or a storage medium of a relay server that temporarily stores the S/W program.
  • the computer program product may include a storage medium of a server or a storage medium of a client device in a system composed of a server and a client device.
  • when there is a third device (e.g., a smartphone) communicatively connected to the server or the client device, the computer program product may include a storage medium of the third device.
  • the computer program product may include a S/W program itself transmitted from a server to a client device or a third device, or transmitted from a third device to a client device.
  • one of the server, the client device, and the third device may execute the computer program product to perform the method according to the disclosed embodiments.
  • two or more of a server, a client device, and a third device may execute a computer program product to distribute and implement the method according to the disclosed embodiments.
  • for example, a server (e.g., a cloud server or an artificial intelligence server) may execute a computer program product stored in the server to control a client device communicatively connected to the server to perform the method according to the disclosed embodiments.

Abstract

A display device according to a disclosed embodiment includes: a memory storing one or more instructions; and a processor executing the one or more instructions stored in the memory. The processor may receive video content and dynamic metadata and static metadata corresponding to the video content, may determine, based on the dynamic metadata, a section in which graphic content is to be activated during playback of the video content, may determine metadata to be applied to the section in which the graphic content is to be activated based on profile A data and profile B data included in the dynamic metadata and on the static metadata, and may determine the metadata such that it is gradually converted from the profile B data toward the static metadata based on the difference between the profile B data and converted profile B data generated based on the profile A data.
PCT/KR2020/009375 2019-10-16 2020-07-16 Display device and operation method thereof WO2021075672A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR10-2019-0128717 2019-10-16
KR1020190128717A KR20210045227A (ko) 2019-10-16 Display device and operation method thereof

Publications (1)

Publication Number Publication Date
WO2021075672A1 true WO2021075672A1 (fr) 2021-04-22

Family

ID=75537863

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2020/009375 WO2021075672A1 (fr) Display device and operation method thereof

Country Status (2)

Country Link
KR (1) KR20210045227A (fr)
WO (1) WO2021075672A1 (fr)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113411533B (zh) * 2021-06-15 2023-03-31 Samsung Electronics (China) R&D Center Method and apparatus for converting a high dynamic range format

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180018932A1 (en) * 2016-05-27 2018-01-18 Dolby Laboratories Licensing Corporation Transitioning between video priority and graphics priority
KR20190000762A * 2017-06-23 2019-01-03 Samsung Electronics Co., Ltd. Electronic device, display device, and control method therefor
KR20190008070A * 2017-07-13 2019-01-23 Samsung Electronics Co., Ltd. Electronic device, display device, and control method therefor
WO2019069483A1 * 2017-10-06 2019-04-11 Panasonic Intellectual Property Management Co., Ltd. Image display device and image display method
KR20190059006A * 2017-11-22 2019-05-30 Thomson Licensing Method and device for reconstructing a display-adaptive HDR image


Also Published As

Publication number Publication date
KR20210045227A (ko) 2021-04-26

Similar Documents

Publication Publication Date Title
US20060181645A1 (en) TV and method of setting wallpaper or screen saver mode thereof
JP5515389B2 (ja) オーディオ処理装置及びオーディオ処理方法
CN101146199B (zh) 视频信息处理设备、视频信息处理方法
WO2018043977A1 (fr) Appareil d'affichage d'image et son procédé de fonctionnement
KR102276855B1 (ko) 영상 컨텐츠를 재생하는 재생 장치 및 그 동작방법
WO2021137437A1 (fr) Appareil d'affichage et procédé de commande associé
WO2018131806A1 (fr) Appareil électronique et son procédé de fonctionnement
WO2020101189A1 (fr) Appareil de traitement d'image et d'audio et son procédé de fonctionnement
WO2017146518A1 (fr) Serveur, appareil d'affichage d'image et procédé pour faire fonctionner l'appareil d'affichage d'image
WO2019031767A1 (fr) Appareil d'affichage et procédé de commande associé
WO2021075672A1 (fr) Dispositif d'affichage et son procédé de fonctionnement
WO2019194457A1 (fr) Appareil électronique, procédé de commande de celui-ci, et support d'enregistrement associé
WO2021251632A1 (fr) Dispositif d'affichage pour générer un contenu multimédia et procédé de mise en fonctionnement du dispositif d'affichage
WO2019031718A1 (fr) Appareil électronique et procédé de commande associé
WO2020122554A1 (fr) Appareil d'affichage et son procédé de commande
US7327402B2 (en) Video displayer facilitating channels and video/audio input settings
WO2020184856A1 (fr) Dispositif de réception de diffusion et procédé de fonctionnement de celui-ci
WO2022080866A1 (fr) Dispositif électronique et son procédé de fonctionnement
WO2022181865A1 (fr) Dispositif d'affichage et son procédé de fonctionnement
US20070171307A1 (en) Media playback system with real-time camera image display and method thereof
WO2020111744A1 (fr) Dispositif électronique et procédé de commande associé
WO2021004046A1 (fr) Procédé et appareil de traitement audio, et dispositif d'affichage
CN113542829A (zh) 分屏显示方法、显示终端及可读存储介质
WO2022164193A1 (fr) Dispositif d'affichage et son procédé de fonctionnement
WO2014119896A1 (fr) Appareil d'affichage et son procédé d'affichage de menu

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20876663

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20876663

Country of ref document: EP

Kind code of ref document: A1