CN106796775A - Display device and method for controlling the display device - Google Patents


Info

Publication number
CN106796775A
CN106796775A (application CN201580054198.4A)
Authority
CN
China
Prior art keywords
image
brightness value
mapping function
dynamic range
area
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201580054198.4A
Other languages
Chinese (zh)
Other versions
CN106796775B (en)
Inventor
韩升勳
李尚昱
徐贵原
金昌源
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Samsung Electronics Co Ltd
Original Assignee
Samsung Electronics Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Samsung Electronics Co Ltd filed Critical Samsung Electronics Co Ltd
Priority to CN201911350815.1A (published as CN110992914B)
Priority claimed from PCT/KR2015/010387 (published as WO2016056787A1)
Publication of CN106796775A
Application granted
Publication of CN106796775B
Active legal status
Anticipated expiration legal status


Classifications

    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/003Details of a display terminal, the details relating to the control arrangement of the display terminal and to the interfaces thereto
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/44Receiver circuitry for the reception of television signals according to analogue transmission standards
    • H04N5/57Control of contrast or brightness
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41Structure of client; Structure of client peripherals
    • H04N21/422Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41Structure of client; Structure of client peripherals
    • H04N21/422Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
    • H04N21/42204User interfaces specially adapted for controlling a client device through a remote control device; Remote control devices therefor
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41Structure of client; Structure of client peripherals
    • H04N21/426Internal components of the client ; Characteristics thereof
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/44Receiver circuitry for the reception of television signals according to analogue transmission standards
    • H04N5/445Receiver circuitry for the reception of television signals according to analogue transmission standards for displaying additional information
    • H04N5/45Picture in picture, e.g. displaying simultaneously another television channel in a region of the screen
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/77Circuits for processing the brightness signal and the chrominance signal relative to each other, e.g. adjusting the phase of the brightness signal relative to the colour signal, correcting differential gain or differential phase
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2320/00Control of display operating conditions
    • G09G2320/06Adjustment of display parameters
    • G09G2320/0666Adjustment of display parameters for control of colour parameters, e.g. colour temperature
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2320/00Control of display operating conditions
    • G09G2320/06Adjustment of display parameters
    • G09G2320/0673Adjustment of display parameters for control of gamma adjustment, e.g. selecting another gamma curve
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2340/00Aspects of display data processing
    • G09G2340/04Changes in size, position or resolution of an image
    • G09G2340/0407Resolution change, inclusive of the use of different resolutions for different screen areas
    • G09G2340/0428Gradation resolution change
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2360/00Aspects of the architecture of display systems
    • G09G2360/16Calculation or use of calculated indices related to luminance levels in display data

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Controls And Circuits For Display Device (AREA)
  • Image Processing (AREA)

Abstract

A display device includes: a content receiving unit configured to receive a high dynamic range image; an image processing unit configured to detect, in the high dynamic range image, a first area whose brightness value is equal to or greater than a reference brightness value, and to perform tone mapping on the image of the first area based on characteristic information of the image of the first area; and a display unit configured to display the low dynamic range image on which the tone mapping has been performed.

Description

Display device and method for controlling the display device
Technical field
Embodiments relate to a display device that converts a high dynamic range image into a displayable dynamic range and displays it, and to a method of controlling the display device.
Background art
It is conventionally known that the brightness of the real world has a dynamic range of about 100,000,000:1. It is also known that the contrast range distinguishable by the human eye (that is, the dynamic range of the human eye) is about 1,000:1 to 10,000:1, and that the dynamic range of a state-of-the-art camera is about 10,000:1.
On the other hand, the liquid crystal display panels, plasma display panels, organic light-emitting diode panels, and the like that are widely used as display devices have a dynamic range of about 100:1 to 1,000:1.
In other words, the dynamic range of an image that can be output from a display device is narrower than the dynamic range that can be distinguished by the human eye or detected by a camera or the like.
Accordingly, an image whose dynamic range is greater than the dynamic range of an image that can be output from a conventional display device is referred to as a high dynamic range (HDR) image. Conversely, an image whose dynamic range is equal to or less than the dynamic range of an image that can be output from a conventional display device is referred to as a low dynamic range (LDR) image.
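The HDR/LDR distinction above is relative to what a given display can reproduce, and can be made concrete with a small numeric check. The sketch below is illustrative only — the function names and sample figures are assumptions, using the approximate ratios quoted above — and is not a procedure defined in this disclosure.

```python
def dynamic_range(max_luminance, min_luminance):
    """Contrast ratio of a signal or device, e.g. 1000.0 means 1,000:1."""
    return max_luminance / min_luminance

def is_hdr_for_display(image_max, image_min, display_max, display_min):
    """An image is 'high dynamic range' relative to a display when its
    contrast ratio exceeds the ratio the display can output."""
    return dynamic_range(image_max, image_min) > dynamic_range(display_max, display_min)

# Approximate figures from the background discussion: a state-of-the-art
# camera reaches about 10,000:1, while a typical panel reaches about 1,000:1.
camera_image = (1000.0, 0.1)   # 10,000:1
lcd_panel = (500.0, 0.5)       # 1,000:1
print(is_hdr_for_display(*camera_image, *lcd_panel))  # True
```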
When a high dynamic range image is input from such an image source, the display device performs an operation of converting the high dynamic range image into a displayable dynamic range. Such an operation is referred to as "tone mapping."
Tone mapping methods in the related art include the following: a method of compressing the entire dynamic range to convert the high dynamic range image into a low dynamic range image; a method of directly displaying the high dynamic range image on a display device having a low dynamic range; and the like.
However, according to the method of compressing the entire dynamic range of the high dynamic range image, there is a problem in that the lightness of the image output from the display device is significantly reduced compared with the original image.
In addition, according to the method of directly displaying the high dynamic range image on a display device having a low dynamic range, there is a problem in that image information at brightness levels that the display device cannot reproduce is not displayed.
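The first related-art method, and the darkening problem it causes, can be sketched numerically: a few bright highlights dominate the scaling factor, so the bulk of the image is pushed down. This is a minimal illustration under assumed pixel values, not code from the disclosure.

```python
from statistics import mean

def compress_full_range(pixels, display_max=255.0):
    """Related-art tone mapping: linearly scale the ENTIRE dynamic range of
    the HDR image into the displayable range [0, display_max]."""
    peak = max(pixels)
    return [p * display_max / peak for p in pixels]

# A mostly mid-tone scene (around 100 nits) with a few bright highlights.
hdr = [100.0] * 9990 + [4000.0] * 10
ldr = compress_full_range(hdr)

print(round(mean(hdr), 1))  # 103.9  (the original is reasonably bright)
print(round(mean(ldr), 2))  # 6.62   (the output is drastically darkened)
print(max(ldr))             # 255.0  (only the highlights use the full range)
```

This is exactly the problem the embodiments address: the lightness of the whole image is sacrificed to fit a handful of high-brightness pixels.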
Summary of the invention
Technical problem
Therefore, it is an aspect of the disclosure to provide a display device and a control method thereof by which an image displayed on the display device can maintain the lightness of the original image unchanged while still exhibiting the image information included in high-brightness regions.
Technical solution
According to an aspect of the disclosed embodiments, there is provided a display device including: a content receiving unit configured to receive a high dynamic range image and luminance information of the high dynamic range image; an image processing unit configured to perform tone mapping based on the luminance information so that the high dynamic range image is converted into a low dynamic range image; and a display unit configured to display the low dynamic range image, wherein the luminance information includes a maximum brightness value and a minimum brightness value of the high dynamic range image.
The luminance information may include the maximum brightness value and the minimum brightness value of the high dynamic range images included in a scene.
The luminance information may include the maximum brightness value and the minimum brightness value of a high dynamic range image forming a frame.
The luminance information may include the maximum brightness value and the minimum brightness value of the high dynamic range images included in the entire content.
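The luminance metadata described above — maximum and minimum brightness, reported per frame, per scene, or for the entire content — can be modeled as a small record accompanying the image. The class and function names below are hypothetical, chosen only to illustrate the three granularities.

```python
from dataclasses import dataclass

@dataclass
class LuminanceInfo:
    """Luminance metadata delivered alongside HDR content."""
    max_brightness: float
    min_brightness: float
    scope: str  # "frame", "scene", or "content"

def frame_luminance_info(frame):
    """Per-frame metadata: extremes of one frame's pixel brightness values."""
    return LuminanceInfo(max(frame), min(frame), scope="frame")

def scene_luminance_info(frames):
    """Scene-level metadata: extremes over every frame in the scene."""
    return LuminanceInfo(max(max(f) for f in frames),
                         min(min(f) for f in frames), scope="scene")

info = scene_luminance_info([[0.5, 120.0], [0.1, 4000.0]])
print(info.max_brightness, info.min_brightness)  # 4000.0 0.1
```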
The image processing unit may detect, in the high dynamic range image, a first area whose brightness value is equal to or greater than a reference brightness value, and may perform tone mapping on the image of the first area based on characteristic information of the image of the first area; the characteristic information may include at least one of edge information, texture information, and gradation information of the high dynamic range image.
The image processing unit may detect an edge region in the image of the first area, and may generate a first mapping function based on a histogram of the pixels included in the edge region.
The first mapping function may have a slope that changes according to the number of pixels included in the edge region.
In the first mapping function, the slope at a brightness value at which the edge region includes many pixels may be greater than the slope at a brightness value at which the edge region includes few pixels.
The first mapping function may be a cumulative histogram obtained by integrating the histogram of the pixels included in the edge region.
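A cumulative histogram used as a mapping function behaves exactly as the two preceding paragraphs describe: brightness bands holding many edge pixels accumulate quickly, so the function is steep there and allocates more output levels to those bands. The sketch below, under assumed parameter names and a simple binning scheme, illustrates the idea; it is not the disclosed implementation.

```python
def first_mapping_function(edge_brightness, ref, hdr_max, out_lo, out_hi, bins=5):
    """Build a 'first mapping function' as a normalized cumulative histogram of
    the edge-region pixel brightness values over [ref, hdr_max]. Bands with
    many edge pixels get a steeper slope, preserving detail there."""
    width = (hdr_max - ref) / bins
    hist = [0] * bins
    for b in edge_brightness:
        hist[min(int((b - ref) / width), bins - 1)] += 1
    total, cdf, running = sum(hist), [], 0
    for h in hist:
        running += h
        cdf.append(running / total)  # monotone, ends at 1.0

    def mapping(brightness):
        """Map a brightness in [ref, hdr_max] to an output level."""
        i = min(int((brightness - ref) / width), bins - 1)
        return out_lo + (out_hi - out_lo) * cdf[i]
    return mapping

# Edge pixels cluster near 600: the function rises fast across that band.
m = first_mapping_function([600.0] * 90 + [900.0] * 10,
                           ref=500.0, hdr_max=1000.0, out_lo=200.0, out_hi=255.0)
print(m(550.0))            # 200.0 (no edge pixels in the lowest band)
print(round(m(650.0), 1))  # 249.5 (most of the output span spent on the 600 band)
print(m(950.0))            # 255.0
```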
The image processing unit may detect a texture region in the image of the first area, and may generate the first mapping function based on a histogram of the pixels included in the texture region.
The image processing unit may detect a gradation region in the image of the first area, and may generate the first mapping function based on a histogram of the pixels included in the gradation region.
The image processing unit may generate a second mapping function based on the brightness values of the high dynamic range image.
The image processing unit may perform second tone mapping on the high dynamic range image according to the second mapping function, and may perform first tone mapping according to the first mapping function on the image on which the second tone mapping has been performed.
The image processing unit may generate the second mapping function based on the brightness values of a second area in the high dynamic range image, wherein the brightness values of the second area are less than the reference brightness value.
The image processing unit may generate a tone mapping function based on the first mapping function and the second mapping function, and may convert the high dynamic range image into the low dynamic range image according to the tone mapping function. The image processing unit may perform linear tone mapping on first pixels, among the plurality of pixels included in the high dynamic range image, whose brightness values are less than the reference brightness value, and may perform nonlinear tone mapping on second pixels, among the plurality of pixels, whose brightness values are equal to or greater than the reference brightness value.
When the scene average brightness value of the high dynamic range images included in a scene is less than the reference brightness value, the image processing unit may perform linear tone mapping on first pixels, among the plurality of pixels included in the high dynamic range image, whose brightness values are less than the reference brightness value, and may perform nonlinear tone mapping on second pixels, among the plurality of pixels, whose brightness values are equal to or greater than the reference brightness value.
When the scene average brightness value of the high dynamic range images included in a scene is equal to or greater than the reference brightness value, the image processing unit may perform linear tone mapping on first pixels, among the plurality of pixels included in the high dynamic range image, whose brightness values are less than the scene average brightness value, and may perform nonlinear tone mapping on second pixels, among the plurality of pixels, whose brightness values are equal to or greater than the scene average brightness value.
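The piecewise behavior described in the last three paragraphs can be sketched as a single knee function: the threshold is the reference brightness value unless the scene average brightness exceeds it, the segment below the threshold is linear, and the segment above it is compressed nonlinearly into the remaining display range. The square-root compression curve and the 0.9 headroom factor below are arbitrary illustrative choices, not values from the disclosure.

```python
def tone_map(brightness, ref, scene_avg, hdr_max, display_max):
    """Piecewise tone mapping: linear below the knee, nonlinear above it."""
    # Threshold selection per the description: the reference brightness is
    # used unless the scene average brightness is equal to or greater.
    knee = ref if scene_avg < ref else scene_avg
    knee_out = min(knee, 0.9 * display_max)  # assumed output level at the knee
    if brightness < knee:
        # Linear segment: the darker region keeps its relative lightness.
        return brightness * knee_out / knee
    # Nonlinear segment: compress [knee, hdr_max] into [knee_out, display_max].
    t = (brightness - knee) / (hdr_max - knee)
    return knee_out + (display_max - knee_out) * t ** 0.5

# ref=250, scene average 100 (dark scene), HDR peak 4000, 8-bit display:
print(tone_map(125.0, 250.0, 100.0, 4000.0, 255.0))   # 114.75 (linear part)
print(tone_map(4000.0, 250.0, 100.0, 4000.0, 255.0))  # 255.0  (peak maps to peak)
```

Because the bright segment is concave, it still distinguishes brightness levels near the knee while squeezing the extreme highlights, which is how detail in the high-brightness region can survive the conversion.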
According to another aspect of the disclosed embodiments, there is provided a method of controlling a display device, the method including: receiving a high dynamic range image and luminance information of the high dynamic range image; performing tone mapping based on the luminance information so that the high dynamic range image is converted into a low dynamic range image; and displaying the low dynamic range image, wherein the luminance information includes a maximum brightness value and a minimum brightness value of the high dynamic range image.
The luminance information may include the maximum brightness value and the minimum brightness value of the high dynamic range images included in a scene.
The luminance information may include the maximum brightness value and the minimum brightness value of a high dynamic range image forming a frame.
The luminance information may include the maximum brightness value and the minimum brightness value of the high dynamic range images included in the entire content.
Performing the tone mapping may include: detecting, in the high dynamic range image, a first area whose brightness value is equal to or greater than a reference brightness value, and generating a tone mapping function based on characteristic information of the image of the first area; and performing tone mapping on the high dynamic range image according to the tone mapping function so as to convert the high dynamic range image into the low dynamic range image. The characteristic information may include at least one of edge information, texture information, and gradation information of the high dynamic range image.
According to an aspect of the disclosed embodiments, there are provided a display device and a control method thereof that use different tone mapping functions for a high-brightness region and a low-brightness region; therefore, an image displayed on the display device can maintain the lightness of the original image while exhibiting the image information included in the high-brightness region.
According to another aspect of the disclosed embodiments, there is provided a method of controlling a display device, the method including: determining a first area of an image, the first area having a higher brightness than a second area of the image; determining a first mapping function corresponding to the first area and a second mapping function corresponding to the second area, wherein the first mapping function enhances one or more image features and the second mapping function increases lightness; and, in response to the brightness, mapping the image using the first mapping function and the second mapping function so as to keep the lightness of the image of the second area and the characteristic information of the image of the first area.
Advantageous effects of the invention
There are provided a display device and a control method thereof by which an image displayed on the display device can maintain the lightness of the original image unchanged while exhibiting the image information included in high-brightness regions.
Brief description of the drawings
These and/or other aspects of the disclosure will become apparent and more readily appreciated from the following description of embodiments, taken in conjunction with the accompanying drawings, in which:
Fig. 1 shows the exterior of a display device according to an embodiment;
Fig. 2 shows a control configuration of the display device according to an embodiment;
Fig. 3 shows an example image processing unit included in the display device according to an embodiment;
Fig. 4 shows an example operation of linearizing image data performed by the image processing unit included in the display device according to an embodiment;
Fig. 5 shows an example original image;
Fig. 6 shows a brightness histogram of the original image shown in Fig. 5;
Fig. 7 shows an example of dividing the original image shown in Fig. 5 according to the brightness values of the pixels;
Fig. 8 shows an example image obtained by dividing the original image shown in Fig. 5 according to brightness value;
Fig. 9 shows another example image obtained by dividing the original image shown in Fig. 5 according to brightness value;
Fig. 10 shows an example in which the image processing unit shown in Fig. 3 extracts feature points from a first area;
Fig. 11 shows an example in which the image processing unit shown in Fig. 3 generates a first mapping function based on the feature points of the first area;
Fig. 12 shows another example in which the image processing unit shown in Fig. 3 generates the first mapping function based on the feature points of the first area;
Fig. 13 shows an example in which the image processing unit shown in Fig. 3 generates a second mapping function based on the image of a second area;
Fig. 14 shows an example tone mapping function generated by the image processing unit shown in Fig. 3;
Fig. 15 shows a result obtained when a display device of the related art performs tone mapping on a high dynamic range image;
Fig. 16 shows a result obtained when the display device according to an embodiment performs tone mapping on a high dynamic range image;
Fig. 17 shows an example high dynamic range image display operation of the display device according to an embodiment;
Fig. 18 shows another example image processing unit included in the display device according to an embodiment;
Figs. 19 and 20 show examples in which the image processing unit shown in Fig. 18 generates a tone mapping function;
Fig. 21 shows another example high dynamic range image display operation of the display device according to an embodiment;
Fig. 22 shows another example image processing unit included in the display device according to an embodiment;
Fig. 23 shows an example in which the image processing unit shown in Fig. 22 performs tone mapping on the image of the first area;
Fig. 24 shows an example in which the image processing unit shown in Fig. 22 performs tone mapping on the image of the second area;
Fig. 25 shows another example high dynamic range image display operation of the display device according to an embodiment;
Fig. 26 shows another example image processing unit included in the display device according to an embodiment;
Fig. 27 shows a third mapping function generated by the image processing unit shown in Fig. 26;
Fig. 28 shows another example high dynamic range image display operation of the display device according to an embodiment;
Fig. 29 shows another example image processing unit included in the display device according to an embodiment;
Figs. 30 and 31 show a fourth mapping function generated by the image processing unit shown in Fig. 29; and
Fig. 32 shows another example high dynamic range image display operation of the display device according to an embodiment.
Detailed description of the embodiments
Reference will now be made in detail to the embodiments, examples of which are illustrated in the accompanying drawings, wherein like reference numerals refer to like elements throughout. The embodiments are described below with reference to the drawings.
The configurations shown in the embodiments and in the drawings described in this specification are only exemplary examples of the disclosed embodiments. The embodiments cover various modifications that could replace the embodiments and drawings herein at the time of filing this application.
Hereinafter, the embodiments will be described in detail with reference to the accompanying drawings.
Fig. 1 shows the exterior of a display device according to an embodiment, and Fig. 2 shows a control configuration of the display device according to an embodiment.
The display device 100 is a device capable of processing an image signal received from the outside and visually displaying the processed image. Hereinafter, the case in which the display device 100 is a television (TV) will be described, but the embodiment is not limited thereto. For example, the display device 100 may be implemented in various types, such as a monitoring device, a mobile multimedia device, or a mobile communication device. The type of the display device 100 is not limited as long as it visually displays an image.
As shown in Figs. 1 and 2, the display device 100 includes a main body 101, which forms the exterior of the display device 100 and accommodates the respective components of the display device 100.
A stand 102 supporting the main body 101 may be provided below the main body 101. The main body 101 may be stably placed on a flat surface by means of the stand 102. However, the embodiment is not limited thereto; the main body 101 may instead be mounted on a vertical surface, such as a wall surface, by a bracket or the like.
A button group 121 configured to receive user control commands from a user and a display panel 143 configured to display an image according to the user control commands may be provided at the front of the main body 101.
In addition, various components configured to realize the functions of the display device 100 may be provided in the main body 101, including the control configuration shown in Fig. 2.
Specifically, the display device 100 includes an input unit 120 configured to receive user control commands from the user; a content receiving unit 130 configured to receive content including images and sound from external devices; an image processing unit 200 configured to process the image data included in the content; a display unit 140 configured to display an image corresponding to the image data included in the content; a sound output unit 150 configured to output sound corresponding to the sound data included in the content; and a main control unit 110 configured to control the overall operation of the display device 100.
The input unit 120 may include the button group 121, which is configured to receive various user control commands from the user. For example, the button group 121 may include a volume button for adjusting the volume of the sound output from the sound output unit 150, a channel button for changing the channel through which the content receiving unit 130 receives content, and a power button for turning the power of the display device 100 on and off.
The various buttons included in the button group 121 may be push switches or membrane switches, which detect the user's pressing, or touch switches, which detect contact by a part of the user's body. However, the embodiment is not limited thereto; the button group 121 may use various input methods capable of outputting an electrical signal corresponding to a specific operation of the user.
In addition, the input unit 120 may include a remote controller that receives user control commands from the user at a distance and transmits the received user control commands to the display device 100.
The content receiving unit 130 may receive various kinds of content from various external devices.
For example, the content receiving unit 130 may receive content from an antenna configured to wirelessly receive broadcast signals; from a set-top box configured to receive broadcast signals in a wired or wireless manner and to appropriately convert the received broadcast signals; from a multimedia playback device (for example, a DVD player, a CD player, or a Blu-ray player) configured to play content stored in a multimedia storage medium; and the like.
Specifically, the content receiving unit 130 may include a plurality of connectors 131 connected to the external devices; a receiving path selecting unit 133 configured to select, among the plurality of connectors 131, the path through which content is received; and a tuner 135 configured to select the channel (or frequency) through which a broadcast signal is received when broadcast signals and the like are received.
The connectors 131 may include an RF coaxial cable connector configured to receive broadcast signals including content from an antenna; a High-Definition Multimedia Interface (HDMI) connector configured to receive content from a set-top box or a multimedia playback device; a component video connector; a composite video connector; a D-sub connector; and the like.
The receiving path selecting unit 133 selects, among the plurality of connectors 131 described above, the connector through which content is received. For example, the receiving path selecting unit 133 may automatically select the connector 131 through which content has been received, or may manually select the connector 131 through which content is to be received, according to a user control command.
When broadcast signals are received through an antenna or the like, the tuner 135 extracts a transmission signal of a specific frequency (channel) from the various received signals. In other words, the tuner 135 may select the channel (or frequency) through which content is received according to the channel selection command of the user.
The image processing unit 200 processes the image content received by the content receiving unit 130 and supplies the processed image data to the display unit 140.
The image processing unit 200 may be a computer and may include a graphics processor 201 and a graphics memory 203.
The graphics memory 203 may store an image processing program and image processing data for image processing, or may temporarily store image data output from the graphics processor 201 or image data received from the content receiving unit 130.
The graphics memory may include volatile memory, such as SRAM or DRAM, and nonvolatile memory, such as flash memory, read-only memory (ROM), erasable programmable read-only memory (EPROM), or electrically erasable programmable read-only memory (EEPROM).
For example, the nonvolatile memory may semi-permanently store the image processing program and image processing data for image processing. The volatile memory may temporarily store the image processing program and image processing data loaded from the nonvolatile memory, the image data received from the content receiving unit 130, or the image data output from the graphics processor 201.
In addition, the nonvolatile memory may be provided separately from the volatile memory and may form an auxiliary storage device for the volatile memory.
The graphics processor 201 may process the image data stored in the graphics memory 203 according to the image processing program stored in the graphics memory 203. For example, the graphics processor 201 may perform image processing such as the linearization and tone mapping described below.
Although the graphics processor 201 and the graphics memory 203 have been described separately above, the embodiment is not limited to the case in which the graphics processor 201 and the graphics memory 203 are provided as separate chips; the graphics processor 201 and the graphics memory 203 may be provided as a single chip.
The detailed operation of the image processing unit 200 will be described below.
The display unit 140 may include a display panel 143 configured to visually display an image and a display driver 141 configured to drive the display panel 143.
The display panel 143 may output an image according to the image data received from the display driver 141.
The display panel 143 may include pixels, which are the units of image display. Each pixel may receive an electrical signal representing image data and output an optical signal corresponding to the received electrical signal.
In this way, the optical signals output from the plurality of pixels included in the display panel 143 are combined, so that one image is displayed on the display panel 143.
In addition, the display panel 143 may be classified into several types according to the method by which each pixel outputs an optical signal. For example, the display panel 143 may be classified as a self-emissive display in which the pixels themselves emit light, a transmissive display configured to block or transmit light emitted from a backlight or the like, or a reflective display configured to reflect or absorb light incident from an external light source.
The display panel 143 may employ a cathode ray tube (CRT) display, a liquid crystal display (LCD) panel, a light emitting diode (LED) panel, an organic light emitting diode (OLED) panel, a plasma display panel (PDP), a field emission display (FED) panel, or the like. However, the display panel 143 is not limited thereto, and the display panel 143 may employ various display methods capable of visually displaying an image corresponding to the image data.
The display driver 141 receives image data from the graphics processing unit 200 according to a control signal of the main control unit 110, and drives the display panel 143 to display an image corresponding to the received image data.
Specifically, the display driver 141 transmits an electrical signal corresponding to the image data to each of the plurality of pixels of the display panel 143.
In order to transmit the electrical signals to all the pixels of the display panel 143 within a short time, the display driver 141 may use various methods of transmitting the electrical signal to each of the pixels.
For example, according to an interlaced scanning method, the display driver 141 alternately transmits the electrical signals to the pixels of odd-numbered rows and the pixels of even-numbered rows among the plurality of pixels.
In addition, according to a progressive scanning method in which scanning is performed row by row, the display driver 141 sequentially transmits the electrical signals to the plurality of pixels on a row-by-row basis.
In this way, when the display driver 141 transmits the electrical signal corresponding to the image data to each of the pixels of the display panel 143, each of the pixels outputs an optical signal corresponding to the received electrical signal, and the optical signals output from the pixels are combined so that one image is displayed on the display panel 143.
The sound output unit 150 may output, according to a control signal of the main control unit 110, sound corresponding to the audio data included in the content received by the content receiving unit 130. The sound output unit 150 may include at least one speaker 151, the speaker 151 being configured to convert an electrical signal into an acoustic signal.
The main control unit 110 may include a main processor 111 and a main memory 113.
The main memory 113 may store a control program and control data for controlling the operation of the display device 100, and may temporarily store a user control command received through the input unit 120 or a control signal output from the main processor 111.
The main memory 113 may include a volatile memory such as SRAM or DRAM, and a nonvolatile memory such as flash memory, read-only memory (ROM), erasable programmable read-only memory (EPROM), or electrically erasable programmable read-only memory (EEPROM).
For example, the nonvolatile memory may semi-permanently store the control program and control data for controlling the display device 100. The volatile memory may temporarily store the control program and control data loaded from the nonvolatile memory, a user control command received through the input unit 120, or a control signal output from the main processor 111.
In addition, the nonvolatile memory may be provided separately from the volatile memory and serve as an auxiliary storage device for the volatile memory.
The main processor 111 may process various types of data stored in the main memory 113 according to the control program stored in the main memory 113.
For example, the main processor 111 may process a user control command input through the input unit 120, generate, according to the user control command, a channel selection signal for selecting a path through which the content receiving unit 130 receives content, and generate, according to the user control command, a volume control signal for adjusting the volume of the sound output from the sound output unit.
Although the main processor 111 and the main memory 113 have been described separately above, the present embodiment is not limited to the case in which the main processor 111 and the main memory 113 are provided as separate chips. The main processor 111 and the main memory 113 may be provided as a single chip.
The main control unit 110 may control the operations of the various components included in the display device 100 according to a user control command. Specifically, the main control unit 110 may control the graphics processing unit 200 to perform image processing on the image data received through the content receiving unit 130, and may control the display unit 140 to display the processed image data.
The configuration of the graphics processing unit 200 will be described below.
Fig. 3 shows an exemplary graphics processing unit included in the display device according to an embodiment.
As described above, the graphics processing unit 200 includes the graphics processor 201 and the graphics memory 203 as hardware components.
In addition, the graphics processing unit 200 may include various image processing modules as software components. Specifically, the graphics processor 201 may perform various image processing operations according to the image processing program and the image processing data stored in the graphics memory 203. When the graphics processing unit 200 is divided according to the image processing operations performed by the graphics processor 201, the graphics processing unit 200 may include various image processing modules as shown in Fig. 3.
As shown in Fig. 3, the graphics processing unit 200 may include: an image receiving module 205 configured to receive image data ID and metadata MD; a linearization module 210 configured to linearize the image data; a region division module 220 configured to divide the image based on luminance; a first mapping function generation module 231 configured to generate a tone mapping function for a first region having high luminance; a second mapping function generation module 232 configured to generate a tone mapping function for a second region having low luminance; a tone mapping module 240 configured to perform tone mapping; and a detail enhancement module 250 configured to perform a post-processing operation on the tone-mapped image.
The image receiving module 205 receives the content C from the content receiving unit 130 and outputs the image data ID and the metadata MD, where the image data ID is included in the received content C and the metadata MD is related to the image data ID. Here, the metadata MD may include information about the image data ID.
The linearization module 210 linearizes the image data ID and outputs a linearized original image I1.
The region division module 220 receives the original image I1 from the linearization module 210, divides the received original image I1 into a first region R1 and a second region R2, and outputs the image of the first region R1 and the image of the second region R2.
The first mapping function generation module 231 receives the image of the first region R1 from the region division module 220, and generates and outputs a first mapping function MF1 based on the image of the first region R1.
The second mapping function generation module 232 receives the image of the second region R2 from the region division module 220, and generates and outputs a second mapping function MF2 based on the image of the second region R2.
The tone mapping module 240 receives the first mapping function MF1 and the second mapping function MF2 from the first mapping function generation module 231 and the second mapping function generation module 232, respectively, and generates a tone mapping function TMF based on the first mapping function MF1 and the second mapping function MF2.
In addition, the tone mapping module 240 performs tone mapping on the original image I1 according to the generated tone mapping function TMF and outputs a first image I2.
The detail enhancement module 250 receives the first image I2 from the tone mapping module 240, performs a detail enhancement operation on the received first image I2, and outputs a detail-enhanced second image I3.
In short, the graphics processing unit 200 receives high dynamic range image data from the content receiving unit 130, generates a low dynamic range display image from the received high dynamic range image data, and transmits the generated display image to the display unit 140.
The operation of each module included in the graphics processing unit 200 will be described below.
First, the image receiving module 205 will be described.
The image receiving module 205 extracts the image data ID and the metadata MD from the content C received by the content receiving unit 130.
The content C includes the image data ID representing the original image and the metadata MD related to the image data ID.
The metadata MD may include luminance information of the image data ID. When the content C is, for example, a video, the metadata MD may include luminance information of the entire content C, luminance information of each scene included in the content C, luminance information of each frame included in the content C, and the like. Here, a frame refers to a single still image forming the video. A scene refers to a bundle of a series of frames representing a single situation against a single background. In other words, a scene can be regarded as a bundle of consecutive frames in which the image does not change significantly.
Specifically, the metadata MD may include the maximum luminance value, the minimum luminance value, and the average luminance value of the plurality of images included in the content C. In addition, the metadata MD may include the maximum luminance value, the minimum luminance value, and the average luminance value of the images included in each scene. Furthermore, the metadata MD may include the maximum luminance value, the minimum luminance value, and the average luminance value of the image forming each frame.
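As an illustration only, the per-scene (or per-frame) luminance metadata described above could be modeled as a simple record; the patent does not specify a data layout, and the field names here are assumptions:

```python
from dataclasses import dataclass


@dataclass
class SceneLuminanceMetadata:
    """Illustrative per-scene entry of metadata MD (field names assumed)."""
    max_luminance: float  # maximum luminance value of the scene, in nits
    min_luminance: float  # minimum luminance value of the scene, in nits
    avg_luminance: float  # average luminance value of the scene, in nits


# One hypothetical entry for a bright outdoor scene.
scene_md = SceneLuminanceMetadata(max_luminance=4000.0,
                                  min_luminance=0.01,
                                  avg_luminance=120.0)
```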
In this way, in addition to the image data ID, the image receiving module 205 may extract the luminance information of each scene or the luminance information of each frame from the content C.
Next, the linearization module 210 will be described.
Fig. 4 shows an exemplary operation in which the graphics processing unit included in the display device according to an embodiment linearizes image data. In addition, Fig. 5 shows an exemplary original image, and Fig. 6 shows the luminance histogram of the original image shown in Fig. 5.
As shown in Fig. 4, the linearization module 210 linearizes the image data ID received from the image receiving module 205, and calculates the luminance value of each pixel included in the linearized image.
For various reasons, the image data received by the content receiving unit 130 may differ from the real image. For example, there may be a difference between the image of the actual imaging target captured by an image sensor, which is configured to acquire an image of the imaging target, and the image according to the image data. In addition, during the process of compressing or encoding the image in order to transmit or store the image data, a difference may arise between the originally transmitted image and the image according to the image data.
Specifically, since a high dynamic range image contains a large amount of information, the image must be compressed or encoded in order to transmit it over a communication network or store it on a storage medium.
An original image whose maximum luminance is L1max and whose minimum luminance is L1min can be converted into image data whose representable dynamic range is N1 (N0 to N1) (in this case, it is assumed that the difference between L1max and L1min is greater than N1). For example, an original image in which the difference between the maximum luminance L1max and the minimum luminance L1min is 10,000 nits can be compressed into image data whose representable range is 2,000 nits.
When the dynamic range of an image is reduced, the size of the image data decreases. However, there is a concern that some of the information contained in the original image will be lost. Therefore, when the dynamic range L1 of the original image is greater than the dynamic range N1 of the image data, the first nonlinear mapping function F1 shown in Fig. 4A may be used in order to minimize the information lost during the encoding or compression process.
When the first nonlinear mapping function F1 is used, regions of the original image containing a large amount of information and regions containing a small amount of information are compressed with different compression ratios. In other words, in a region containing a large amount of information, the image is compressed with a low compression ratio, while in a region containing a small amount of information, the image is compressed with a high compression ratio. Therefore, the compression efficiency can be improved, and the image data can contain a larger amount of information.
The image data ID included in the content C received by the content receiving unit 130 may be image data nonlinearly compressed by the first nonlinear mapping function F1 shown in Fig. 4A.
The linearization module 210 linearizes the image data that has been nonlinearly compressed in this way.
Specifically, the linearization module 210 may linearize the nonlinearly compressed image data using the second nonlinear mapping function F2 shown in Fig. 4B and the luminance information of the original image I1. The luminance information of the original image I1 may be received from the image receiving module 205 as described above, and may include the maximum and minimum luminance values on a per-scene basis, or the maximum and minimum luminance values on a per-frame basis.
Here, the inverse function of the first nonlinear mapping function F1, which was used to compress the original image into the image data, may be used as the second nonlinear mapping function F2.
The first nonlinear mapping function F1 used to compress the original image into the image data corresponds to a function known from international standards and the like. Therefore, the linearization module 210 can generate the second nonlinear mapping function F2 based on the first nonlinear mapping function F1. In addition, the second nonlinear mapping function F2 may be stored in advance in the graphics memory 203.
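The F1/F2 pair can be illustrated with an assumed pure power-law (gamma-style) curve; real HDR content typically uses a standardized transfer function such as the SMPTE ST 2084 (PQ) curve, so this is only a sketch of the compress/linearize round trip under that assumption:

```python
def compress(luminance: float, l1max: float, gamma: float = 2.2) -> float:
    """Illustrative first nonlinear mapping function F1: maps an original
    luminance in [0, l1max] to a nonlinear code value in [0, 1]."""
    return (luminance / l1max) ** (1.0 / gamma)


def linearize(code: float, l1max: float, gamma: float = 2.2) -> float:
    """Illustrative second nonlinear mapping function F2 = F1^-1: restores
    the original luminance from a code value, given the scene/frame maximum
    luminance l1max (taken from the metadata MD)."""
    return l1max * code ** gamma
```

Because F2 is the exact inverse of F1, a compress-then-linearize round trip recovers the original luminance (up to floating-point error), which is the property the linearization module 210 relies on.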
The linearization module 210 can restore the image data ID received from the content receiving unit 130 to the original image.
For example, the restored original image may be the original image I1 shown in Fig. 5.
In addition, the linearization module 210 may analyze the luminance of the original image I1.
Specifically, the linearization module 210 may obtain the maximum luminance value L1max, the minimum luminance value L1min, and the average luminance value of the original image I1.
The linearization module 210 may use various methods to obtain the maximum luminance value L1max, the minimum luminance value L1min, and the average luminance value of the original image I1.
As described above, the maximum luminance value L1max, the minimum luminance value L1min, and the average luminance value of the original image I1 may be received from an external device in the form of the metadata MD of the image data ID.
In this case, the maximum luminance value L1max, the minimum luminance value L1min, and the average luminance value may be provided on a per-content basis, on a per-frame basis, or on a per-scene basis. When the values are provided on a per-scene basis, the linearization module 210 may refer to the maximum luminance value L1max, the minimum luminance value L1min, and the average luminance value of the previous frame.
When the metadata MD of the received content C does not include the maximum luminance value L1max, the minimum luminance value L1min, or the average luminance value, the linearization module 210 may directly calculate the maximum luminance value L1max, the minimum luminance value L1min, and the average luminance value from the linearized original image.
The linearization module 210 may use Equation 1 to calculate the luminance value of each pixel included in the original image I1. Here, each pixel of the linearized original image includes a red value, a green value, and a blue value.
[Equation 1]
L = 0.2126R + 0.7152G + 0.0722B
(where L represents the luminance value of the pixel, R represents the red value of the pixel, G represents the green value of the pixel, and B represents the blue value of the pixel.)
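Equation 1 uses the ITU-R BT.709 luma weights; a minimal sketch, assuming floating-point channel values in [0, 1], with illustrative function names:

```python
def pixel_luminance(r: float, g: float, b: float) -> float:
    """Luminance of one linearized pixel, per Equation 1:
    L = 0.2126R + 0.7152G + 0.0722B."""
    return 0.2126 * r + 0.7152 * g + 0.0722 * b


def image_luminance(pixels):
    """Luminance value of every (R, G, B) pixel in a linearized image,
    given as a flat list of channel triples."""
    return [pixel_luminance(r, g, b) for r, g, b in pixels]
```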
The luminance value of each pixel included in the original image can be expressed as a luminance histogram. Here, the luminance histogram G1 of the original image I1 represents the frequency distribution of the pixels according to luminance value. That is, the X axis of the luminance histogram G1 represents the luminance value, and the Y axis represents the number of pixels corresponding to each luminance value.
For example, the linearization module 210 may express the original image I1 shown in Fig. 5 as the luminance histogram G1 shown in Fig. 6. As shown in Fig. 6, in the example of the original image I1 shown in Fig. 5, the number of pixels with the minimum luminance is the largest, and the number of pixels decreases as the luminance increases.
The luminance histogram G1 has been described above to aid understanding, but it is merely an example for facilitating understanding. The graphics processing unit 200 does not necessarily generate the luminance histogram G1; the graphics processing unit 200 may generate the luminance histogram G1 as needed.
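The luminance histogram G1 described above can be sketched as a simple binned frequency count; the bin count and luminance range are assumptions for illustration:

```python
def luminance_histogram(luminances, bins=256, l_min=0.0, l_max=1.0):
    """Frequency distribution of pixels per luminance bin (histogram G1).
    X axis: bin index (luminance); Y axis: pixel count per bin."""
    hist = [0] * bins
    width = (l_max - l_min) / bins
    for lum in luminances:
        # Clamp the top edge so lum == l_max falls in the last bin.
        idx = min(int((lum - l_min) / width), bins - 1)
        hist[idx] += 1
    return hist
```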
Next, the region division module 220 will be described.
Fig. 7 shows an example in which the original image shown in Fig. 5 is divided according to the luminance values of the pixels. Fig. 8 shows an exemplary image obtained by dividing the original image shown in Fig. 5 according to luminance value. In addition, Fig. 9 shows another exemplary image obtained by dividing the original image shown in Fig. 5 according to luminance value.
As shown in Figs. 7, 8, and 9, the region division module 220 divides the original image into a first region R1 and a second region R2 based on a first reference luminance value m applied to the luminance of the plurality of pixels. Specifically, the region division module 220 may divide the original image into a first region including the pixels whose luminance is equal to or greater than the first reference luminance value m and a second region including the pixels whose luminance is less than the first reference luminance value m.
Referring to the luminance histogram G1 shown in Fig. 6, the plurality of pixels of the original image I1 may be divided, based on the first reference luminance value m shown in Fig. 7, into the pixels included in the first region R1 and the pixels included in the second region R2.
Here, the first reference luminance value m may be set to the maximum luminance value that the display device 100 can output, or to a luminance value less than that maximum luminance value.
In addition, the first reference luminance value m may be set by the user, or may be a predetermined value.
As shown in Fig. 8, when the original image I1 shown in Fig. 5 is divided into the first region R1 and the second region R2 according to the first reference luminance value m, the image can be divided into the first region R1, in which the luminance values of the pixels are equal to or greater than the first reference luminance value m, and the second region R2, in which the luminance values of the pixels are less than the first reference luminance value m.
In addition, the region division module 220 may set, as the first region R1, the pixels whose luminance values are equal to or greater than the first reference luminance value m together with the pixels located near those pixels. This is because the continuity of the image needs to be maintained after tone mapping has been performed.
For example, as shown in Fig. 9A, the region division module 220 may divide the original image I1 into a plurality of regions.
Then, the region division module 220 may set the regions so that the regions including pixels whose luminance values are equal to or greater than the first reference luminance value m belong to the first region R1, and the divided regions that do not include any pixel whose luminance value is equal to or greater than the first reference luminance value m belong to the second region R2.
As shown in Fig. 9B, when the original image I1 shown in Fig. 5 is divided into the first region R1 and the second region R2 in this way, the region division module 220 sets, as the first region R1, the pixels whose luminance values are equal to or greater than the first reference luminance value m and the pixels located near those pixels.
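The division into R1 and R2, including pulling nearby pixels into R1 to preserve continuity, can be sketched as a threshold followed by a small dilation; the neighborhood radius is an assumption:

```python
def divide_regions(luminance, m, dilate=1):
    """Split a 2-D luminance map into a first-region mask R1 (pixels with
    luminance >= m, plus their neighbors) per the division described above.
    `dilate` is the assumed radius for 'nearby' pixels; the second region
    R2 is simply the complement of the returned mask."""
    h, w = len(luminance), len(luminance[0])
    first = [[False] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            if luminance[y][x] >= m:
                # Mark the bright pixel and its neighborhood as R1.
                for dy in range(-dilate, dilate + 1):
                    for dx in range(-dilate, dilate + 1):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < h and 0 <= nx < w:
                            first[ny][nx] = True
    return first
```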
Next, the first mapping function generation module 231 will be described.
Fig. 10 shows an example in which the graphics processing unit shown in Fig. 3 extracts feature points from the first region. In addition, Fig. 11 shows an example in which the graphics processing unit shown in Fig. 3 generates the first mapping function based on the feature points of the first region. Fig. 12 shows another example in which the graphics processing unit shown in Fig. 3 generates the first mapping function according to the feature points of the first region.
As shown in Figs. 10, 11, and 12, the first mapping function generation module 231 generates the first mapping function MF1 based on the image of the first region R1.
Here, the first mapping function MF1 refers to a parametric function that converts the image of the first region R1 of the original image I1, which is a high dynamic range image, into a low dynamic range image. In other words, the image of the first region R1 is converted into a low dynamic range image by the first mapping function MF1.
Specifically, the first mapping function MF1 converts a high dynamic range image whose luminance values range between the first reference luminance value m and the maximum original luminance value L1max into a low dynamic range image whose luminance values range between the second reference luminance value n and the maximum display luminance value L2max. Here, the second reference luminance value n may be set by the user, or may be suitably set in advance by the designer of the display device 100.
The first mapping function generation module 231 extracts the pixels containing feature information, and generates the first mapping function MF1 based on the histogram of the extracted pixels. Here, the feature information may include the edge information, the texture information, and the gradation information of the image included in the first region R1.
The first mapping function generation module 231 may generate a first mapping function MF1 for intuitively displaying the edges of the first region R1, intuitively displaying the texture of the image, or intuitively displaying the gradation of the image.
For example, in order to intuitively display an edge region, the first mapping function generation module 231 may extract, from the pixels included in the first region R1, the pixels whose luminance values differ from the luminance values of adjacent pixels by an amount equal to or greater than a reference value. In other words, the first mapping function generation module 231 may extract, from the image of the first region R1 shown in Fig. 10A, the pixels FP shown in Fig. 10B, that is, the pixels whose luminance values differ from those of adjacent pixels by an amount equal to or greater than the reference value. Here, a pixel whose luminance value differs from that of an adjacent pixel by an amount equal to or greater than the reference value can be determined to be located at an edge region of the image.
In addition, the first mapping function generation module 231 may calculate the frequency distribution of the pixels located at edge regions (the pixels whose luminance values differ from those of adjacent pixels by an amount equal to or greater than the reference value).
As shown in Fig. 11A, the frequency distribution of the pixels located at edge regions can be expressed as an edge histogram G2. Specifically, the X axis of the edge histogram G2 represents the luminance value, and the Y axis represents the number of pixels whose luminance values differ from those of adjacent pixels by an amount equal to or greater than the reference value.
As shown in Fig. 11A, the number of pixels located at edge regions is largest near the luminance value p. At luminance values other than those around the luminance value p, the number of pixels whose luminance values differ from those of adjacent pixels by an amount equal to or greater than the reference value is very small.
In order to intuitively display the edge regions of the image displayed on the display device 100, the first mapping function generation module 231 may allocate a wider portion of the luminance range displayable on the display device 100 to luminance ranges containing many edge-region pixels, and a narrower portion of the displayable luminance range to luminance ranges containing few edge-region pixels.
Specifically, as shown in Fig. 11B, the first mapping function generation module 231 may generate a first mapping function MF1 whose slope is large at luminance values where the number of pixels located at edge regions is large, and whose slope is small at luminance values where the number of pixels located at edge regions is small. Specifically, to generate the first mapping function MF1, the first mapping function generation module 231 may determine the cumulative edge histogram, obtained by integrating the edge histogram G2, as the first mapping function MF1.
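The construction of MF1 as a cumulative (integrated) edge histogram, rescaled to the first-region display range [n, L2max], might be sketched as follows; the discrete per-bin representation is an assumption:

```python
def first_mapping_function(edge_hist, n, l2max):
    """Build MF1 as the normalized cumulative edge histogram, rescaled so
    outputs span [n, l2max]. Bins with many edge pixels contribute large
    steps (steep slope); bins with few contribute small steps."""
    total = sum(edge_hist)
    acc, mf1 = 0, []
    for count in edge_hist:
        acc += count
        mf1.append(n + (acc / total) * (l2max - n))
    return mf1


# Three luminance bins with edge-pixel counts 1, 1, 2: the last bin gets
# the widest slice of the output range.
mf1 = first_mapping_function([1, 1, 2], n=100.0, l2max=500.0)
# -> [200.0, 300.0, 500.0]
```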
However, the first mapping function MF1 generated by the first mapping function generation module 231 is not limited thereto.
For example, when the edge histogram G2 is as shown in Fig. 12A, the first mapping function generation module 231 may generate the first mapping function MF1 shown in Fig. 12B.
Specifically, as shown in Fig. 12B, the first mapping function generation module 231 may generate a first mapping function MF1 having a constant slope over the luminance values where the number of pixels located at edge regions is large.
As another example, in order to intuitively display the texture of the image, the first mapping function generation module 231 may extract the pixels of regions in which the luminance value varies within a constant range. A region in which the luminance value varies within a constant range can be determined to be a region presenting texture.
In addition, the first mapping function generation module 231 may calculate the frequency distribution of the pixels of the texture-presenting regions (the regions in which the luminance value varies within a constant range).
The first mapping function generation module 231 may then generate a first mapping function whose slope is large near luminance values where the number of pixels of texture-presenting regions is large, and whose slope is small near luminance values where the number of pixels of texture-presenting regions is small.
As yet another example, in order to intuitively display the gradation of the image, the first mapping function generation module 231 may extract the pixels of regions in which the luminance value changes constantly and continuously. A region in which the luminance value changes constantly and continuously can be determined to be a region presenting gradation.
In addition, the first mapping function generation module 231 may calculate the frequency distribution of the pixels of the gradation-presenting regions (the regions in which the luminance value changes constantly and continuously).
The first mapping function generation module 231 may then generate a first mapping function whose slope is large at luminance values where the number of pixels of gradation-presenting regions is large, and whose slope is small at luminance values where the number of pixels of gradation-presenting regions is small.
In this way, the first mapping function generation module 231 can generate various first mapping functions MF1 for intuitively displaying each piece of image information included in the image of the first region R1.
Specifically, in order to intuitively display each piece of image information included in the image of the first region R1, the first mapping function generation module 231 may calculate the frequency distribution of the pixels containing the feature information, and generate the first mapping function MF1 based on the calculated frequency distribution.
Next, the second mapping function generation module 232 will be described.
Fig. 13 shows an example in which the graphics processing unit shown in Fig. 3 generates the second mapping function based on the image of the second region.
As shown in Fig. 13, the second mapping function generation module 232 generates the second mapping function MF2 based on the image of the second region R2.
Here, the second mapping function MF2 refers to a parametric function that converts the image of the second region R2 of the original image I1, which is a high dynamic range image, into a low dynamic range image. In other words, the image of the second region R2 is converted into a low dynamic range image by the second mapping function MF2.
Specifically, the second mapping function MF2 converts a high dynamic range image whose luminance values range between the minimum original luminance value L1min and the first reference luminance value m into a low dynamic range image whose luminance values range between the minimum display luminance value L2min and the second reference luminance value n. Here, as described above, the second reference luminance value n may be set by the user, or may be suitably set in advance by the designer of the display device 100.
Specifically, the second mapping function generation module 232 extracts the luminance information of the second region R2, and generates the second mapping function MF2 based on the extracted luminance information.
As shown in Fig. 13A, the luminance information of the second region R2 can be obtained based on the luminance histogram G1 of the original image I1.
The second mapping function generation module 232 may generate a second mapping function MF2 that sufficiently preserves the brightness of the second region R2 and prevents the image information included in the image of the second region R2 from being lost.
For example, the second mapping function generation module 232 may allocate a wider portion of the luminance range displayable on the display device 100 to luminance ranges containing many pixels, and a narrower portion of the displayable luminance range to luminance ranges containing few pixels.
Specifically, as shown in Fig. 13A, when the number of pixels decreases as the luminance value increases, the second mapping function generation module 232 may generate a second mapping function MF2 whose slope, as shown in Fig. 13B, decreases as the luminance value increases. Specifically, to generate the second mapping function MF2, the second mapping function generation module 232 may generate the second mapping function MF2 based on the cumulative luminance histogram obtained by integrating the luminance histogram G1.
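Analogously, MF2 as the cumulative luminance histogram of R2 scaled to [L2min, n] can be sketched as below; with a histogram that falls off as luminance rises, the increments between successive outputs shrink, matching the decreasing slope of Fig. 13B (the discrete binning is an assumption):

```python
def second_mapping_function(lum_hist, l2min, n):
    """Build MF2 as the normalized cumulative luminance histogram of the
    second region, rescaled so outputs span [l2min, n]."""
    total = sum(lum_hist)
    acc, mf2 = 0, []
    for count in lum_hist:
        acc += count
        mf2.append(l2min + (acc / total) * (n - l2min))
    return mf2


# A histogram dominated by dark pixels: the increments between outputs
# shrink as luminance increases (25.0, 12.5, 6.25, 6.25).
mf2 = second_mapping_function([8, 4, 2, 1, 1], l2min=0.0, n=100.0)
# -> [50.0, 75.0, 87.5, 93.75, 100.0]
```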
However, the second mapping function MF2 generated by the second mapping function generation module 232 is not limited thereto.
For example, the second mapping function generation module 232 may generate a linear tone mapping function, a logarithmic tone mapping function, or the like.
A linear tone mapping function converts a high dynamic range image into a low dynamic range image such that the brightness values of the high dynamic range image and the brightness values of the low dynamic range image have a linear relationship.
A linear tone mapping function, being a tone mapping function that maintains the contrast between pixels, has the advantage that little visual difference arises between the original image I1 and the display image I2.
A logarithmic tone mapping function converts a high dynamic range image into a low dynamic range image such that the brightness values of the high dynamic range image and the brightness values of the low dynamic range image have a logarithmic relationship.
A logarithmic tone mapping function exploits the characteristic that, according to Weber's law, the response of the human visual system increases approximately logarithmically. Weber's law indicates that the human eye can sense slight changes of lightness in dark areas but has difficulty sensing large changes of lightness in bright areas.
A logarithmic tone mapping function generally increases the lightness of an image according to the characteristics of a logarithmic function, producing a high-contrast effect in the dark areas of the image.
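A logarithmic curve of this kind can be written compactly (a sketch under the usual log-mapping convention, not the patent's concrete formula, which is not disclosed):

```python
import math

def log_tone_map(luminance, l_max, out_max=1.0):
    """Logarithmic tone mapping: output grows as log(1 + L),
    mirroring the Weber-law response of human vision, so dark-area
    contrast is expanded while bright-area changes are compressed."""
    return out_max * math.log1p(luminance) / math.log1p(l_max)

# Dark values gain proportionally more output range than bright ones:
dark_gain = log_tone_map(10, 1000) - log_tone_map(0, 1000)
bright_gain = log_tone_map(1000, 1000) - log_tone_map(990, 1000)
# dark_gain > bright_gain
```

The inequality at the end is the "high-contrast effect in the dark areas" the text describes.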
In this way, the second mapping function generation module 232 may generate the second mapping function based on a logarithmic function, or generate the second mapping function based on a zone system.
Next, the tone mapping module 240 will be described.
FIG. 14 shows an exemplary tone mapping function generated by the graphics processing unit shown in FIG. 3. FIG. 15 shows a result obtained when a display device of the related art performs tone mapping on a high dynamic range image. FIG. 16 shows a result obtained when the display device according to the embodiment performs tone mapping on a high dynamic range image.
As shown in FIG. 14, FIG. 15 and FIG. 16, the tone mapping module 240 combines the first mapping function MF1 and the second mapping function MF2 to generate a tone mapping function MF, and performs tone mapping on the original image I1 using the tone mapping function MF.
Here, the tone mapping function MF refers to a parametric function that converts the original image I1, which is a high dynamic range image, into a low dynamic range image. In other words, according to the tone mapping function MF, the original image I1 is converted into the display image I2, which is a low dynamic range image.
Specifically, the tone mapping function MF converts the original image I1 into the display image I2, where the brightness values of the original image I1 lie in the range between the minimum original brightness value L1min and the maximum original brightness value L1max, and the brightness values of the display image I2 lie in the range between the minimum display brightness value L2min and the maximum display brightness value L2max.
When the first mapping function MF1 shown in FIG. 11B and the second mapping function MF2 shown in FIG. 13B are combined, the tone mapping function MF according to the embodiment shown in FIG. 14 can be generated.
The tone mapping function MF generated in this way can maintain the lightness of the image of the second area R2, which is a low-brightness area, and maintain the characteristic information of the image of the first area R1, which is a high-brightness area.
The tone mapping module 240 may perform tone mapping on the original image I1 and generate the first image I2. Specifically, the tone mapping module 240 may apply the tone mapping function MF to all pixels included in the original image I1 and thereby perform tone mapping.
Here, the first image I2 has the same brightness range as the brightness range that can be output from the display device 100.
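Applying a tone mapping function MF pixel by pixel, from the HDR brightness range [L1min, L1max] into the display range [L2min, L2max], can be sketched as follows (hypothetical names; `mf` stands in for any curve defined on normalized input, which is one plausible way to parameterize the function the patent describes):

```python
def apply_tone_mapping(luminances, mf, l1_min, l1_max, l2_min, l2_max):
    """Apply a tone mapping curve mf (defined on [0, 1]) to every
    pixel luminance, taking the HDR range [l1_min, l1_max] into the
    display range [l2_min, l2_max]."""
    out = []
    for lum in luminances:
        t = (lum - l1_min) / (l1_max - l1_min)      # normalize HDR value
        out.append(l2_min + (l2_max - l2_min) * mf(t))
    return out

# Identity curve: endpoints of the HDR range land on the display limits.
mapped = apply_tone_mapping([0.0, 500.0, 1000.0], lambda t: t,
                            0.0, 1000.0, 0.0, 100.0)
# → [0.0, 50.0, 100.0]
```

A non-identity `mf` (for example the cumulative-histogram curve above) changes only the shape of the mapping, not its endpoints.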
In this way, compared with a tone mapping function MF3 based on a logarithmic function, the tone mapping function MF generated by the tone mapping module 240, which maintains the lightness of the low-brightness area and the characteristic information of the high-brightness area, can display the image in the high-brightness area more intuitively.
For example, when tone mapping is performed on the original image I1 shown in FIG. 5 using the logarithmic tone mapping function MF3 shown in FIG. 15A, the display image I2 shown in FIG. 15B is output.
Further, when tone mapping is performed on the original image I1 shown in FIG. 5 using the tone mapping function MF shown in FIG. 16A, the display image I2 shown in FIG. 16B is output.
As shown in FIG. 15B, in the display image I2 on which tone mapping has been performed by the logarithmic tone mapping function MF3, the image in the high-brightness area R1 is not displayed intuitively. On the other hand, as shown in FIG. 16B, in the display image I2 on which tone mapping has been performed by the tone mapping function MF generated by the tone mapping module 240, the image in the high-brightness area R1 is displayed intuitively.
Next, the detail enhancement module 250 will be described.
To provide the user with a more intuitive image, detail enhancement involves processing the image I2 on which tone mapping has been performed.
Such detail enhancement may include various image processing techniques such as contrast enhancement, histogram equalization, image sharpening and image smoothing, where contrast enhancement maximizes the difference between the bright and dark areas of an image; histogram equalization adjusts the histogram so that an image with a low-contrast distribution becomes an image with a uniform contrast distribution; image sharpening makes image transitions crisper; and image smoothing makes image transitions gentler.
The detail enhancement module 250 may process the first image I2 using various previously known image processing techniques and output the second image I3 on which detail enhancement has been performed.
In this way, the graphics processing unit 200 may divide the original image I1 into the first area R1 (that is, the high-brightness area) and the second area R2 (that is, the low-brightness area), perform tone mapping on the first area R1 based on image features such as the edges, texture or grayscale of the image, and perform tone mapping on the second area R2 based on the lightness of the image.
Therefore, the graphics processing unit 200 can process the original image I1 so that the original image I1, which is a high dynamic range image, is displayed intuitively on the display panel 143 having a low dynamic range.
Hereinafter, the operation of the display device 100 according to the embodiment will be described.
FIG. 17 shows an exemplary high dynamic range image display operation of the display device according to the embodiment.
As shown in FIG. 17, the high dynamic range image display operation (1000) of the display device 100 will be described.
The display device 100 receives an image from the outside (1010). The display device 100 may receive content from the outside through the content receiving unit 130, and extract the image data ID and the metadata MD included in the received content.
The metadata MD is data including information on the image data ID, and the metadata MD may include brightness information in units of scenes or brightness information in units of frames. Specifically, the metadata MD may include the maximum brightness value, minimum brightness value and average brightness value of the entire content C; the maximum brightness value, minimum brightness value and average brightness value of the images included in each scene; or the maximum brightness value, minimum brightness value and average brightness value of the image forming each frame.
Having extracted the image data ID and the metadata MD, the display device 100 then linearizes the received image data (1020). To obtain the original image I1, the display device 100 may linearly transform the image data.
Specifically, the graphics processing unit 200 of the display device 100 may use the second nonlinear mapping function F2 to restore the image data to the original image I1. Further, the graphics processing unit 200 may calculate the brightness information of the original image I1 based on the color value of each of the pixels included in the restored original image I1.
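Deriving per-pixel brightness from colour values can be done, for example, with the Rec. 709 relative-luminance weights (an assumption for illustration; the patent does not state which weighting it uses):

```python
def pixel_luminance(r, g, b):
    """Relative luminance of a linear RGB pixel using Rec. 709
    weights -- one plausible way to obtain brightness from colour."""
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def image_brightness_info(pixels):
    """Minimum, maximum and average brightness of an image,
    derived directly from its pixel colours."""
    lums = [pixel_luminance(*p) for p in pixels]
    return min(lums), max(lums), sum(lums) / len(lums)

lo, hi, avg = image_brightness_info([(0.0, 0.0, 0.0), (1.0, 1.0, 1.0)])
# black pixel → 0.0, white pixel → 1.0, average → 0.5
```

These derived values serve the same role as the L1min, L1max and average brightness values that the metadata MD may carry.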
After the linearization, the display device 100 divides the original image I1 into a plurality of areas (1030). The display device 100 may divide the original image I1 into the first area R1 and the second area R2, where the first area R1 is a high-brightness area and the second area R2 is a low-brightness area.
Specifically, the graphics processing unit 200 of the display device 100 may divide the original image I1 into the first area R1 and the second area R2, where the first area R1 includes pixels whose brightness values are equal to or greater than the reference brightness value m, and the second area R2 includes pixels whose brightness values are less than the reference brightness value m.
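The threshold-based division into R1 and R2 amounts to a single comparison per pixel, which can be sketched as follows (hypothetical names, 1-D pixel list for brevity):

```python
def divide_regions(luminances, m):
    """Split pixel indices into a first (high-brightness) area R1 and
    a second (low-brightness) area R2 by a reference brightness m."""
    r1 = [i for i, lum in enumerate(luminances) if lum >= m]
    r2 = [i for i, lum in enumerate(luminances) if lum < m]
    return r1, r2

r1, r2 = divide_regions([0.2, 0.9, 0.5, 0.95], m=0.8)
# r1 → [1, 3] (pixels at or above m), r2 → [0, 2]
```

Every pixel falls in exactly one area, so the two index lists together cover the whole image.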
Having divided the image, the display device 100 then generates the first mapping function MF1 and the second mapping function MF2 (1040). The display device 100 may generate the first mapping function MF1 for the image of the first area R1 and the second mapping function MF2 for the image of the second area R2.
Specifically, the graphics processing unit 200 of the display device 100 may extract, from the image of the first area R1, pixels including characteristic information such as edges, texture and grayscale, and generate the first mapping function MF1 based on the histogram of the pixels including the characteristic information.
Further, the graphics processing unit 200 may generate the second mapping function MF2 based on the brightness histogram of the image of the second area R2.
After generating the first mapping function MF1 and the second mapping function MF2, the display device 100 generates a tone mapping function and performs tone mapping on the original image I1 (1050). The display device 100 may use the tone mapping function MF, in which the first mapping function MF1 and the second mapping function MF2 are combined, and generate the first image I2 from the original image I1.
Specifically, the graphics processing unit 200 of the display device 100 may combine the first mapping function MF1 and the second mapping function MF2 and generate the tone mapping function MF. Further, the graphics processing unit 200 may apply the tone mapping function MF to the original image I1 and generate the first image I2 from the original image I1.
Having performed the tone mapping, the display device 100 then performs detail enhancement on the first image I2 (1060). To display the first image I2 more intuitively, the display device 100 may perform image processing such as contrast enhancement on the first image I2.
Specifically, the graphics processing unit 200 of the display device 100 may perform detail enhancement such as contrast enhancement on the first image I2 and thereby generate the second image I3.
Having performed the detail enhancement, the display device 100 then displays the image (1070). The display device 100 may display the second image I3 through the display unit 140.
In this way, the display device 100 may divide the original image I1 into the first area R1 (that is, the high-brightness area) and the second area R2 (that is, the low-brightness area), perform tone mapping on the first area R1 based on image features such as the edges, texture or grayscale of the image, and perform tone mapping on the second area R2 based on the lightness of the image.
Therefore, the display device 100 can process the original image I1 so that the original image I1, which is a high dynamic range image, is displayed intuitively on the display panel 143 having a low dynamic range.
The exemplary structure and operation of the display device 100 according to the embodiment and of the graphics processing unit 200 included therein have been described above.
However, the graphics processing unit included in the display device 100 is not limited to the graphics processing unit 200 shown in FIG. 3, and various display devices 100 may include various graphics processing units.
Hereinafter, various graphics processing units included in the display device 100 according to embodiments will be described. Configurations identical to those of the graphics processing unit 200 described above are designated by like reference numerals throughout.
FIG. 18 shows another exemplary graphics processing unit included in the display device according to the embodiment. FIG. 19 and FIG. 20 show examples in which the graphics processing unit shown in FIG. 18 generates a tone mapping function.
As shown in FIG. 18, the graphics processing unit 200' may include: an image receiving module 205 configured to receive the image data ID and the metadata MD; a linearization module 210 configured to linearly transform the image data; a region division module 220 configured to divide the image based on brightness; a first mapping function generation module 231 configured to generate the tone mapping function of the high-brightness area; a tone mapping module 240' configured to perform tone mapping; and a detail enhancement module 250 configured to perform post-processing operations on the image on which tone mapping has been performed.
The image receiving module 205 extracts the image data ID and the metadata MD from the content C received through the content receiving unit 130. Here, the content C includes the image data ID representing the original image and the metadata MD related to the image data ID. The metadata MD may include the brightness information of the image data ID. When the content C is, for example, a video, the metadata MD may include at least one of the brightness information of the entire content C, the brightness information of each scene included in the content C, and the brightness information of each frame included in the content C.
The linearization module 210 linearizes the image data ID received from the image receiving module 205 and analyzes the brightness of the linearized image. Specifically, when the metadata MD of the content C does not include the maximum brightness value L1max, the minimum brightness value L1min or the average brightness value, the linearization module 210 may directly calculate the maximum brightness value L1max, the minimum brightness value L1min and the average brightness value from the linearized original image.
The region division module 220 divides the original image into the first area R1 and the second area R2 based on the first reference brightness value m according to the brightness of the plurality of pixels. Specifically, the region division module 220 may divide the original image into a first area and a second area, where the first area includes pixels whose brightness is equal to or greater than the first reference brightness value m, and the second area includes pixels whose brightness is less than the first reference brightness value m.
The first mapping function generation module 231 extracts pixels including characteristic information from the first area R1 and generates the first mapping function MF1 based on the histogram of the extracted pixels. Here, the characteristic information may include edge information of the image included in the first area R1, texture information of the image and grayscale information of the image.
The tone mapping module 240' generates the tone mapping function MF based on the original image I1 and the first mapping function MF1, and performs tone mapping on the original image I1 using the tone mapping function MF.
Here, the tone mapping function MF refers to a parametric function that converts the original image I1, which is a high dynamic range image, into a low dynamic range image. In other words, the tone mapping function MF converts the original image I1 into the display image I2, which is a low dynamic range image.
First, the tone mapping module 240' generates an interim tone mapping function MF' based on the brightness information of the original image I1. Here, the interim tone mapping function MF' may be used to ultimately generate the tone mapping function MF.
As shown in FIG. 19A, the brightness information of the original image I1 can be acquired based on the brightness histogram G1 of the original image I1.
The tone mapping module 240' may generate the interim tone mapping function MF' so as to sufficiently maintain the lightness of the original image I1 and prevent loss of the image information included in the original image I1.
For example, the tone mapping module 240' may allocate a wider brightness range displayable on the display device 100 to brightness sections containing many pixels, and allocate a narrower brightness range displayable on the display device 100 to brightness sections containing few pixels.
Specifically, as shown in FIG. 19A, when the number of pixels decreases as the brightness value increases, the tone mapping module 240' may generate the interim tone mapping function MF' whose slope decreases as the brightness value increases, as shown in FIG. 19B. Specifically, to generate the interim tone mapping function MF', the tone mapping module 240' may determine, as the interim tone mapping function MF', a cumulative brightness histogram obtained by integrating the brightness histogram G1.
However, the interim tone mapping function MF' generated by the tone mapping module 240' is not limited thereto.
For example, the tone mapping module 240' may generate a tone mapping function based on a previously known logarithmic function, or generate a tone mapping function based on a previously known zone system.
Having generated the interim tone mapping function MF', the tone mapping module 240' combines the interim tone mapping function MF' with the first mapping function MF1 received from the first mapping function generation module 231, and generates the tone mapping function MF.
The tone mapping module 240' may use various methods to combine the interim tone mapping function MF' and the first mapping function MF1.
For example, the tone mapping module 240' may synthesize the interim tone mapping function MF' of the high-brightness area with the first mapping function MF1, and generate the tone mapping function MF.
Specifically, the tone mapping module 240' may perform normalization so that the output of the first mapping function MF1 has a value between "0" and "1", synthesize the portion of the interim tone mapping function MF' at and above the reference brightness value m with the normalized first mapping function MF1, and thereby generate the tone mapping function MF.
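The patent does not specify the synthesis operation beyond "normalize MF1 to [0, 1] and synthesize it with MF' above m". One plausible reading, sketched below with hypothetical names, is to multiply the normalized MF1 into the interim curve for bright inputs only:

```python
def synthesize(mf_interim, mf1, m, mf1_min, mf1_max):
    """Combine the interim curve MF' with MF1 above the reference
    brightness m. MF1 is normalized to [0, 1] using its output bounds
    (mf1_min, mf1_max) and multiplied into MF' for bright inputs --
    an assumed reading of 'synthesize'; dark inputs keep MF' as is."""
    def mf(lum):
        if lum < m:
            return mf_interim(lum)
        weight = (mf1(lum) - mf1_min) / (mf1_max - mf1_min)  # in [0, 1]
        return mf_interim(lum) * weight
    return mf

mf = synthesize(lambda x: x, lambda x: 0.5, m=0.5, mf1_min=0.0, mf1_max=1.0)
# below m the interim curve is untouched: mf(0.25) → 0.25
# above m the normalized MF1 (here 0.5) scales it: mf(0.8) → 0.4
```

This matches the text's effect: MF' maps the whole image, and MF1 re-maps only the first area R1.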
Therefore, tone mapping is performed on the original image I1 by the interim tone mapping function MF', and tone mapping can be performed again by the first mapping function MF1 on the image included in the first area R1 of the original image I1.
As another example, the tone mapping module 240' may replace the interim tone mapping function MF' of the high-brightness area with the first mapping function MF1. Specifically, the tone mapping module 240' may replace the portion of the interim tone mapping function MF' at and above the reference brightness value m with the first mapping function MF1.
In this case, the tone mapping module 240' may calculate the low-dynamic-range brightness value l corresponding to the reference brightness value m, and scale the output range of the first mapping function MF1 based on the difference between the calculated low-dynamic-range reference brightness value l and the low-dynamic-range maximum brightness value L2max. Specifically, the tone mapping module 240' may scale the output range of the first mapping function MF1 so that the output of the first mapping function MF1 lies in the range between the low-dynamic-range reference brightness value l and the low-dynamic-range maximum brightness value L2max.
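The rescaling step can be sketched as a linear remapping of MF1's output span onto [l, L2max] (hypothetical names; the bounds mf1_min and mf1_max of MF1's raw output are assumed known):

```python
def scale_mf1(mf1, mf1_min, mf1_max, l_ref, l2_max):
    """Rescale the first mapping function so its output spans
    [l_ref, l2_max]: from the low-dynamic-range brightness l that
    corresponds to the reference value m, up to the display maximum
    L2max."""
    def scaled(lum):
        t = (mf1(lum) - mf1_min) / (mf1_max - mf1_min)  # normalize
        return l_ref + (l2_max - l_ref) * t
    return scaled

s = scale_mf1(lambda x: x, mf1_min=0.0, mf1_max=1.0, l_ref=80.0, l2_max=100.0)
# s(0.0) → 80.0, s(1.0) → 100.0
```

After this scaling, splicing MF1 in above m leaves the combined curve continuous at the reference brightness.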
When the interim tone mapping function MF' shown in FIG. 19B and the first mapping function MF1 shown in FIG. 11B are combined, the tone mapping function MF shown in FIG. 20 can be generated.
The tone mapping function MF generated in this way can maintain the lightness of the original image I1 and maintain the characteristic information of the image of the first area R1, which is a high-brightness area.
Having generated the tone mapping function MF, the tone mapping module 240' may perform tone mapping on the original image I1 using the tone mapping function MF and generate the first image I2. Specifically, the tone mapping module 240' may apply the tone mapping function MF to all pixels included in the original image I1 and thereby perform tone mapping.
Here, the first image I2 has the same brightness range as the brightness range that can be output from the display device 100.
In this way, compared with a tone mapping function based on a logarithmic function, the tone mapping function MF generated by the tone mapping module 240', which maintains the lightness of the low-brightness area and the characteristic information of the high-brightness area, can display the image in the high-brightness area more intuitively.
To provide the user with a more intuitive image, the detail enhancement module 250 processes the image I2 on which tone mapping has been performed. Here, detail enhancement may include various image processing techniques such as contrast enhancement, which maximizes the difference between the bright and dark areas of an image; histogram equalization, which adjusts the histogram so that an image with a low-contrast distribution becomes an image with a uniform contrast distribution; image sharpening, which makes image transitions crisper; and image smoothing, which makes image transitions gentler.
Hereinafter, the operation of the display device 100 according to the embodiment will be described.
FIG. 21 shows another exemplary high dynamic range image display operation of the display device according to the embodiment.
As shown in FIG. 21, the high dynamic range image display operation (1100) of the display device 100 will be described.
The display device 100 receives an image from the outside (1110). The display device 100 may receive content from the outside through the content receiving unit 130, and extract the image data ID and the metadata MD included in the received content.
The metadata MD is data including information on the image data ID, and the metadata MD may include brightness information in units of scenes or brightness information in units of frames. Specifically, the metadata MD may include the maximum brightness value, minimum brightness value and average brightness value of the entire content C; the maximum brightness value, minimum brightness value and average brightness value of the images included in each scene; or the maximum brightness value, minimum brightness value and average brightness value of the image forming each frame.
Having extracted the image data ID and the metadata MD, the display device 100 then linearizes the received image data (1120). To obtain the original image I1, the display device 100 may linearly transform the image data.
Specifically, the graphics processing unit 200 of the display device 100 may use the second nonlinear mapping function F2 to restore the image data to the original image I1. Further, the graphics processing unit 200 may calculate the brightness information of the original image I1 based on the color value of each of the pixels included in the restored original image I1.
After the linearization, the display device 100 detects the first area from the original image I1 (1130). The display device 100 may detect, from the original image I1, the first area R1, which is a high-brightness area.
Specifically, the graphics processing unit 200 of the display device 100 may detect, from the original image I1, the first area R1 including pixels whose brightness values are equal to or greater than the reference brightness value m.
Having detected the first area R1, the display device 100 then generates the first mapping function MF1 (1140). The display device 100 may generate the first mapping function MF1 for the image of the first area R1.
Specifically, the graphics processing unit 200 of the display device 100 may extract, from the image of the first area R1, pixels including characteristic information such as edges, texture and grayscale, and generate the first mapping function MF1 based on the histogram of the pixels including the characteristic information.
After generating the first mapping function MF1, the display device 100 generates a tone mapping function and performs tone mapping on the original image I1 (1150). The display device 100 may generate the interim tone mapping function MF' and generate the first image I2 from the original image I1 using the interim tone mapping function MF' and the first mapping function MF1.
Specifically, the graphics processing unit 200 of the display device 100 may generate the interim tone mapping function MF' based on the brightness histogram of the original image I1, combine the interim tone mapping function MF' with the first mapping function MF1, and generate the tone mapping function MF.
Further, the graphics processing unit 200 may apply the tone mapping function MF to the original image I1 and thereby generate the first image I2 from the original image I1.
Having performed the tone mapping, the display device 100 then performs detail enhancement on the first image I2 (1160). To display the first image I2 more intuitively, the display device 100 may perform image processing such as contrast enhancement on the first image I2.
Specifically, the graphics processing unit 200 of the display device 100 may perform detail enhancement such as contrast enhancement on the first image I2 and thereby generate the second image I3.
Having performed the detail enhancement, the display device 100 then displays the image (1170). The display device 100 may display the second image I3 through the display unit 140.
In this way, the display device 100 may detect the first area R1 from the original image I1, perform tone mapping on the original image I1 based on the lightness of the image, and then perform tone mapping on the first area R1 based on image features such as edges, texture or grayscale.
Therefore, the display device 100 can process the original image I1 so that the original image I1, which is a high dynamic range image, is displayed intuitively on the display panel 143 having a low dynamic range.
FIG. 22 shows another exemplary graphics processing unit included in the display device according to the embodiment. Further, FIG. 23 shows an example in which the graphics processing unit shown in FIG. 22 performs tone mapping on the image of the first area, and FIG. 24 shows an example in which the graphics processing unit shown in FIG. 22 performs tone mapping on the image of the second area. Further, FIG. 25 shows another exemplary high dynamic range image display operation of the display device according to the embodiment.
As shown in FIG. 22, the graphics processing unit 200'' may include: an image receiving module 205 configured to receive the image data ID and the metadata MD; a linearization module 210 configured to linearly transform the image data; a region division module 220 configured to divide the image according to brightness; a first tone mapping module 241 configured to perform tone mapping on the high-brightness area; a second tone mapping module 242 configured to perform tone mapping on the low-brightness area; an image synthesis unit 260 configured to synthesize the images on which tone mapping has been performed; and a detail enhancement module 250 configured to perform post-processing operations on the image.
The image receiving module 205 extracts the image data ID and the metadata MD from the content C received through the content receiving unit 130. Here, the content C includes the image data ID representing the original image and the metadata MD related to the image data ID. The metadata MD may include the brightness information of the image data ID. When the content C is, for example, a video, the metadata MD may include at least one of the brightness information of the entire content C, the brightness information of each scene included in the content C, and the brightness information of each frame included in the content C.
The linearization module 210 linearizes the image data ID received from the image receiving module 205 and analyzes the brightness of the linearized image. Specifically, when the metadata MD of the content C does not include the maximum brightness value L1max, the minimum brightness value L1min or the average brightness value, the linearization module 210 may directly calculate the maximum brightness value L1max, the minimum brightness value L1min and the average brightness value from the linearized original image.
The region division module 220 divides the original image into the first area R1 and the second area R2 based on the first reference brightness value m according to the brightness of the plurality of pixels. Specifically, the region division module 220 may divide the original image into a first area and a second area, where the first area includes pixels whose brightness is equal to or greater than the first reference brightness value m, and the second area includes pixels whose brightness is less than the first reference brightness value m.
The first tone mapping module 241 generates the first mapping function MF1 based on the image of the first area R1, and performs tone mapping on the image of the first area R1 using the first mapping function MF1.
Here, the first mapping function MF1 refers to a parametric function that converts the image of the first area R1, which is a high dynamic range image, into a low dynamic range image. In other words, the first mapping function MF1 converts the image of the first area R1 into a low dynamic range image.
Specifically, the first mapping function MF1 converts a high dynamic range image whose brightness values lie in the range between the first reference brightness value m and the maximum original brightness value L1max into a low dynamic range image whose brightness values lie in the range between the second reference brightness value n and the maximum display brightness value L2max.
The first tone mapping module 241 extracts pixels including characteristic information and generates the first mapping function MF1 based on the histogram of the extracted pixels. Here, the characteristic information may include edge information of the image included in the first area R1, texture information of the image and grayscale information of the image.
For example, to display edge areas intuitively, the first tone mapping module 241 may extract, from the pixels included in the first area R1, pixels whose brightness values differ from those of adjacent pixels by an amount equal to or greater than a reference value, and generate the first mapping function MF1 based on the histogram of the extracted pixels.
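The edge-pixel selection described above can be sketched in one dimension (hypothetical names; a real implementation would examine 2-D neighbourhoods, which the patent leaves unspecified):

```python
def edge_pixels(row, reference_diff):
    """Select pixels whose brightness differs from an adjacent pixel
    by at least reference_diff -- a 1-D sketch of picking the edge
    pixels whose histogram then drives the first mapping function."""
    picked = []
    for i, lum in enumerate(row):
        neighbours = row[max(0, i - 1):i] + row[i + 1:i + 2]
        if any(abs(lum - n) >= reference_diff for n in neighbours):
            picked.append(i)
    return picked

edges = edge_pixels([0.1, 0.1, 0.9, 0.9, 0.9], reference_diff=0.5)
# only the two pixels straddling the brightness jump qualify → [1, 2]
```

Building MF1 from the histogram of only these pixels concentrates the output brightness range where the edges are, which is what makes the edge areas display intuitively.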
Further, the first tone mapping module 241 performs tone mapping on the image of the first area R1 according to the first mapping function MF1 and generates the first area display image I2a.
For example, the first tone mapping module 241 may perform tone mapping on the image of the first area R1 shown in FIG. 8 using the first mapping function MF1 shown in FIG. 23A, and output the first area display image I2a as shown in FIG. 23B.
Image of the second tone mapping block 242 based on second area R2 generates the second mapping function MF2, and uses the Two mapping function MF2 perform tone mapping to the image of second area R2.
Here, the second mapping function MF2 refers to be converted into the image as high dynamic range images of second area R2 As the parametric function of low dynamic range echograms.In other words, the image of second area R2 is converted into by the second mapping function MF2 Low dynamic range echograms.
Specifically, the second mapping function MF2 by brightness value between minimum original luminance value L1min and the first reference brightness value High dynamic range images between m scopes are converted into brightness value between minimum display brightness value L2min and the second reference brightness value n Low dynamic range echograms between scope.
The second tone mapping module 242 generates the second mapping function MF2 based on the brightness histogram of the image of the second area R2. Specifically, the second tone mapping module 242 may generate the second mapping function MF2 based on a cumulative brightness histogram obtained by accumulating the brightness histogram of the image of the second area R2.
However, the present embodiment is not limited thereto. The second tone mapping module 242 may also generate the second mapping function MF2 based on a linear function, a logarithmic function, or the like.
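A cumulative-histogram tone curve of the kind just described can be sketched as below. This is a hedged illustration, not the patent's implementation: luminances are assumed normalized to [0, 1], and `m`, `l2_min`, and `n` stand in for the first reference brightness value, L2min, and the second reference brightness value.

```python
import numpy as np

def second_mapping_function(luma, m=0.5, l2_min=0.0, n=0.4, bins=64):
    """Build a tone curve from the accumulated brightness histogram of the
    low-brightness region: inputs in [0, m] are mapped monotonically into
    the display range [l2_min, n]."""
    hist, edges = np.histogram(luma, bins=bins, range=(0.0, m))
    cdf = np.cumsum(hist).astype(np.float64)
    cdf /= max(cdf[-1], 1.0)                  # normalize to [0, 1]
    curve = l2_min + cdf * (n - l2_min)       # rescale into [L2min, n]
    centers = 0.5 * (edges[:-1] + edges[1:])
    return np.interp(luma, centers, curve)    # apply the curve per pixel
```

Because the curve follows the cumulative distribution, dark regions where most pixels cluster receive most of the output range, which is the usual rationale for histogram-based tone mapping.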
In addition, the second tone mapping module 242 performs tone mapping on the image of the second area R2 according to the second mapping function MF2, and generates a second area display image I2b.
For example, when tone mapping is performed on the image of the second area R2 shown in FIG. 8 using the second mapping function MF2 shown in FIG. 24A, the second tone mapping module 242 may generate the second area display image I2b shown in FIG. 24B.
The image synthesis unit 260 synthesizes the first area display image I2a received from the first tone mapping module 241 with the second area display image I2b received from the second tone mapping module 242, and generates a first image I2.
For example, the image synthesis unit 260 may synthesize the first area display image I2a shown in FIG. 23B with the second area display image I2b shown in FIG. 24B, and generate the first image I2.
In this way, the first tone mapping module 241 may perform tone mapping using the characteristic information of the high-brightness area, and the second tone mapping module 242 may perform tone mapping using the brightness information of the low-brightness area. In addition, the image synthesis unit 260 may synthesize the first area display image I2a output from the first tone mapping module 241 with the second area display image I2b output from the second tone mapping module 242.
To provide the user with a more intuitive image, the detail enhancement module 250 processes the tone-mapped image I2. Here, detail enhancement may include various image processing techniques such as contrast enhancement, which maximizes the difference between the bright and dark areas of an image; histogram equalization, which adjusts the histogram so that an image with a narrow contrast distribution becomes an image with a uniform contrast distribution; image sharpening, which makes transitions in the image crisper; and image smoothing, which makes transitions in the image gentler.
Hereinafter, the operation of the display device 100 according to the embodiment will be described.
FIG. 25 shows another exemplary high dynamic range image display operation of the display device according to the embodiment.
As shown in FIG. 25, the high dynamic range image display operation (1200) of the display device 100 will now be described.
The display device 100 receives an image from the outside (1210). The display device 100 may receive content from the outside through the content receiving unit 130, and extract the image data ID and the metadata MD included in the received content.
The metadata MD is data including information about the image data ID, and may include brightness information in units of scenes or in units of frames. Specifically, the metadata MD may include the maximum brightness value, minimum brightness value, and average brightness value of the entire content C; the maximum brightness value, minimum brightness value, and average brightness value of the images included in each scene; or the maximum brightness value, minimum brightness value, and average brightness value of the image forming each frame.
After extracting the image data ID and metadata MD, the display device 100 linearizes the received image data (1220). To restore the original image I1, the display device 100 may linearize the image data.
Specifically, the image processing unit 200 of the display device 100 may restore the image data to the original image I1 using the second nonlinear mapping function F2. In addition, the image processing unit 200 may calculate the brightness information of the original image I1 based on the color values of the pixels included in the restored original image I1.
After linearization, the display device 100 divides the original image I1 into a plurality of areas (1230). The display device 100 may divide the original image I1 into a first area R1, which is a high-brightness area, and a second area R2, which is a low-brightness area.
Specifically, the image processing unit 200 of the display device 100 may divide the original image I1 into the first area R1, which includes pixels whose brightness values are equal to or greater than the reference brightness value m, and the second area R2, which includes pixels whose brightness values are less than the reference brightness value m.
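The area split and the later synthesis step (1260) can be sketched together in a few lines. This is an illustrative sketch under the assumption that the split is a per-pixel threshold on a normalized luminance map; the function name and signature are invented for this example.

```python
import numpy as np

def split_and_merge(luma, i2a, i2b, m=0.6):
    """Split the luminance map into the high-brightness area R1
    (brightness >= m) and the low-brightness area R2 (brightness < m),
    then recombine the two tone-mapped sub-images the way the image
    synthesis unit would: I2a where the pixel was in R1, I2b elsewhere."""
    r1 = luma >= m                 # first area R1: pixels at or above m
    return np.where(r1, i2a, i2b)  # first image I2
```

Since R1 and R2 partition the pixels, every output pixel comes from exactly one of the two tone-mapped images.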
After dividing the image, the display device 100 generates the first mapping function MF1 and performs tone mapping on the image of the first area R1 (1240).
The display device 100 may generate the first mapping function MF1 for the image of the first area R1. Specifically, the image processing unit 200 of the display device 100 may extract, from the image of the first area R1, pixels including characteristic information such as edges, texture, and gradation, and generate the first mapping function MF1 based on a histogram of the pixels including the characteristic information.
In addition, the display device 100 may perform tone mapping on the image of the first area R1. Specifically, the image processing unit 200 of the display device 100 may perform tone mapping on the image of the first area R1 using the first mapping function MF1, and generate the first area display image I2a.
In addition, the display device 100 generates the second mapping function MF2 and performs tone mapping on the image of the second area R2 (1250).
The display device 100 may generate the second mapping function MF2 for the image of the second area R2. Specifically, the image processing unit 200 of the display device 100 may generate the second mapping function MF2 based on the brightness histogram of the second area R2.
In addition, the display device 100 may perform tone mapping on the image of the second area R2. Specifically, the image processing unit 200 of the display device 100 may perform tone mapping on the image of the second area R2 using the second mapping function MF2, and generate the second area display image I2b.
After performing tone mapping, the display device 100 synthesizes the tone-mapped images (1260). Specifically, the image processing unit 200 of the display device 100 may synthesize the first area display image I2a and the second area display image I2b, and generate the first image I2.
Then, the display device 100 performs detail enhancement on the first image I2 (1270). To display the first image I2 more intuitively, the display device 100 may perform image processing such as contrast enhancement on the first image I2.
Specifically, the image processing unit 200 of the display device 100 may perform detail enhancement such as contrast enhancement on the first image I2, thereby generating a second image I3.
After performing detail enhancement, the display device 100 displays the image (1280). The display device 100 may display the second image I3 through the display unit 140.
In this way, the display device 100 may divide the original image I1 into the first area R1, which is a high-brightness area, and the second area R2, which is a low-brightness area, perform tone mapping on the first area R1 based on image characteristics such as edges, texture, or gradation, and perform tone mapping on the second area R2 based on the lightness of the image.
Accordingly, the display device 100 may process the original image I1 so that the original image I1, which is a high dynamic range image, is displayed intuitively on the display panel 143 having a low dynamic range.
FIG. 26 shows another exemplary image processing unit included in the display device according to the embodiment, and FIG. 27 shows a third mapping function generated by the image processing unit shown in FIG. 26.
As shown in FIGS. 26 and 27, the image processing unit 200''' may include: an image receiving module 205 configured to receive the image data ID and metadata MD; a linearization module 210 configured to linearize the image data; a third mapping function generation module 233 configured to generate a tone mapping function for high dynamic range images; a tone mapping module 240 configured to perform tone mapping; and a detail enhancement module 250 configured to perform post-processing on the tone-mapped image.
The image receiving module 205 extracts the image data ID and metadata MD from the content C received through the content receiving unit 130. Here, the content C includes the image data ID representing the original image and the metadata MD related to the image data ID. The metadata MD may include brightness information of the image data ID. When the content C is, for example, a video, the metadata MD may include at least one of brightness information of the entire content C, brightness information of each scene included in the content C, and brightness information of each frame included in the content C.
The linearization module 210 linearizes the image data ID received from the image receiving module 205 and analyzes the brightness of the linearized image. Specifically, when the metadata MD of the content C does not include the maximum brightness value L1max and the minimum brightness value L1min, the linearization module 210 may calculate the maximum brightness value L1max and the minimum brightness value L1min directly from the linearized original image.
The third mapping function generation module 233 receives the metadata MD from the image receiving module 205, and generates a third mapping function MF3 based on the received metadata MD. Here, the metadata MD may include the brightness information of the entire content C, that is, the maximum brightness value L1max and minimum brightness value L1min of the entire content C.
In addition, the third mapping function MF3 may be defined between the maximum brightness value L1max and the minimum brightness value L1min of the content C. In other words, the maximum value input to the third mapping function MF3 is the maximum brightness value L1max of the content C, and the minimum value input to the third mapping function MF3 is the minimum brightness value L1min of the content C.
In this way, tone mapping may be performed on every original image I1 included in the content C using the third mapping function MF3, which is generated based on the brightness information of the entire content C. In other words, even if the frame or scene changes, the third mapping function MF3 does not change as long as the content C does not change.
The original image I1 may be divided into a low-brightness part and a high-brightness part based on a third reference brightness value Th. The low-brightness part and the high-brightness part may be mapped differently by the third mapping function MF3. In other words, the mapping function for the low-brightness part and the mapping function for the high-brightness part may differ from each other.
In this case, the third reference brightness value Th of the original image I1 may correspond to a target average brightness value Ave_target of the first image I2. In other words, the third reference brightness value Th is mapped to the target average brightness value Ave_target. The average brightness value refers to the average of the brightness values of all pixels output from the display panel 143, and the target average brightness value Ave_target is the target value of the average brightness value. Such a target average brightness value Ave_target may be defined in advance according to the type and performance of the display panel 143.
Specifically, the third reference brightness value Th of the original image I1 may be the same as the predetermined target average brightness value Ave_target of the first image I2.
The third mapping function MF3 may include a mapping function MF3-1 for the low-brightness part and a mapping function MF3-2 for the high-brightness part.
As shown in FIG. 27, the low-brightness part, whose brightness values are less than the third reference brightness value Th, may be mapped linearly. Specifically, when the third reference brightness value Th is the same as the target average brightness value Ave_target, the brightness values of the low-brightness part of the original image I1 are the same as the brightness values of the low-brightness part of the first image I2.
Specifically, the low-brightness part may be mapped according to Equation 2.
[Equation 2]
L2 = G1 × L1
(where L1 denotes the brightness value input to the third mapping function, L2 denotes the brightness value output from the third mapping function, and G1 denotes a constant.)
In Equation 2, the value of G1 may vary according to the third reference brightness value Th and the target average brightness value Ave_target. Specifically, G1 is determined such that the third reference brightness value Th is mapped to the target average brightness value Ave_target.
In particular, when the third reference brightness value Th is the same as the target average brightness value Ave_target, G1 has a value of 1.
As shown in FIG. 27, the high-brightness part, whose brightness values are greater than the third reference brightness value Th, may be mapped nonlinearly. Equation 3 may be used to map the high-brightness part.
[Equation 3]
L2 = a[1 − (L1 − 1)^(2n)] + (1 − a)L1
(where L1 denotes the brightness value input to the third mapping function, L2 denotes the brightness value output from the third mapping function, and a and n denote constants.)
In Equation 3, the value of n may be predetermined, and the value of a may vary according to the maximum brightness value L1max and minimum brightness value L1min of the original images I1 included in the entire content C.
The tone mapping module 240 performs tone mapping on the original image I1 using the third mapping function MF3.
Specifically, the brightness values of all pixels included in the original image I1 are input to the third mapping function MF3, and the first image I2 is generated based on the brightness values output from the third mapping function MF3. In this case, pixels whose brightness values are less than the third reference brightness value Th may be mapped by Equation 2, and pixels whose brightness values are equal to or greater than the third reference brightness value Th may be mapped by Equation 3.
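The piecewise behavior of Equations 2 and 3 can be sketched numerically. This is an illustrative sketch only, assuming luminances normalized to [0, 1]; the values of a and n are placeholders, since the patent only says they are constants determined from the content's brightness range.

```python
import numpy as np

def third_mapping_function(l1, th=0.5, ave_target=0.5, a=0.6, n=2):
    """Piecewise MF3: below the third reference brightness Th the linear
    Equation 2 applies, with G1 chosen so that Th maps to Ave_target;
    at or above Th the nonlinear Equation 3 applies."""
    l1 = np.asarray(l1, dtype=np.float64)
    g1 = ave_target / th                                       # Th -> Ave_target
    low = g1 * l1                                              # Equation 2
    high = a * (1.0 - (l1 - 1.0) ** (2 * n)) + (1.0 - a) * l1  # Equation 3
    return np.where(l1 < th, low, high)
```

Note that Equation 3 sends L1 = 1 to L2 = 1 for any a and n, so the peak brightness is preserved while the curve compresses the highlights below it.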
To provide the user with a more intuitive image, the detail enhancement module 250 processes the tone-mapped image I2. Here, detail enhancement may include various image processing techniques such as contrast enhancement, which maximizes the difference between the bright and dark areas of an image; histogram equalization, which adjusts the histogram so that an image with a narrow contrast distribution becomes an image with a uniform contrast distribution; image sharpening, which makes transitions in the image crisper; and image smoothing, which makes transitions in the image gentler.
Hereinafter, the operation of the display device 100 according to the embodiment will be described.
FIG. 28 shows another exemplary high dynamic range image display operation of the display device according to the embodiment.
As shown in FIG. 28, the high dynamic range image display operation (1300) of the display device 100 will now be described.
The display device 100 receives an image from the outside (1310). The display device 100 may receive content from the outside through the content receiving unit 130, and extract the image data ID and the metadata MD included in the received content.
The metadata MD is data including information about the image data ID, and may include brightness information in units of scenes or in units of frames. Specifically, the metadata MD may include the maximum brightness value, minimum brightness value, and average brightness value of the entire content C; the maximum brightness value, minimum brightness value, and average brightness value of the images included in each scene; or the maximum brightness value, minimum brightness value, and average brightness value of the image forming each frame.
After extracting the image data ID and metadata MD, the display device 100 linearizes the received image data (1320). To restore the original image I1, the display device 100 may linearize the image data.
Specifically, the image processing unit 200 of the display device 100 may restore the image data to the original image I1 using the second nonlinear mapping function F2. In addition, the image processing unit 200 may calculate the brightness information of the original image I1 based on the color values of the pixels included in the restored original image I1.
After linearizing the image, the display device 100 generates the third mapping function MF3 (1330). The display device 100 may generate the third mapping function MF3 based on the metadata MD. Specifically, the display device 100 may generate the third mapping function MF3 based on the maximum brightness value L1max and minimum brightness value L1min of the entire content C.
When the brightness value of a pixel included in the original image I1 is less than the third reference brightness value, the third mapping function MF3 is Equation 2; when the brightness value of a pixel included in the original image I1 is equal to or greater than the third reference brightness value, the third mapping function MF3 is Equation 3.
After generating the third mapping function MF3, the display device 100 performs tone mapping on the original image I1 (1340).
Specifically, the display device 100 inputs the brightness values of all pixels included in the original image I1 to the third mapping function MF3, and generates the first image I2 based on the brightness values output from the third mapping function MF3. In this case, pixels of the original image I1 whose brightness values are less than the third reference brightness value may be mapped by Equation 2, and pixels whose brightness values are equal to or greater than the third reference brightness value may be mapped by Equation 3.
After performing tone mapping, the display device 100 performs detail enhancement on the first image I2 (1350). To display the first image I2 more intuitively, the display device 100 may perform image processing such as contrast enhancement on the first image I2.
Specifically, the image processing unit 200 of the display device 100 may perform detail enhancement such as contrast enhancement on the first image I2, thereby generating a second image I3.
After performing detail enhancement, the display device 100 displays the image (1360). The display device 100 may display the second image I3 through the display unit 140.
In this way, the display device 100 may perform linear tone mapping on the low-brightness area of the original image I1 and nonlinear tone mapping on the high-brightness area.
FIG. 29 shows another exemplary image processing unit included in the display device according to the embodiment, and FIGS. 30 and 31 show a fourth mapping function generated by the image processing unit shown in FIG. 29.
As shown in FIGS. 29, 30, and 31, the image processing unit 200'''' may include: an image receiving module 205 configured to receive the image data ID and metadata MD; a linearization module 210 configured to linearize the image data; a fourth mapping function generation module 234 configured to generate a tone mapping function for high dynamic range images; a tone mapping module 240 configured to perform tone mapping; and a detail enhancement module 250 configured to perform post-processing on the tone-mapped image.
The image receiving module 205 extracts the image data ID and metadata MD from the content C received through the content receiving unit 130. Here, the content C includes the image data ID representing the original image and the metadata MD related to the image data ID. The metadata MD may include brightness information of the image data ID. When the content C is, for example, a video, the metadata MD may include at least one of brightness information of the entire content C, brightness information of each scene included in the content C, and brightness information of each frame included in the content C.
The linearization module 210 linearizes the image data ID received from the image receiving module 205 and analyzes the brightness of the linearized image. Specifically, when the metadata MD of the content C does not include the maximum brightness value L1max and the minimum brightness value L1min, the linearization module 210 may calculate the maximum brightness value L1max and the minimum brightness value L1min directly from the linearized original image.
The fourth mapping function generation module 234 receives the metadata MD from the image receiving module 205, and generates a fourth mapping function MF4 based on the received metadata MD. Here, the metadata MD may include the brightness information of each scene included in the content C, that is, the maximum brightness value L1max and minimum brightness value L1min of each scene.
In addition, the fourth mapping function MF4 may be defined between the maximum brightness value L1max and the minimum brightness value L1min of each scene. In other words, the maximum value input to the fourth mapping function MF4 is the maximum brightness value L1max of the corresponding scene, and the minimum value input to the fourth mapping function MF4 is the minimum brightness value L1min of the corresponding scene.
In this way, tone mapping may be performed on the original images I1 included in the corresponding scene using the fourth mapping function MF4, which is generated based on the brightness information of the scene. In other words, even if the frame changes, the fourth mapping function MF4 does not change as long as the scene does not change. However, even if the content C does not change, the fourth mapping function MF4 changes when the scene changes.
The fourth mapping function MF4 may vary according to a scene average brightness value Ave_scene, which indicates the average brightness value of the entire scene. In addition, the original image I1 may be divided into a low-brightness part and a high-brightness part, which may be mapped differently by the fourth mapping function MF4. In other words, the mapping function for the low-brightness part and the mapping function for the high-brightness part may differ from each other.
First, the fourth mapping function MF4 for the case in which the scene average brightness value Ave_scene is less than a fourth reference brightness value Th will be described.
The fourth mapping function MF4 may include a mapping function MF4-1 for the low-brightness part of the original image I1 and a mapping function MF4-2 for the high-brightness part of the original image I1.
In this case, the low-brightness part and the high-brightness part of the original image I1 may be divided based on the fourth reference brightness value Th. In addition, the fourth reference brightness value Th of the original image I1 may correspond to the target average brightness value Ave_target of the first image I2. In other words, the fourth reference brightness value Th is mapped to the target average brightness value Ave_target. The average brightness value refers to the average of the brightness values output from all pixels included in the display panel 143, and the target average brightness value Ave_target is the target value of the average brightness value. Such a target average brightness value Ave_target may be defined in advance according to the type and performance of the display panel 143. Specifically, the fourth reference brightness value Th of the original image I1 may be the same as the predetermined target average brightness value Ave_target of the first image I2.
As shown in FIG. 30, the low-brightness part, whose brightness values are less than the fourth reference brightness value Th, may be mapped linearly. Specifically, when the fourth reference brightness value Th is the same as the target average brightness value Ave_target, the brightness values of the low-brightness part of the original image I1 are the same as the brightness values of the low-brightness part of the first image I2.
Specifically, the low-brightness part may be mapped by Equation 2.
[Equation 2]
L2 = G1 × L1
(where L1 denotes the brightness value input to the fourth mapping function, L2 denotes the brightness value output from the fourth mapping function, and G1 is a constant.)
In Equation 2, the value of G1 may vary according to the fourth reference brightness value Th and the target average brightness value Ave_target. Specifically, G1 is determined such that the fourth reference brightness value Th is mapped to the target average brightness value Ave_target.
In particular, when the fourth reference brightness value Th is the same as the target average brightness value Ave_target, G1 has a value of 1.
As shown in FIG. 30, the high-brightness part, whose brightness values are equal to or greater than the fourth reference brightness value Th, may be mapped nonlinearly.
The high-brightness part may be mapped by Equation 3.
In this way, when the scene average brightness value Ave_scene is less than the fourth reference brightness value Th, the fourth mapping function MF4 generated by the fourth mapping function generation module 234 is Equation 2 for pixels of the original image I1 whose brightness values are less than the fourth reference brightness value Th, and Equation 3 for pixels whose brightness values are equal to or greater than the fourth reference brightness value Th.
Next, the fourth mapping function MF4 for the case in which the scene average brightness value Ave_scene is equal to or greater than the fourth reference brightness value Th will be described.
The fourth mapping function MF4 may include a mapping function MF4-1 for the low-brightness part of the original image I1 and a mapping function MF4-2 for the high-brightness part of the original image I1.
In this case, the low-brightness part and the high-brightness part of the original image I1 may be divided based on the scene average brightness value Ave_scene. In addition, the scene average brightness value Ave_scene of the original image I1 may correspond to the target average brightness value Ave_target of the first image I2. In other words, the scene average brightness value Ave_scene is mapped to the target average brightness value Ave_target. The average brightness value refers to the average of the brightness values output from all pixels included in the display panel 143, and the target average brightness value Ave_target is the target value of the average brightness value. Such a target average brightness value Ave_target may be defined in advance according to the type and performance of the display panel 143.
As shown in FIG. 31, the low-brightness part, whose brightness values are less than the scene average brightness value Ave_scene, may be mapped linearly.
Specifically, the low-brightness part may be mapped by Equation 5.
[Equation 5]
L2 = G2 × L1
(where L1 denotes the brightness value input to the fourth mapping function, L2 denotes the brightness value output from the fourth mapping function, and G2 is a constant.)
In Equation 5, the value of G2 may vary according to the scene average brightness value Ave_scene and the target average brightness value Ave_target. Specifically, G2 is determined such that the scene average brightness value Ave_scene is mapped to the target average brightness value Ave_target.
In particular, since the scene average brightness value changes when the scene changes, G2 may change whenever the scene changes.
As shown in FIG. 31, the high-brightness part, whose brightness values are equal to or greater than the scene average brightness value Ave_scene, may be mapped nonlinearly.
The high-brightness part may be mapped by Equation 6.
[Equation 6]
L2 = a[1 − (L1 − 1)^(2n)] + (1 − a)L1
(where L1 denotes the brightness value input to the fourth mapping function, L2 denotes the brightness value output from the fourth mapping function, Ave_target denotes the target average brightness value, Ave_scene denotes the scene average brightness value, and a and n are constants.)
In Equation 6, the value of n may be predetermined, and the value of a may be determined according to the maximum brightness value L1max and minimum brightness value L1min of the original images I1 included in each scene.
In this way, when the scene average brightness value Ave_scene is equal to or greater than the fourth reference brightness value Th, the fourth mapping function MF4 generated by the fourth mapping function generation module 234 is Equation 5 for pixels of the original image I1 whose brightness values are less than the scene average brightness value Ave_scene, and Equation 6 for pixels whose brightness values are equal to or greater than the scene average brightness value Ave_scene.
The tone mapping module 240 performs tone mapping on the original image I1 using the fourth mapping function MF4.
Specifically, the tone mapping module 240 inputs the brightness values of all pixels included in the original image I1 to the fourth mapping function MF4, and generates the first image I2 based on the brightness values output from the fourth mapping function MF4.
In this case, when the scene average brightness value Ave_scene is less than the fourth reference brightness value Th, pixels whose brightness values are less than the fourth reference brightness value Th may be mapped by Equation 2, and pixels whose brightness values are equal to or greater than the fourth reference brightness value Th may be mapped by Equation 3.
In addition, when the scene average brightness value Ave_scene is equal to or greater than the fourth reference brightness value Th, pixels whose brightness values are less than the scene average brightness value Ave_scene may be mapped by Equation 5, and pixels whose brightness values are equal to or greater than the scene average brightness value Ave_scene may be mapped by Equation 6.
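The scene-dependent branching just described can be sketched as a single function. This is an illustrative sketch under stated assumptions: luminances normalized to [0, 1], the nonlinear branch reusing the form of Equation 3 for Equation 6 (the patent text gives the same expression for both), and a and n as placeholder constants.

```python
import numpy as np

def fourth_mapping_function(l1, th, ave_scene, ave_target, a=0.6, n=2):
    """Scene-adaptive MF4: the split point is Th when Ave_scene < Th
    (Equations 2 and 3) and Ave_scene itself otherwise (Equations 5
    and 6); the gain G1 or G2 is chosen so the split point maps to
    Ave_target."""
    l1 = np.asarray(l1, dtype=np.float64)
    split = th if ave_scene < th else ave_scene
    gain = ave_target / split                  # G1 (Eq. 2) or G2 (Eq. 5)
    low = gain * l1                            # linear branch
    high = a * (1.0 - (l1 - 1.0) ** (2 * n)) + (1.0 - a) * l1  # nonlinear branch
    return np.where(l1 < split, low, high)
```

For a bright scene (Ave_scene ≥ Th) the gain drops below 1, pulling the scene's average down toward the display's target average, which is the stated purpose of G2.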
To provide the user with a more intuitive image, the detail enhancement module 250 processes the tone-mapped image I2. Here, detail enhancement may include various image processing techniques such as contrast enhancement, which maximizes the difference between the bright and dark areas of an image; histogram equalization, which adjusts the histogram so that an image with a narrow contrast distribution becomes an image with a uniform contrast distribution; image sharpening, which makes transitions in the image crisper; and image smoothing, which makes transitions in the image gentler.
Hereinafter, the operation of the display device 100 according to the embodiment will be described.
FIG. 32 shows another exemplary high dynamic range image display operation of the display device according to the embodiment.
As shown in FIG. 32, the high dynamic range image display operation (1400) of the display device 100 will now be described.
The display device 100 receives an image from the outside (1410). The display device 100 may receive content from the outside through the content receiving unit 130, and extract the image data ID and the metadata MD included in the received content.
The metadata MD is data including information about the image data ID, and may include brightness information in units of scenes or in units of frames. Specifically, the metadata MD may include the maximum brightness value, minimum brightness value, and average brightness value of the entire content C; the maximum brightness value, minimum brightness value, and average brightness value of the images included in each scene; or the maximum brightness value, minimum brightness value, and average brightness value of the image forming each frame.
After extracting the image data ID and the metadata MD, the display device 100 linearizes the received image data (1420). To obtain the original image I1, the display device 100 may linearly transform the image data.
Specifically, the image processing unit 200 of the display device 100 may use the second nonlinear mapping function F2 to restore the image data to the original image I1. Further, the image processing unit 200 may calculate the brightness information of the original image I1 based on the color values of each of the pixels included in the restored original image I1.
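The brightness computation mentioned above (deriving brightness information from each pixel's color values) can be sketched as follows. The description does not give a formula, so the Rec. 709 relative-luminance weights used here are an assumption, not the patented computation:

```python
import numpy as np

def relative_luminance(rgb_linear: np.ndarray) -> np.ndarray:
    """Per-pixel brightness from linearized RGB values.

    The Rec. 709 weights below are assumed for illustration; the
    description only states that brightness is computed from each
    pixel's color values after linearization."""
    weights = np.array([0.2126, 0.7152, 0.0722])
    return rgb_linear @ weights

# Two pixels: pure white and pure black, in linear light.
pixels = np.array([[1.0, 1.0, 1.0],
                   [0.0, 0.0, 0.0]])
luma = relative_luminance(pixels)
```

Computing brightness after linearization matters: the same weighted sum applied to gamma-encoded values would systematically underestimate the luminance of mixed colors.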
After linearizing the image, the display device 100 generates the fourth mapping function MF4 (1430). The display device 100 may generate the fourth mapping function MF4 based on the metadata MD. Specifically, the display device 100 may generate the fourth mapping function MF4 based on the maximum brightness value L1max, the minimum brightness value L1min, and the scene average brightness value Ave_scene of each scene.
Specifically, when the scene average brightness value Ave_scene is less than the fourth reference brightness value Th, the fourth mapping function MF4 is Equation 2 when the brightness value of a pixel included in the original image I1 is less than the fourth reference brightness value Th, or the fourth mapping function MF4 is Equation 3 when the brightness value of a pixel included in the original image I1 is equal to or greater than the fourth reference brightness value Th.
When the scene average brightness value Ave_scene is equal to or greater than the fourth reference brightness value Th, the fourth mapping function MF4 is Equation 5 when the brightness value of a pixel included in the original image I1 is less than the scene average brightness value Ave_scene, or the fourth mapping function MF4 is Equation 6 when the brightness value of a pixel included in the original image I1 is equal to or greater than the scene average brightness value Ave_scene.
After generating the fourth mapping function MF4, the display device 100 performs tone mapping on the original image I1 (1440).
Specifically, the display device 100 inputs the brightness values of all the pixels included in the original image I1 to the fourth mapping function MF4, and generates the first image I2 based on the brightness values output from the fourth mapping function MF4.
In this case, when the scene average brightness value Ave_scene is less than the fourth reference brightness value Th, pixels whose brightness values are less than the fourth reference brightness value Th may be mapped by Equation 2, and pixels whose brightness values are equal to or greater than the fourth reference brightness value Th may be mapped by Equation 3.
In addition, when the scene average brightness value Ave_scene is equal to or greater than the fourth reference brightness value Th, pixels whose brightness values are less than the scene average brightness value Ave_scene may be mapped by Equation 5, and pixels whose brightness values are equal to or greater than the scene average brightness value Ave_scene may be mapped by Equation 6.
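The four-way selection described in the two paragraphs above can be sketched as control flow. Equations 2, 3, 5, and 6 are not reproduced in this excerpt, so placeholder callables stand in for them; only the branching mirrors the description:

```python
def select_mf4(pixel_luma, ave_scene, th, eq2, eq3, eq5, eq6):
    """Pick the equation that maps one pixel, following the text:
    first compare the scene average Ave_scene with the reference
    brightness Th, then compare the pixel with Th (or with Ave_scene)."""
    if ave_scene < th:
        # Dark scene: split pixels at the reference brightness Th.
        return eq2(pixel_luma) if pixel_luma < th else eq3(pixel_luma)
    # Bright scene: split pixels at the scene average Ave_scene.
    return eq5(pixel_luma) if pixel_luma < ave_scene else eq6(pixel_luma)

# Placeholder equations: each one just reports which branch ran.
eqs = {name: (lambda x, n=name: n) for n_ in () for name in ()} or {
    name: (lambda x, n=name: n) for name in ("eq2", "eq3", "eq5", "eq6")
}
branch = select_mf4(0.2, ave_scene=0.3, th=0.5, **eqs)  # dark scene, dark pixel
```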
After performing the tone mapping, the display device 100 performs detail enhancement on the first image I2 (1450). To display the first image I2 more naturally, the display device 100 may perform image processing such as contrast enhancement on the first image I2.
Specifically, the image processing unit 200 of the display device 100 may perform detail enhancement, such as contrast enhancement, on the first image I2, thereby generating the second image I3.
After performing the detail enhancement, the display device 100 displays the image (1460). The display device 100 may display the second image I3 through the display unit 140.
In this way, the display device 100 may perform linear tone mapping on the low-brightness area of the original image I1 and perform nonlinear tone mapping on the high-brightness area.
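A minimal sketch of such a piecewise curve follows: linear below a reference brightness Th, nonlinear above it. The logarithmic knee used for the nonlinear segment is purely illustrative (the actual curves are Equations 2 through 6, not shown in this excerpt), but it demonstrates the continuity at Th that a piecewise tone map needs to avoid banding at the boundary:

```python
import math

def piecewise_tone_map(l_in, th, l_in_max, l_out_max):
    """Map an input brightness into the display range: identity below
    the reference brightness Th (preserving dark detail), and a
    logarithmic compression of [th, l_in_max] into [th, l_out_max]
    above it. The curve is continuous at Th."""
    if l_in < th:
        return l_in  # linear segment: dark pixels pass through unchanged
    num = math.log1p(l_in - th)
    den = math.log1p(l_in_max - th)
    return th + (l_out_max - th) * num / den

# Scene brightness up to 10.0 compressed into a display range of 1.0.
dark = piecewise_tone_map(0.1, th=0.5, l_in_max=10.0, l_out_max=1.0)
peak = piecewise_tone_map(10.0, th=0.5, l_in_max=10.0, l_out_max=1.0)
```

The design intent matches the description: dark-area brightness is preserved exactly, while the high-brightness range is squeezed smoothly so that highlight detail is compressed rather than clipped.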
Although a few embodiments have been shown and described, it will be understood by those skilled in the art that changes may be made to these embodiments without departing from the principles and spirit of the embodiments, the scope of which is defined in the claims and their equivalents.

Claims (25)

1. A display device comprising:
a content receiving unit configured to receive a high dynamic range image and brightness information of the high dynamic range image, wherein the high dynamic range image is an input image having an image dynamic range greater than a display dynamic range of the display device;
an image processing unit configured to perform tone mapping based on the brightness information to convert the high dynamic range image into a low dynamic range image; and
a display unit configured to display the low dynamic range image,
wherein the brightness information comprises an image maximum brightness value and an image minimum brightness value of the high dynamic range image.
2. The display device according to claim 1, wherein the brightness information comprises a scene maximum brightness value and a scene minimum brightness value of the high dynamic range image included in a scene.
3. The display device according to claim 1, wherein the brightness information comprises a frame maximum brightness value and a frame minimum brightness value of the high dynamic range image included in a frame.
4. The display device according to claim 1, wherein the brightness information comprises a content maximum brightness value and a content minimum brightness value of the high dynamic range image included in entire content.
5. The display device according to claim 1, wherein the image processing unit detects, in the high dynamic range image, a first area whose brightness value is equal to or greater than a reference brightness value, and performs tone mapping on an area image of the first area based on characteristic information of the area image, wherein the characteristic information comprises at least one of edge information, texture information, and gray scale information.
6. The display device according to claim 5, wherein the image processing unit detects an edge area within the area image of the first area, and generates a first mapping function based on a histogram of pixels included in the edge area.
7. The display device according to claim 6, wherein the first mapping function has a slope that varies according to the number of pixels included in the edge area.
8. The display device according to claim 7, wherein, in the first mapping function, a slope at a brightness value for which the number of pixels included in the edge area is a first value is greater than a slope at a brightness value for which the number of pixels included in the edge area is a second value, the second value being less than the first value.
9. The display device according to claim 6, wherein the first mapping function is a cumulative histogram obtained by integrating the histogram of the pixels included in the edge area.
10. The display device according to claim 5, wherein the image processing unit detects a texture area within the area image of the first area, and generates the first mapping function based on a histogram of pixels included in the texture area.
11. The display device according to claim 5, wherein the image processing unit detects a gray area within the area image of the first area, and generates the first mapping function based on a histogram of pixels included in the gray area.
12. The display device according to claim 5, wherein the image processing unit generates a second mapping function based on brightness values of the high dynamic range image.
13. The display device according to claim 12, wherein the image processing unit performs second tone mapping on the high dynamic range image according to the second mapping function to generate a tone-mapped image, and performs first tone mapping, according to the first mapping function, on the tone-mapped image on which the second tone mapping has been performed.
14. The display device according to claim 5, wherein the image processing unit generates a second mapping function based on brightness values of a second area in the high dynamic range image, wherein the brightness values of the second area are less than the reference brightness value.
15. The display device according to claim 14, wherein the image processing unit generates a tone mapping function based on the first mapping function and the second mapping function, and converts the high dynamic range image into the low dynamic range image according to the tone mapping function.
16. The display device according to claim 1, wherein the image processing unit performs linear tone mapping on a first pixel among a plurality of pixels included in the high dynamic range image, and performs nonlinear tone mapping on a second pixel among the plurality of pixels, wherein a brightness value of the first pixel is less than a reference brightness value, and a brightness value of the second pixel is equal to or greater than the reference brightness value.
17. The display device according to claim 1, wherein, when a scene average brightness value of the high dynamic range image included in a scene is less than a reference brightness value, the image processing unit performs linear tone mapping on a first pixel among a plurality of pixels included in the high dynamic range image, and performs nonlinear tone mapping on a second pixel among the plurality of pixels, wherein a brightness value of the first pixel is less than the reference brightness value, and a brightness value of the second pixel is equal to or greater than the reference brightness value.
18. The display device according to claim 1, wherein, when a scene average brightness value of the high dynamic range image included in a scene is equal to or greater than a reference brightness value, the image processing unit performs linear tone mapping on a first pixel among a plurality of pixels included in the high dynamic range image, and performs nonlinear tone mapping on a second pixel among the plurality of pixels, wherein a brightness value of the first pixel is less than the scene average brightness value, and a brightness value of the second pixel is equal to or greater than the scene average brightness value.
19. A method of controlling a display device, the method comprising:
receiving a high dynamic range image and brightness information of the high dynamic range image, wherein the high dynamic range image is an input image having an image dynamic range greater than a display dynamic range of the display device;
performing tone mapping based on the brightness information to convert the high dynamic range image into a low dynamic range image; and
displaying the low dynamic range image,
wherein the brightness information comprises an image maximum brightness value and an image minimum brightness value of the high dynamic range image.
20. The method according to claim 19, wherein the brightness information comprises a scene maximum brightness value and a scene minimum brightness value of the high dynamic range image included in a scene.
21. The method according to claim 19, wherein the brightness information comprises a frame maximum brightness value and a frame minimum brightness value of the high dynamic range image forming a frame.
22. The method according to claim 19, wherein the brightness information comprises a content maximum brightness value and a content minimum brightness value of the high dynamic range image included in entire content.
23. The method according to claim 19, wherein performing the tone mapping comprises:
detecting, in the high dynamic range image, a first area whose brightness value is equal to or greater than a reference brightness value;
generating a tone mapping function based on characteristic information of an area image of the first area; and
performing tone mapping on the high dynamic range image according to the tone mapping function to convert the high dynamic range image into the low dynamic range image,
wherein the characteristic information comprises at least one of edge information, texture information, and gray scale information of the high dynamic range image.
24. A method comprising:
determining a first area of an image, wherein the first area has a higher brightness than a second area of the image;
determining a first mapping function and a second mapping function corresponding to the first area and the second area, respectively, wherein the first mapping function enhances one or more image characteristics and the second mapping function increases lightness; and
mapping the image responsive to brightness, using the first mapping function and the second mapping function, to maintain the lightness of the image of the second area and to maintain the characteristic information of the image of the first area.
25. The method according to claim 24, further comprising:
combining the first mapping function and the second mapping function to generate a combined mapping function,
wherein mapping the image comprises mapping the image using the combined mapping function responsive to a brightness of the image.
CN201580054198.4A 2014-10-06 2015-10-01 Display apparatus and method of controlling the same Active CN106796775B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911350815.1A CN110992914B (en) 2014-10-06 2015-10-01 Display apparatus and method of controlling the same

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
KR20140134135 2014-10-06
KR10-2014-0134135 2014-10-06
KR10-2015-0024271 2015-02-17
KR1020150024271A KR102308507B1 (en) 2014-10-06 2015-02-17 Display and controlling method thereof
PCT/KR2015/010387 WO2016056787A1 (en) 2014-10-06 2015-10-01 Display device and method of controlling the same

Related Child Applications (1)

Application Number Title Priority Date Filing Date
CN201911350815.1A Division CN110992914B (en) 2014-10-06 2015-10-01 Display apparatus and method of controlling the same

Publications (2)

Publication Number Publication Date
CN106796775A true CN106796775A (en) 2017-05-31
CN106796775B CN106796775B (en) 2020-01-17

Family

ID=55801793

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201580054198.4A Active CN106796775B (en) 2014-10-06 2015-10-01 Display apparatus and method of controlling the same

Country Status (2)

Country Link
KR (1) KR102308507B1 (en)
CN (1) CN106796775B (en)

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108024104A (en) * 2017-12-12 2018-05-11 上海顺久电子科技有限公司 The method and display device that a kind of high dynamic range images to input are handled
CN108090879A (en) * 2017-12-12 2018-05-29 上海顺久电子科技有限公司 The method and display device that a kind of high dynamic range images to input are handled
CN108109180A (en) * 2017-12-12 2018-06-01 上海顺久电子科技有限公司 The method and display device that a kind of high dynamic range images to input are handled
CN108769804A (en) * 2018-04-25 2018-11-06 杭州当虹科技股份有限公司 A kind of format conversion method of high dynamic range video
CN110225253A (en) * 2019-06-12 2019-09-10 杨勇 A kind of high dynamic range images processing method and system
CN110246470A (en) * 2018-03-08 2019-09-17 三星显示有限公司 Execute the method for image adaptive tone mapping and the display device using this method
CN111754946A (en) * 2020-07-03 2020-10-09 深圳Tcl新技术有限公司 Image quality optimization method, display device and computer readable storage medium
CN114051126A (en) * 2021-12-06 2022-02-15 北京达佳互联信息技术有限公司 Video processing method and video processing device
US11676547B2 (en) 2017-07-07 2023-06-13 Semiconductor Energy Laboratory Co., Ltd. Display system and operation method of the display system

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102582643B1 (en) * 2016-09-19 2023-10-05 삼성전자주식회사 Display apparatus and method for processing image thereof
CN109923580B (en) * 2016-11-04 2023-12-15 特利丹菲力尔有限责任公司 Dynamic range compression for thermal video
US10735688B2 (en) 2017-07-13 2020-08-04 Samsung Electronics Co., Ltd. Electronics apparatus, display apparatus and control method thereof
KR102553764B1 (en) * 2017-07-13 2023-07-10 삼성전자주식회사 Electronics apparatus, Display apparatus and contorl method thereof
KR102525546B1 (en) * 2017-11-21 2023-04-26 삼성디스플레이 주식회사 Image processing method and image processor performing the same
KR102550846B1 (en) * 2018-03-06 2023-07-05 삼성디스플레이 주식회사 Method of performing an image-adaptive tone mapping and display device employing the same
KR20200101048A (en) * 2019-02-19 2020-08-27 삼성전자주식회사 An electronic device for processing image and image processing method thereof
KR102661824B1 (en) * 2019-03-26 2024-04-26 엘지전자 주식회사 Signal processing device and image display apparatus including the same
US11849232B2 (en) 2020-11-03 2023-12-19 Samsung Electronics Co., Ltd. Integrated image sensor with internal feedback and operation method thereof
KR102375369B1 (en) * 2020-11-26 2022-03-18 엘지전자 주식회사 Apparatus and method for tone mapping

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050117799A1 (en) * 2003-12-01 2005-06-02 Chiou-Shann Fuh Method and apparatus for transforming a high dynamic range image into a low dynamic range image
US20080297460A1 (en) * 2007-05-31 2008-12-04 Peng Huajun Method of displaying a low dynamic range image in a high dynamic range
KR20100081886A (en) * 2009-01-07 2010-07-15 한양대학교 산학협력단 Adaptive tone mapping apparatus and method, and image processing system using the method
KR20110088050A (en) * 2010-01-28 2011-08-03 엘지전자 주식회사 Apparatus and mehtod for image qualtiy improving in image display device
US20120170843A1 (en) * 2011-01-05 2012-07-05 Peng Lin Methods for performing local tone mapping

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20040048790A (en) * 2002-12-03 2004-06-10 삼성전자주식회사 Apparatus and Method for control luminance

Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11676547B2 (en) 2017-07-07 2023-06-13 Semiconductor Energy Laboratory Co., Ltd. Display system and operation method of the display system
CN108090879A (en) * 2017-12-12 2018-05-29 上海顺久电子科技有限公司 The method and display device that a kind of high dynamic range images to input are handled
CN108109180A (en) * 2017-12-12 2018-06-01 上海顺久电子科技有限公司 The method and display device that a kind of high dynamic range images to input are handled
CN108024104A (en) * 2017-12-12 2018-05-11 上海顺久电子科技有限公司 The method and display device that a kind of high dynamic range images to input are handled
CN108109180B (en) * 2017-12-12 2020-10-02 上海顺久电子科技有限公司 Method for processing input high dynamic range image and display equipment
CN110246470B (en) * 2018-03-08 2024-03-19 三星显示有限公司 Method for performing image adaptive tone mapping and display apparatus employing the same
CN110246470A (en) * 2018-03-08 2019-09-17 三星显示有限公司 Execute the method for image adaptive tone mapping and the display device using this method
CN108769804A (en) * 2018-04-25 2018-11-06 杭州当虹科技股份有限公司 A kind of format conversion method of high dynamic range video
CN110225253A (en) * 2019-06-12 2019-09-10 杨勇 A kind of high dynamic range images processing method and system
CN111754946B (en) * 2020-07-03 2022-05-06 深圳Tcl新技术有限公司 Image quality optimization method, display device and computer readable storage medium
CN111754946A (en) * 2020-07-03 2020-10-09 深圳Tcl新技术有限公司 Image quality optimization method, display device and computer readable storage medium
CN114051126A (en) * 2021-12-06 2022-02-15 北京达佳互联信息技术有限公司 Video processing method and video processing device
CN114051126B (en) * 2021-12-06 2023-12-19 北京达佳互联信息技术有限公司 Video processing method and video processing device

Also Published As

Publication number Publication date
KR102308507B1 (en) 2021-10-06
CN106796775B (en) 2020-01-17
KR20160040981A (en) 2016-04-15

Similar Documents

Publication Publication Date Title
CN106796775A (en) Display device and the method for controlling the display device
US11721294B2 (en) Display device and method of controlling the same
CN108141508B (en) Imaging device and method for generating light in front of display panel of imaging device
US10951875B2 (en) Display processing circuitry
CN101185113B (en) Double displays device
US9666113B2 (en) Display, image processing unit, and display method for improving image quality
US10832636B2 (en) Image processing apparatus, image processing method, and program
CN103747225B (en) Based on the high dynamic range images double-screen display method of color space conversion
CN109983530A (en) Ambient light adaptive display management
KR102590142B1 (en) Display apparatus and control method thereof
US20090010538A1 (en) Apparatus and method for automatically computing gamma correction curve
KR102176398B1 (en) A image processing device and a image processing method
US10366673B2 (en) Display device and image processing method thereof
KR102086163B1 (en) Pixel look-up table controller and method for real-time correction of led display information
US10462337B2 (en) Liquid crystal display device and image processing method for same
JP4523368B2 (en) Stereoscopic image generation apparatus and program
US11257443B2 (en) Method for processing image, and display device
US20190082138A1 (en) Inverse tone-mapping to a virtual display
KR20150057405A (en) Display driving device and display device including the same
JP5100873B1 (en) Crosstalk correction amount evaluation apparatus and crosstalk correction amount evaluation method
CN101290763B (en) Image brightness dynamically regulating method
US20200251069A1 (en) Color image display adaptation to ambient light
US20240161706A1 (en) Display management with position-varying adaptivity to ambient light and/or non-display-originating surface light
CN109685859B (en) Three-dimensional color automatic adjustment method based on 3D lookup table
Kim et al. Normalized tone-mapping operators for color quality improvement in 3DTV

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant