CN101212545A - Method and device for mapping high-dynamic range graphics to low-dynamic range graphics - Google Patents


Info

Publication number
CN101212545A
Authority
CN
China
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CNA2007100509558A
Other languages
Chinese (zh)
Inventor
陈雷霆
何明耘
蔡洪斌
邱航
房春兰
何晓曦
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
University of Electronic Science and Technology of China
Original Assignee
University of Electronic Science and Technology of China
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by University of Electronic Science and Technology of China
Priority to CNA2007100509558A
Publication of CN101212545A
Legal status: Pending


Landscapes

  • Image Generation (AREA)

Abstract

The invention discloses a method and device for mapping graphics from a high dynamic range to a low dynamic range, relating to graphics processing technology. It provides a method and device for compressing and mapping a high dynamic range to a low dynamic range that effectively guarantee real-time performance while preserving more detail. The method comprises the following steps: first, the RGB value of each pixel is read from the high-dynamic-range graphic texture; the RGB values are converted to actual luminance values; the actual luminance values are then converted to the average luminance value; the luminance of each pixel is scaled according to the brightness desired after mapping; finally, the mapped luminance of each pixel is computed by the formula L_d(x, y) = {L(x, y)[1 + L(x, y)/L²_white] + σ}/[1 + L(x, y)]. The invention can compress a high-dynamic-range graphic efficiently while preserving its detail and contrast, mapping it to a low dynamic range suited to existing conventional low-dynamic-range display devices. It effectively guarantees real-time performance, is easy to implement, and is highly extensible, meeting the requirements of three-dimensional scenes (such as 3D games) and virtual reality.

Description

Method and device for mapping high-dynamic-range graphics to a low dynamic range
Technical field
The present invention relates to graphics processing technology, and in particular to a method and device for compressing and mapping high-dynamic-range graphics to a low dynamic range, so as to suit existing conventional display devices.
Background technology
In recent years, high dynamic range (HDR) technology has found increasing application in fields such as computer graphics and virtual reality. HDR contrasts with traditional LDR (low dynamic range), which uses 8-bit texture formats (24 or 32 bits of color per pixel). The HDR model represents the RGB and luminance of each pixel with actual physical quantities or linear functions; values are no longer limited to integers and can cover a much wider range with higher precision.
The ratio between the maximum and minimum intensity of an image is called its dynamic range. Through the automatic adjustment of the pupil, the human eye can distinguish a dynamic range of about 100,000,000:1, from bright daylight down to starlight; even within a single adapted scene, without readjustment, it can still distinguish a luminance range of about 10,000:1. The luminance dynamic range that conventional display devices can reproduce, however, is only about 100:1.
HDR images are abundant in real life, so the dynamic range of an image must be scaled in some way to match conventional display devices that can only output LDR. This process is called tone reproduction or tone mapping: it scales or maps the luminance values of a real scene into the range a display device can show. Besides compressing the luminance range, a reconstruction method must also preserve the perceptual quality of the original image, i.e., its contrast, brightness, and detail.
Dynamic-range reconstruction methods fall roughly into two classes. The first is spatially uniform, i.e., global dynamic-range compression. These methods apply the same transfer curve to every pixel; the curve may be specified in advance or derived from the image content. Representative of this class is the histogram-adjustment technique of Ward Larson et al. (A visibility matching tone reproduction operator for high dynamic range scenes). Its shortcoming is that a single transfer curve cannot adapt to every region of the image, so the result loses detail, color, and brightness.
The second class is spatially varying, i.e., local dynamic-range compression. These methods apply different transforms to different regions of the image. Based on various models of the human visual system (HVS), they all aim to preserve some aspect of image quality while compressing dynamic range. Until the late 1990s these methods adjusted different HVS models on multi-layer representations, but because the filter functions applied to the low-frequency image behaved poorly, severe halo artifacts at object edges plagued this class of methods for years. In 1999, Tumblin et al. proposed the LCIS method (LCIS: a boundary hierarchy for detail-preserving contrast reduction), which improved result quality by distinguishing different levels of image detail, but this made the method very inefficient and too slow.
Object of the invention
The object of the present invention is to overcome the above shortcomings of the prior art by providing a method and device for compressing and mapping a high dynamic range to a low dynamic range that effectively guarantee real-time performance with high efficiency, preserve the detail, color, and brightness of the graphic, and are easy to implement and highly extensible.
The object of the invention is achieved through the following technical solution:
A method for mapping a high-dynamic-range graphic to a low dynamic range, comprising the steps:
a. First, read the RGB value of each pixel from the high-dynamic-range graphic texture;
b. Convert the RGB values to actual luminance values;
c. Then convert the actual luminance values to the average luminance value using the formula

\bar{L}_w = \exp\left[ \frac{1}{N} \sum_{x,y} \log\big(\delta + L_w(x,y)\big) \right]

where L_w(x, y) is the actual luminance value of the pixel;
x, y are the two-dimensional coordinates of each pixel;
\bar{L}_w is the average luminance value;
N is the number of texture pixels;
δ is a very small constant, which may be taken as 0.0001;
d. Scale the luminance value of each pixel according to the brightness desired after mapping, using

L(x, y) = \frac{\alpha}{\bar{L}_w} L_w(x, y)

where α takes values between 0 and 1: a lower α is chosen for a darker graphic, and a higher α otherwise;
e. Finally, compute the mapped luminance value of each pixel with

L_d(x, y) = \frac{L(x, y)\left[1 + L(x, y)/L_{white}^2\right] + \sigma}{1 + L(x, y)}

where L_d(x, y) is the mapped luminance value of the pixel;
L_{white} is the maximum luminance value;
σ is a user-defined value: when the graphic does not need fully black regions, σ can be set to a nonzero value such as 0.005.
In step b, the RGB values are converted to actual luminance values by the formula:
L_w = 0.27R + 0.67G + 0.06B
In step d, the generally recommended value of α is 0.18.
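Steps a through e amount to a global tone-mapping operator and can be sketched on the CPU in a few lines of NumPy. This is a minimal illustration of the formulas above, not the patented GPU implementation; the function name `tone_map` and the default choice of taking L_white as the scene maximum are assumptions made for the sketch.

```python
import numpy as np

def tone_map(rgb, alpha=0.18, sigma=0.005, L_white=None, delta=1e-4):
    """Sketch of steps a-e on a linear HDR image.

    rgb     : float array of shape (H, W, 3), linear HDR values (step a)
    alpha   : scaling key; 0.18 is the generally recommended value
    sigma   : offset so the result need not reach fully black
    L_white : smallest luminance mapped to pure white; None -> scene maximum
    delta   : small constant guarding the logarithm
    Returns the mapped luminance L_d of shape (H, W).
    """
    # step b: actual luminance from RGB
    Lw = 0.27 * rgb[..., 0] + 0.67 * rgb[..., 1] + 0.06 * rgb[..., 2]
    # step c: log-average luminance over all N pixels
    L_avg = np.exp(np.mean(np.log(delta + Lw)))
    # step d: scale each pixel by alpha over the log-average
    L = alpha * Lw / L_avg
    if L_white is None:
        L_white = L.max()
    # step e: final mapping into the displayable range
    Ld = (L * (1.0 + L / (L_white ** 2)) + sigma) / (1.0 + L)
    return Ld
```

With the default L_white equal to the scene maximum, the brightest pixel lands at approximately 1, and darker pixels compress smoothly below it.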
In each of the above technical solutions, in step c, the luminance value of each pixel is read from the frame buffer and the computation is carried out in two passes: in the first pass, a small render target is created and a rectangle covering the whole render target is drawn, and for each pixel of the target buffer a pixel shader program accumulates the luminance values from the source buffer and stores the result; in the second pass, the target buffer of the first pass is accumulated and the final computation is performed, the generated texture then containing \bar{L}_w.
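The two-pass accumulation can be mimicked on the CPU: a first pass sums log-luminance over tiles into a small buffer (playing the role of the small render target), and a second pass accumulates that buffer and applies the final division and exponential. A hedged NumPy sketch, where the function name and the 4×4 tile size are illustrative choices rather than part of the patent:

```python
import numpy as np

def log_average_two_pass(Lw, delta=1e-4, block=4):
    """Two-pass computation of the log-average luminance of step c.

    Pass 1 accumulates the log-luminance of each block x block tile of
    the source buffer into one pixel of a small target; pass 2 sums the
    small target and finishes the formula with the division by N and the
    exponential.  Assumes the image dimensions are multiples of `block`.
    """
    H, W = Lw.shape
    logs = np.log(delta + Lw)
    # pass 1: accumulate block x block tiles into a small render target
    small = logs.reshape(H // block, block, W // block, block).sum(axis=(1, 3))
    # pass 2: accumulate the small target and apply 1/N and exp
    return np.exp(small.sum() / (H * W))
```

Because summation is associative, the tiled result equals the direct log-average up to floating-point rounding; on a GPU the same structure lets the expensive full-image reduction run in a pixel shader over a much smaller target.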
A device for mapping a high-dynamic-range graphic to a low dynamic range, comprising:
a pixel RGB value reader, for reading the RGB value of each pixel from the high-dynamic-range graphic texture;
an RGB-to-luminance converter, for converting the RGB values to actual luminance values;
a luminance-average converter, for converting the actual luminance values to the average luminance value by the formula

\bar{L}_w = \exp\left[ \frac{1}{N} \sum_{x,y} \log\big(\delta + L_w(x,y)\big) \right]

where L_w(x, y) is the actual luminance value of the pixel;
x, y are the two-dimensional coordinates of each pixel;
\bar{L}_w is the average luminance value;
N is the number of texture pixels;
δ is a very small constant, which may be taken as 0.0001;
a pixel luminance scaler, for scaling the luminance value of each pixel according to the brightness desired after mapping, using

L(x, y) = \frac{\alpha}{\bar{L}_w} L_w(x, y)

where α takes values between 0 and 1: a lower α is chosen for a darker graphic, and a higher α otherwise;
a pixel luminance mapper, for obtaining the mapped luminance value of each pixel with

L_d(x, y) = \frac{L(x, y)\left[1 + L(x, y)/L_{white}^2\right] + \sigma}{1 + L(x, y)}

where L_d(x, y) is the mapped luminance value of the pixel;
L_{white} is the maximum luminance value;
σ is a user-defined value: when the graphic does not need fully black regions, σ can be set to a nonzero value such as 0.005.
The RGB-to-luminance converter includes a calculator that converts the RGB values to actual luminance values by the formula:
L_w = 0.27R + 0.67G + 0.06B
In the pixel luminance scaler, α is generally recommended to be 0.18.
The luminance-average converter comprises a frame buffer and two pipelines. The frame buffer holds the luminance value of each pixel. The first pipeline creates a small render target block, draws a rectangle covering the whole render target block, and contains a luminance accumulation register: for each pixel of the target buffer, a pixel shader program accumulates the luminance values from the source buffer and stores the result. The second pipeline includes a superposition calculator that accumulates the target buffer of the first pipeline and performs the final computation; the generated texture then contains \bar{L}_w.
With the above method or device, the present invention can compress a high-dynamic-range graphic efficiently while preserving its detail and contrast, mapping it to a low dynamic range suited to existing conventional low-dynamic-range display devices. It effectively guarantees real-time performance, is easy to implement and highly extensible, and meets the requirements of three-dimensional scenes (such as 3D games) and virtual reality.
Embodiments
The present invention is further illustrated below with reference to specific embodiments.
Embodiment 1: a method for mapping a high-dynamic-range graphic to a low dynamic range, comprising the steps:
a. First, read the RGB value of each pixel from the high-dynamic-range graphic texture;
b. Convert the RGB values to actual luminance values by the formula:
L_w = 0.27R + 0.67G + 0.06B;
c. Then convert the actual luminance values to the average luminance value using the formula

\bar{L}_w = \exp\left[ \frac{1}{N} \sum_{x,y} \log\big(\delta + L_w(x,y)\big) \right]

where L_w(x, y) is the actual luminance value of the pixel;
x, y are the two-dimensional coordinates of each pixel;
\bar{L}_w is the average luminance value;
N is the number of texture pixels;
δ = 0.0001;
d. Scale the luminance value of each pixel according to the brightness desired after mapping, using

L(x, y) = \frac{\alpha}{\bar{L}_w} L_w(x, y)

where α takes values between 0 and 1: a lower α is chosen for a darker graphic, and a higher α otherwise; in this embodiment α = 0.18;
e. Finally, compute the mapped luminance value of each pixel with

L_d(x, y) = \frac{L(x, y)\left[1 + L(x, y)/L_{white}^2\right] + \sigma}{1 + L(x, y)}

where L_d(x, y) is the mapped luminance value of the pixel, i.e., the displayable value on a conventional low-dynamic-range display corresponding to the high-dynamic luminance;
L_{white} is the maximum luminance value, itself an HDR value: every pixel whose luminance exceeds L_{white} is mapped to pure white, and L_{white} can be set to infinity so that every generated luminance maps to a displayable value;
σ is a user-defined value: when the graphic does not need fully black regions, σ can be set to a nonzero value; in this embodiment σ = 0.005.
In this embodiment, step c is carried out in two passes: in the first pass, a small render target is created and a rectangle covering the whole render target is drawn, and for each pixel of the target buffer a pixel shader program accumulates the luminance values from the source buffer and stores the result; in the second pass, the target buffer of the first pass is accumulated and the final computation is performed, the generated texture then containing \bar{L}_w.
In this embodiment, the frame rates measured before and after applying the method are compared in the following table:

                     Without this method   With this method
Current frame rate   136.727 fps           77.8443 fps
Average frame rate   136.938 fps           77.8844 fps
Worst frame rate     122.388 fps           75.0247 fps
Best frame rate      137.177 fps           137.117 fps
Embodiment 2: the basic steps are as in Embodiment 1, with α = 0.42.
Embodiment 3: the basic steps are as in Embodiment 1, with α = 0.56.
Embodiment 4: the basic steps are as in Embodiment 1, with α = 0.78.
Embodiment 5: the basic steps are as in Embodiment 1, with α = 0.88.
Embodiment 6: a device for mapping a high-dynamic-range graphic to a low dynamic range, comprising:
a pixel RGB value reader, for reading the RGB value of each pixel from the high-dynamic-range graphic texture;
an RGB-to-luminance converter, for converting the RGB values to actual luminance values by the formula:
L_w = 0.27R + 0.67G + 0.06B;
a luminance-average converter, for converting the actual luminance values to the average luminance value by the formula

\bar{L}_w = \exp\left[ \frac{1}{N} \sum_{x,y} \log\big(\delta + L_w(x,y)\big) \right]

where L_w(x, y) is the actual luminance value of the pixel;
x, y are the two-dimensional coordinates of each pixel;
\bar{L}_w is the average luminance value;
N is the number of texture pixels;
δ = 0.0001;
a pixel luminance scaler, for scaling the luminance value of each pixel according to the brightness desired after mapping, using

L(x, y) = \frac{\alpha}{\bar{L}_w} L_w(x, y)

where α takes values between 0 and 1: a lower α is chosen for a darker graphic, and a higher α otherwise; in this embodiment α = 0.18;
a pixel luminance mapper, for obtaining the mapped luminance value of each pixel with

L_d(x, y) = \frac{L(x, y)\left[1 + L(x, y)/L_{white}^2\right] + \sigma}{1 + L(x, y)}

where L_d(x, y) is the mapped luminance value of the pixel, i.e., the displayable value on a conventional low-dynamic-range display corresponding to the high-dynamic luminance;
L_{white} is the maximum luminance value, itself an HDR value: every pixel whose luminance exceeds L_{white} is mapped to pure white, and L_{white} can be set to infinity so that every generated luminance maps to a displayable value;
σ is a user-defined value: when the graphic does not need fully black regions, σ can be set to a nonzero value; in this embodiment σ = 0.005.
In this embodiment, the luminance-average converter comprises a frame buffer and two pipelines: the frame buffer holds the luminance value of each pixel; the first pipeline creates a small render target block, draws a rectangle covering the whole render target block, and contains a luminance accumulation register, a pixel shader program accumulating, for each pixel of the target buffer, the luminance values from the source buffer and storing the result; the second pipeline includes a superposition calculator that accumulates the target buffer of the first pipeline and performs the final computation, the generated texture then containing \bar{L}_w.
Embodiment 7: the basic structure is as in Embodiment 6, with α = 0.42.
Embodiment 8: the basic structure is as in Embodiment 6, with α = 0.56.
Embodiment 9: the basic structure is as in Embodiment 6, with α = 0.78.
Embodiment 10: the basic structure is as in Embodiment 6, with α = 0.88.

Claims (8)

1. A method for mapping a high-dynamic-range graphic to a low dynamic range, characterized by comprising the steps:
a. First, read the RGB value of each pixel from the high-dynamic-range graphic texture;
b. Convert the RGB values to actual luminance values;
c. Then convert the actual luminance values to the average luminance value using the formula

\bar{L}_w = \exp\left[ \frac{1}{N} \sum_{x,y} \log\big(\delta + L_w(x,y)\big) \right]

where L_w(x, y) is the actual luminance value of the pixel;
x, y are the two-dimensional coordinates of each pixel;
\bar{L}_w is the average luminance value;
N is the number of texture pixels;
δ is a very small constant, which may be taken as 0.0001;
d. Scale the luminance value of each pixel according to the brightness desired after mapping, using

L(x, y) = \frac{\alpha}{\bar{L}_w} L_w(x, y)

where α takes values between 0 and 1: a lower α is chosen for a darker graphic, and a higher α otherwise;
e. Finally, compute the mapped luminance value of each pixel with

L_d(x, y) = \frac{L(x, y)\left[1 + L(x, y)/L_{white}^2\right] + \sigma}{1 + L(x, y)}

where L_d(x, y) is the mapped luminance value of the pixel;
L_{white} is the maximum luminance value;
σ is a user-defined value: when the graphic does not need fully black regions, σ can be set to a nonzero value such as 0.005.
2. The method for mapping a high-dynamic-range graphic to a low dynamic range according to claim 1, characterized in that in step b the RGB values are converted to actual luminance values by the formula:
L_w = 0.27R + 0.67G + 0.06B.
3. The method for mapping a high-dynamic-range graphic to a low dynamic range according to claim 2, characterized in that in step d the value of α is 0.18.
4. The method for mapping a high-dynamic-range graphic to a low dynamic range according to claim 1, 2, or 3, characterized in that in step c the luminance value of each pixel is read from the frame buffer and the computation is carried out in two passes: in the first pass, a small render target is created and a rectangle covering the whole render target is drawn, and for each pixel of the target buffer a pixel shader program accumulates the luminance values from the source buffer and stores the result; in the second pass, the target buffer of the first pass is accumulated and the final computation is performed, the generated texture then containing \bar{L}_w.
5. A device for mapping a high-dynamic-range graphic to a low dynamic range, characterized by comprising:
a pixel RGB value reader, for reading the RGB value of each pixel from the high-dynamic-range graphic texture;
an RGB-to-luminance converter, for converting the RGB values to actual luminance values;
a luminance-average converter, for converting the actual luminance values to the average luminance value by the formula

\bar{L}_w = \exp\left[ \frac{1}{N} \sum_{x,y} \log\big(\delta + L_w(x,y)\big) \right]

where L_w(x, y) is the actual luminance value of the pixel;
x, y are the two-dimensional coordinates of each pixel;
\bar{L}_w is the average luminance value;
N is the number of texture pixels;
δ is a very small constant, which may be taken as 0.0001;
a pixel luminance scaler, for scaling the luminance value of each pixel according to the brightness desired after mapping, using

L(x, y) = \frac{\alpha}{\bar{L}_w} L_w(x, y)

where α takes values between 0 and 1: a lower α is chosen for a darker graphic, and a higher α otherwise;
a pixel luminance mapper, for obtaining the mapped luminance value of each pixel with

L_d(x, y) = \frac{L(x, y)\left[1 + L(x, y)/L_{white}^2\right] + \sigma}{1 + L(x, y)}

where L_d(x, y) is the mapped luminance value of the pixel;
L_{white} is the maximum luminance value;
σ is a user-defined value: when the graphic does not need fully black regions, σ can be set to a nonzero value such as 0.005.
6. The device for mapping a high-dynamic-range graphic to a low dynamic range according to claim 5, characterized in that the RGB-to-luminance converter includes a calculator that converts the RGB values to actual luminance values by the formula:
L_w = 0.27R + 0.67G + 0.06B.
7. The device for mapping a high-dynamic-range graphic to a low dynamic range according to claim 6, characterized in that in the pixel luminance scaler α is taken as 0.18.
8. The device for mapping a high-dynamic-range graphic to a low dynamic range according to claim 5, 6, or 7, characterized in that the luminance-average converter comprises a frame buffer and two pipelines: the frame buffer holds the luminance value of each pixel; the first pipeline creates a small render target block, draws a rectangle covering the whole render target block, and contains a luminance accumulation register, a pixel shader program accumulating, for each pixel of the target buffer, the luminance values from the source buffer and storing the result; the second pipeline includes a superposition calculator that accumulates the target buffer of the first pipeline and performs the final computation, the generated texture then containing \bar{L}_w.
CNA2007100509558A 2007-12-24 2007-12-24 Method and device for mapping high-dynamic range graphics to low-dynamic range graphics Pending CN101212545A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CNA2007100509558A CN101212545A (en) 2007-12-24 2007-12-24 Method and device for mapping high-dynamic range graphics to low-dynamic range graphics

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CNA2007100509558A CN101212545A (en) 2007-12-24 2007-12-24 Method and device for mapping high-dynamic range graphics to low-dynamic range graphics

Publications (1)

Publication Number Publication Date
CN101212545A true CN101212545A (en) 2008-07-02

Family

ID=39612188

Family Applications (1)

Application Number Title Priority Date Filing Date
CNA2007100509558A Pending CN101212545A (en) 2007-12-24 2007-12-24 Method and device for mapping high-dynamic range graphics to low-dynamic range graphics

Country Status (1)

Country Link
CN (1) CN101212545A (en)


Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101415117B (en) * 2008-11-18 2010-06-09 浙江大学 Transmission method for high presence image based on GPGPU
CN102341825A (en) * 2009-03-03 2012-02-01 微软公司 Multi-modal tone-mapping of images
CN102341825B (en) * 2009-03-03 2013-12-04 微软公司 Multi-modal tone-mapping of images
CN101707666A (en) * 2009-11-26 2010-05-12 北京中星微电子有限公司 Adjusting method and device with high dynamic range
CN103400342A (en) * 2013-07-04 2013-11-20 西安电子科技大学 Mixed color gradation mapping and compression coefficient-based high dynamic range image reconstruction method
CN106796771A (en) * 2014-10-15 2017-05-31 精工爱普生株式会社 The method and computer program of head-mounted display apparatus, control head-mounted display apparatus
CN108496199A (en) * 2015-06-24 2018-09-04 三星电子株式会社 Utilize the tone master manufacturing system of creation intention metadata
CN106878694A (en) * 2015-12-10 2017-06-20 瑞昱半导体股份有限公司 high dynamic range signal processing system and method
CN107888943A (en) * 2016-09-30 2018-04-06 顶级公司 Image procossing
CN107888943B (en) * 2016-09-30 2022-05-03 顶级公司 Image processing
CN108063898A (en) * 2016-11-07 2018-05-22 株式会社电装 Image forming apparatus
CN108063898B (en) * 2016-11-07 2020-12-22 株式会社电装 Image generation apparatus
CN112689138A (en) * 2019-10-18 2021-04-20 华为技术有限公司 Image signal conversion processing method and device and terminal equipment

Similar Documents

Publication Publication Date Title
CN101212545A (en) Method and device for mapping high-dynamic range graphics to low-dynamic range graphics
Chang et al. Automatic contrast-limited adaptive histogram equalization with dual gamma correction
Duan et al. Tone-mapping high dynamic range images by novel histogram adjustment
d'Eon et al. Efficient rendering of human skin
CN101605270B (en) Method and device for generating depth map
CN102096941B (en) Consistent lighting method under falsehood-reality fused environment
Duan et al. Fast tone mapping for high dynamic range images
JP5573316B2 (en) Image processing method and image processing apparatus
Lu et al. Illustrative interactive stipple rendering
CN100478994C (en) High dynamic range material color applying drawing method
CN103778614A (en) Enhancing dynamic ranges of images
TW200601185A (en) Image generation apparatus and image generation method
JP2004164593A (en) Method and apparatus for rendering 3d model, including multiple points of graphics object
TW201344632A (en) 3D texture mapping method, apparatus with function for selecting level of detail by image content and computer readable storage medium storing the method
CN108805829A (en) Video data processing method, device, equipment and computer readable storage medium
CN111199518A (en) Image presentation method, device and equipment of VR equipment and computer storage medium
CN107451974B (en) Self-adaptive reproduction display method for high dynamic range image
CN104268169B (en) A kind of remote sensing image data immediate processing method based on PS softwares
Xu Real-Time Realistic Rendering and High Dynamic Range Image Display and Compression
CN110599426B (en) Underwater image enhancement method for optimizing CLAHE
Thakur et al. Fast tone mapping for high dynamic range images
CN108900825A (en) A kind of conversion method of 2D image to 3D rendering
Yang et al. Real-time ray traced caustics
CN1256706C (en) Grain transferring method based on multiple master drawings
KR100914312B1 (en) Method and system of generating saliency map using graphics hardware and programmable shader and recording medium therewith

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C12 Rejection of a patent application after its publication
RJ01 Rejection of invention patent application after publication

Open date: 20080702