KR20160060404A - Image processing apparatus and method - Google Patents

Image processing apparatus and method

Info

Publication number
KR20160060404A
KR20160060404A (application number KR1020140162652A)
Authority
KR
South Korea
Prior art keywords
resolution
shading
result
light
frame
Prior art date
Application number
KR1020140162652A
Other languages
Korean (ko)
Inventor
하인우
박승인
안민수
이형욱
Original Assignee
삼성전자주식회사
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 삼성전자주식회사 filed Critical 삼성전자주식회사
Priority to KR1020140162652A
Publication of KR20160060404A

Links

Images

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00 3D [Three Dimensional] image rendering
    • G06T15/50 Lighting effects
    • G06T15/80 Shading
    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00 3D [Three Dimensional] image rendering
    • G06T15/005 General purpose rendering architectures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00 3D [Three Dimensional] image rendering
    • G06T15/50 Lighting effects
    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00 3D [Three Dimensional] image rendering
    • G06T15/50 Lighting effects
    • G06T15/503 Blending, e.g. for anti-aliasing

Abstract

An image processing apparatus is provided. The image processing apparatus performs rendering using direct light and at least one indirect light for the 3D model. The device performs light shading at the first resolution and surface shading at the second resolution. The first resolution may be lower than the second resolution.

Description

IMAGE PROCESSING APPARATUS AND METHOD

The following description relates to the field of image processing, and more particularly to rendering that represents global illumination or indirect illumination effects.

Interest in real-time rendering of 3D models is increasing in various fields such as 3D games, virtual reality, animation, and movies. When a 3D scene is rendered using global illumination techniques, virtual point lights (VPLs) representing indirect illumination effects such as diffraction and reflection of light are sampled in the image space. A large number of these VPLs may be sampled, which requires a large amount of computation for visibility checking and shading during rendering.

According to one aspect, there is provided an image processing apparatus including: a first shader that performs a light shading operation associated with at least one light source for a 3D model at a first resolution; a second shader that performs a surface shading operation of the 3D model at a second resolution different from the first resolution; and a processor that combines the light shading result of the first resolution and the surface shading result of the second resolution to generate a rendering result. The first resolution may be lower than the second resolution. In that case, the processor upscales the light shading result of the first resolution to the second resolution, and combines the upscaled light shading result with the surface shading result to generate the rendering result.

According to an exemplary embodiment, the first resolution may correspond to an interlaced scheme, in which the resolution covered in one frame is half of the second resolution. For example, light shading can be selectively performed only on the even lines or only on the odd lines of a frame image. According to another embodiment, the first resolution may be one fourth of the second resolution. Light shading can then be performed at the same aspect ratio with a quarter of the pixels, and the result can be upscaled by a factor of four for use in rendering.
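The interlaced variant described above can be sketched as follows. The helper names are illustrative assumptions (the patent does not specify an implementation); only the even rows are shaded on even frames and only the odd rows on odd frames, halving the per-frame light shading work:

```python
import numpy as np

def interlaced_light_shading(shade_row, height, width, frame_index):
    """Shade only the even rows on even frames and the odd rows on odd
    frames. `shade_row` is a stand-in for the real per-row light shader."""
    out = np.zeros((height, width))
    for y in range(frame_index % 2, height, 2):
        out[y] = shade_row(y)  # only half the rows are computed this frame
    return out

# Odd frame: rows 1 and 3 are freshly shaded, rows 0 and 2 are left
# for reuse from the previous frame.
field = interlaced_light_shading(lambda y: np.full(4, y), 4, 4, frame_index=1)
```

In a full renderer the untouched rows would be filled from the previous frame's field rather than left at zero.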

According to an exemplary embodiment, the ratio of the first resolution to the second resolution may vary with time. For example, the ratio of the first resolution to the second resolution may be adaptively changed according to a scene characteristic. For example, the first shader may lower the first resolution further in a dynamic scene than in a static scene. The first shader may also lower the first resolution further in a brighter scene than in a darker scene by referring to the brightness of the scene being rendered. In addition, it is also possible to lower the first resolution further as the number of indirect lights reflected in the rendering increases.

According to another aspect, there is provided an image processing apparatus including: a first shader that performs a light shading operation associated with at least one light source for a 3D model; a second shader that performs a surface shading operation of the 3D model; and a processor that generates a rendering result by combining a first-frame light shading result, obtained by combining the light shading results of a plurality of frames, with a first-frame surface shading result. The apparatus may further include a buffer that buffers a light shading result corresponding to a predetermined number of frames and provides it to the processor.

According to one embodiment, the first shader performs the light shading operation at a first resolution and the second shader performs the surface shading at a second resolution, wherein the first resolution and the second resolution are different. The first resolution may be lower than the second resolution. Illustratively, the first-frame light shading result may be a weighted average of the light shading results of the plurality of frames. The first-frame light shading result may also be an interpolation or extrapolation of the light shading results of the plurality of frames.

According to another aspect, there is provided an image processing apparatus including: a first shader that performs a light shading operation associated with at least one light source for a 3D model, corresponding to at least one frame prior to a first frame; a second shader that performs a surface shading operation of the 3D model corresponding to the first frame; and a processor that generates a rendering result corresponding to the first frame by combining the light shading result corresponding to the at least one frame with the surface shading result corresponding to the first frame. According to one embodiment, the first shader performs the light shading once every two frames. The first shader may perform the light shading operation at a first resolution and the second shader may perform the surface shading at a second resolution, wherein the first resolution and the second resolution are different. Illustratively, the first resolution is lower than the second resolution.

According to an exemplary embodiment, the first shader performs the light shading operation in turn, in different frames, for a plurality of grouped pixels. In this case, in a second frame in which the light shading operation is performed for a first pixel among the plurality of pixels, the first shader may reuse, for the other pixels except the first pixel, light shading results computed before the second frame.

According to another aspect of the present invention, there is provided an image processing apparatus including: a first shader that performs a light shading operation of a 3D model alternately, in different frames, for a plurality of grouped pixels, such that in a first frame the pixels for which light shading is not performed reuse the light shading operation result of a frame previous to the first frame; a second shader that performs a surface shading operation of the 3D model in each frame; and a processor that combines the light shading result and the surface shading result to generate a rendering result.

According to another aspect, there is provided an image processing method of an image processing apparatus including at least one processor. The method includes: a light shading step of performing a light shading operation associated with at least one light source for a 3D model at a first resolution; A surface shading step of performing a surface shading operation of the 3D model at a second resolution different from the first resolution; And an image generation step of generating a rendering result by combining the light shading result of the first resolution and the surface shading result of the second resolution. The first resolution may be lower than the second resolution.

According to an exemplary embodiment, the image processing method may further include a resolution adjusting step of adjusting the light shading result of the first resolution to be equal to the second resolution. In this case, the image generation step generates the rendering result by combining the light shading result adjusted to the second resolution with the surface shading result.

The image processing method may further include: a step of analyzing characteristics of the scene to be rendered; and a resolution determining step of adaptively adjusting the first resolution according to the analyzed characteristics of the scene. Illustratively, the characteristic of the scene may be an inter-frame difference. In that case, the resolution determining step lowers the first resolution further in a dynamic scene than in a static scene, referring to the characteristics of the scene. The characteristic of the scene may also be scene brightness. The resolution determining step then lowers the first resolution further in a brighter scene than in a darker scene with reference to the scene brightness. Meanwhile, the resolution determining step may lower the first resolution further as the number of indirect lights reflected in the rendering increases.

FIG. 1 is a block diagram illustrating an image processing apparatus according to an exemplary embodiment of the present invention.
FIG. 2 shows the 3D model and lights being rendered.
FIG. 3 illustrates a rendering process in accordance with one embodiment.
FIGS. 4A to 4C are views for explaining a process of making the surface shading resolution and the light shading resolution different according to an embodiment.
FIG. 5 is a view for explaining a process of combining a surface shading result and a light shading result.
FIG. 6 is a block diagram showing an image processing apparatus according to another embodiment.
FIGS. 7 to 8B are views for explaining a light shading process according to the embodiments.
FIGS. 9 to 10 are flowcharts illustrating an image processing method according to an embodiment.

In the following, some embodiments will be described in detail with reference to the accompanying drawings. However, the scope is not limited to or restricted by these embodiments. Like reference symbols in the drawings denote like elements.

The terms used in the following description are chosen to be as generic and universal as possible in the relevant art, but other terms may exist depending on developments and/or changes in technology, custom, the preferences of those skilled in the art, and the like. Accordingly, the terminology used in the following description should not be construed as limiting the technical idea, but should be understood as exemplary language used to describe the embodiments.

Also, in certain cases there may be terms chosen arbitrarily by the applicant, in which case their meaning is described in detail in the corresponding description section. Therefore, the terms used in the following description should be understood based on the meaning of each term and the contents throughout the specification, not simply on the name of the term.

FIG. 1 is a block diagram illustrating an image processing apparatus according to an exemplary embodiment of the present invention. The apparatus 100 performs global illumination rendering on a 3D model. The apparatus 100 may be implemented by at least a portion of a graphics processing unit (GPU). The apparatus 100 includes a first shader 110 and a second shader 120. The first shader 110 performs light shading, which calculates the color values contributed by direct light and indirect light for the 3D model. Light shading is a process of checking visibility and shading the scene in consideration of the color, brightness, direction, etc. of the direct or indirect lights. The second shader 120 performs surface shading. Surface shading is a process of shading the scene in consideration of the surface normal of the object surface, the texture color, and the like. The processor 130 may combine the light shading result and the surface shading result, for example by multiplication, to generate a rendering result image for the scene.

According to one embodiment, the first shader 110 and the second shader 120 perform shading at different resolutions. When light and surface shading are performed on an existing GPU, they are performed at the same resolution: based on the resolution of the image to be output by the GPU, the input resolution given to the shaders and the shader processing resolution, as in deferred rendering, could all be set to be the same. Light shading requires operations such as visibility tests against many indirect lights, for example virtual point lights (VPLs), so the amount of computation is very large. Light has a smooth, diffusive effect in space, and therefore, even if the light shading resolution is somewhat reduced, there is no significant effect on the overall rendering quality. The apparatus 100 according to the embodiments therefore uses computational resources efficiently by setting the light shading resolution and the surface shading resolution differently from each other.

Illustratively, the first shader 110 performs light shading at a first resolution, and the second shader 120 performs surface shading at a second resolution. The second resolution may be equal to the output resolution of the GPU, although it can of course be set differently from the output resolution. As described above, the first resolution associated with light shading may be lower than the second resolution associated with surface shading. For example, the first resolution may be half or a quarter of the second resolution. The processor 130 may upscale the light shading result generated at the lower resolution to the surface shading resolution to render the resulting image.
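The upscale-and-multiply combination described above can be sketched as follows. The function name and the nearest-neighbour upscale are illustrative assumptions, not the patent's prescribed method; each low-resolution texel acts as a "super pixel" covering a block of output pixels:

```python
import numpy as np

def combine_shading(surface, light_lowres):
    """Upscale a low-resolution light-shading buffer to the surface-shading
    resolution and combine the two by per-pixel multiplication."""
    h, w = surface.shape[:2]
    lh, lw = light_lowres.shape[:2]
    # Nearest-neighbour upscale: repeat each texel over its covered block.
    light_up = np.repeat(np.repeat(light_lowres, h // lh, axis=0),
                         w // lw, axis=1)
    return surface * light_up

surface = np.full((4, 4), 0.8)               # surface shading term (e.g. albedo)
light = np.array([[0.5, 1.0], [1.0, 0.25]])  # 2x2 light shading result
result = combine_shading(surface, light)
```

In practice the upscale would typically use interpolation with anti-aliasing, as the description notes, rather than plain repetition.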

According to an exemplary embodiment, the ratio of the first resolution to the second resolution may vary with time. For example, the ratio of the first resolution to the second resolution may be adaptively changed according to a scene characteristic. For example, the first shader 110 may lower the first resolution in a dynamic scene rather than a static scene; in dynamic scenes, fast computation is more important than rendering quality. The first shader 110 may also lower the light shading resolution in a brighter scene than in a darker scene by referring to the brightness or luminance of the scene being rendered, because the sensitivity of the human eye to differences in brightness is higher in a darker environment than in a brighter one. In addition, it is also possible to lower the first resolution further as the number of indirect lights reflected in the rendering increases. Indirect lights such as many VPLs require a large amount of computation for the light shading operation, and full-resolution light shading for many VPLs can be redundant. Therefore, for frames in which many VPLs are sampled, the computational overhead can be reduced by lowering the light shading resolution.
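A minimal sketch of such an adaptive policy follows. All thresholds and the power-of-two divisor scheme are illustrative assumptions; the patent only states that the resolution is lowered for dynamic, bright, or VPL-heavy scenes:

```python
def choose_light_shading_divisor(frame_diff, brightness, num_vpls):
    """Heuristically pick how much to downscale light shading relative to
    surface shading, based on the three scene characteristics named in
    the description. All threshold values here are assumptions."""
    divisor = 2  # default: half resolution
    if frame_diff > 0.1:   # dynamic scene: favour speed over quality
        divisor *= 2
    if brightness > 0.7:   # bright scene: eye is less sensitive to error
        divisor *= 2
    if num_vpls > 1000:    # many VPLs: visibility tests dominate cost
        divisor *= 2
    return divisor
```

The same scheme could be evaluated per tile for tile-based deferred rendering, as discussed later in the description.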

FIG. 2 shows the 3D model and lights being rendered. An object 210 and an object 220 are shown. The scene viewed from the viewpoint 230 from which the 3D model is rendered is also referred to as a screen space. In some embodiments, tile-based deferred rendering (TBDR) may be performed by dividing the screen space into a plurality of tiles. Such tile-based processing is advantageous for GPU acceleration through parallel processing. Sampled VPLs 201, 212, 222, etc. are shown. The VPLs may be sampled on the result of rendering the 3D model from the viewpoint of the direct light 200. The VPLs are sampled densely in the portion 221 that is visible from the direct light 200, and sparsely in the portion 211 that is not visible. In some cases, tens to tens of thousands of such VPLs may be sampled, and the amount of computation is very large when they are all taken into account in light shading. Therefore, in the embodiments, the light shading resolution can be lowered. Various embodiments are described with reference to FIG. 3 and the following figures.

FIG. 3 illustrates a rendering process in accordance with one embodiment. The result 310 from the surface shader has a relatively high resolution. For example, this resolution may be FHD (Full HD), 1920 * 1080. The result 320 from the light shader may have a lower resolution; illustratively, this resolution may be qHD (quarter HD), 960 * 540. Since light diffuses smoothly in space, even though the light shading result 320 has a low resolution, the effect of aliasing artifacts on the result quality is relatively small.

On the other hand, as described above, the first resolution at which the light shader performs light shading may be adaptively changed. To this end, scene characteristics can be analyzed. The scene characteristics may include various factors such as the complexity of the 3D model, the texture characteristics of the objects of the 3D model, object mobility between frames, the rate of change of the camera view being rendered, and the illuminance of the scene. In a dynamic scene, where the naturalness of screen transitions is more important than precise color representation, it is possible to lower the light shading resolution further for faster processing. And because the human eye finds artifacts caused by brightness differences more quickly in darker places than in brighter ones, the light shader may lower the light shading resolution in bright scenes rather than in dark scenes. Also, in this process, the number of direct lights or the number of sampled VPLs may be considered. As an example, if many VPLs are to be considered in the sampled scene, it is possible to lower the light shading resolution because the amount of computation required for the light shading operation is large.

According to one embodiment, the rendering may be tile-based deferred rendering, and in this case, the resolution adjustment may be performed for each tile or grid. The ratio of the surface shading resolution to the light shading resolution may be applied equally to all the tiles, or the characteristics may be analyzed per tile so that an individual ratio is applied to each tile. In this way, the cell allocation efficiency of the parallel-processing GPU can be further improved.

The result 320 provided by the light shader at a lower resolution can be upscaled to an image 321 of the same resolution as the surface shader result 310. This upscaling can be performed by any of many well-known methods, and processing such as anti-aliasing may be performed through interpolation. By multiplying the upscaled image 321 and the surface shading result 310, the resulting image 330 of the global illumination rendering can be generated.

FIGS. 4A to 4C are views for explaining a process of making the surface shading resolution and the light shading resolution different according to an embodiment. Surface shading is performed in FIG. 4A. For each of the pixels 410, 411, 412, and 413, a color value is determined in consideration of the surface normal of the object surface, the texture color, and the like. The processed result 401 may be combined with the light shading result to produce a rendered image. The light shading process is performed in FIG. 4B. Pixel 420 corresponds to pixels 410, 411, 412, and 413 of FIG. 4A when upscaled; the pixel 420 may also be understood as a super pixel corresponding to the pixels 410, 411, 412, and 413 of FIG. 4A. For each of the VPLs to be considered, a shading operation is performed referring to the VPL color, brightness, direction, and the like.

Meanwhile, a result 403 of upscaling the light shading result 402 of FIG. 4B is shown in FIG. 4C. The resolution now matches, corresponding to each of the pixels 410, 411, 412, and 413 of FIG. 4A. In this process, the pixels 430, 431, 432, and 433 can have the same value. Of course, appropriate post-processing may be performed to prevent aliasing and the like in this process.

FIG. 5 is a view for explaining a process of combining a surface shading result and a light shading result. The processor can generate the resulting image 500 by combining the shading result of the surface shader (e.g., 401 in FIG. 4A) and the result of upscaling the shading result of the light shader (e.g., 403 in FIG. 4C). Through a pixel-by-pixel operation, the value of pixel 510 may be generated by multiplying pixel 410 of FIG. 4A and pixel 430 of FIG. 4C. In a similar manner, the color values for the other pixels 511, 512, and 513 are also calculated.

FIG. 6 is a block diagram showing an image processing apparatus according to another embodiment. The apparatus 600 includes a first shader 610 and a second shader 640. The first shader 610 performs light shading on the 3D model, and the second shader 640 performs surface shading. In the embodiment of FIG. 1, the resolution of light shading is set lower than the surface shading resolution, which reduces the redundancy of the light shading operation using spatial coherency. In this embodiment, temporal coherency is used together with and/or instead of that approach to reduce the redundancy of the light shading operation.

In an exemplary application, the first shader 610, which performs light shading, may skip light shading in some frames instead of performing light shading in every frame. In such a skipped frame, the light shading result of a neighboring frame can be reused. For example, while the second shader 640 performs surface shading at a predetermined FPS (frames per second), the first shader 610 can perform light shading at a lower rate than that FPS. It is possible, for instance, to perform light shading for frames having an odd frame index and to reuse the results for frames having an even frame index. That is, light shading is performed at a lower FPS than surface shading, and the ratio by which the FPS is lowered can vary. For example, in dynamic scenes the light shading FPS can be lowered further for faster computation, and the brighter the scene, the lower the light shading FPS can be. Furthermore, it is possible to lower the light shading FPS further when the number of direct lights or the number of sampled VPLs is large. For this purpose, the buffer 620 buffers the result of the light shading operation for a certain period. Note that adjusting the FPS of the light shading is not incompatible with the resolution-adjustment embodiment described with reference to FIG. 1; optimal computational efficiency can be obtained by adjusting the resolution while also adaptively adjusting the FPS.

On the other hand, according to one embodiment, the FPS of the light shading may be made different from the FPS of the surface shading, or alternatively, previous frame results may be reused for the light shading result. The reuse here may correct the light shading of the current frame using, for example, the light shading results of a predetermined number of previous frames stored in the buffer 620. For example, by averaging or weighted-summing the light shading results of the previous several frames, or by interpolating or extrapolating the results of previous frames, the light shading result for the current frame can be obtained. By this processing, flickering due to changes in VPL sampling can be remarkably reduced. In the weighted average, frames closer in time index to the current frame may be given larger weights to obtain a higher correlation. The processor 630 may combine the light shading result and the surface shading result to generate a rendering result image for the scene. This process will be described with reference to FIGS. 7 and 8.

FIGS. 7 to 8B are views for explaining a light shading process according to the embodiments. Referring to FIG. 7, the light shading results of the two frames preceding the current T frame are shown buffered. The light shading result 712 of the (T-2) frame is delayed by two frames and enters the summing unit, and the light shading result 711 of the (T-1) frame is delayed by one frame and enters the summing unit. The results 711 and 712 are weighted-averaged together with the light shading result 710 of the current T frame to produce a corrected light shading result 720 to be used for the current frame. The processor can then obtain the resulting image 740 by combining the current frame's surface shading result 730 with the corrected light shading result 720. The resolution of the light shading results 710, 711, and 712 may be the same as the resolution of the surface shading result 730. However, in some embodiments, the resolution of the light shading results 710, 711, and 712 may be different from the resolution of the surface shading result 730, in which case the corrected light shading result 720 may be upscaled after the weighted average.
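The buffered weighted average of FIG. 7 can be sketched as follows. The class name and the linearly increasing weights are assumptions; the description only requires that frames closer in time receive larger weights:

```python
import numpy as np
from collections import deque

class LightShadingBuffer:
    """Buffer the last few frames' light shading results and return a
    weighted average, with larger weights for more recent frames."""
    def __init__(self, depth=3):
        self.frames = deque(maxlen=depth)  # oldest .. newest

    def push_and_correct(self, light_result):
        self.frames.append(light_result)
        n = len(self.frames)
        weights = np.arange(1, n + 1, dtype=float)  # 1, 2, ..., n
        weights /= weights.sum()                    # normalize
        # Weighted average over buffered frames, oldest to newest.
        return sum(w * f for w, f in zip(weights, self.frames))

buf = LightShadingBuffer(depth=3)
buf.push_and_correct(np.full((2, 2), 1.0))   # (T-2) frame
buf.push_and_correct(np.full((2, 2), 0.0))   # (T-1) frame
corrected = buf.push_and_correct(np.full((2, 2), 1.0))  # T frame
```

This temporal smoothing is what suppresses the flickering caused by per-frame changes in VPL sampling.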

On the other hand, referring to FIG. 8A, an embodiment is shown in which light shading is performed at a lower FPS than surface shading. In the illustrated example, light shading is performed once every two frames; in other words, the FPS of light shading is half the FPS of surface shading. The light shading result 813 of the (T-3) frame is reused as the light shading result 812 of the (T-2) frame. Likewise, the light shading result 811 of the (T-1) frame is reused as the light shading result 810 of the current (T) frame. This process can be implemented by delaying the light shading result 811 of the (T-1) frame by one frame and combining it with the surface shading result 820 of the current (T) frame, generating the resulting image 830. The resolution of the light shading results (811, 813, etc.) may be the same as the resolution of the surface shading result 820. However, in another embodiment, the resolution of the light shading results 811 and 813 may be different from the resolution of the surface shading result 820, in which case the light shading result may be upscaled before the multiplication performed by the processor.
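The 1:2 reuse scheme of FIG. 8A can be sketched as follows, with `compute_light` standing in for the real shader invocation and a simple one-entry cache standing in for the buffer (both are illustrative assumptions):

```python
def light_shading_for_frame(t, compute_light, cache):
    """Run the light shader only on odd-indexed frames (a 1:2 light-to-
    surface FPS ratio) and reuse the cached result on the frames between."""
    if t % 2 == 1 or "last" not in cache:  # odd frame, or nothing cached yet
        cache["last"] = compute_light(t)
    return cache["last"]

calls = []
cache = {}
def fake_shader(t):
    calls.append(t)          # record which frames actually ran the shader
    return f"light@{t}"

results = [light_shading_for_frame(t, fake_shader, cache) for t in range(4)]
```

Here frame 2 reuses the result of frame 1, so only three of four frames pay the light shading cost.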

On the other hand, in the illustrated example, the ratio of the light shading FPS to the surface shading FPS was 1:2, but this ratio can be set differently. Also, the ratio need not be set once and maintained; it may vary depending on the characteristics of the scene. As described above, the light shading FPS may be made smaller in a dynamic scene than in a static scene, and in a bright scene than in a dark scene. Furthermore, the FPS ratio may vary depending on the number of sampled VPLs, as described above.

According to one embodiment, such frame-by-frame light shading can also be performed by grouping the pixels and updating the pixels in each group sequentially in different frames. Referring to FIG. 8B, light shading target pixels over a plurality of frames are shown. In this embodiment, four adjacent pixels b0, b1, b2, and b3 are grouped into one group. In the (T-3) frame, a light shading operation is performed on the b0 pixel, and the result for the b0 pixel can then be reused in subsequent frames, their number corresponding to the number of pixels in the group. For example, in the (T-3) frame the light shading operation is actually performed on the b0 pixel, and in the (T-2), (T-1), and (T) frames this result is reused; the light shading operation for the b0 pixel is performed again in the (T+1) frame. In the same way, in the (T-2) frame the light shading operation is performed on the b1 pixel, and this value can be reused in the (T-1), (T), and (T+1) frames. A light shading operation is performed on the b2 pixel in the (T-1) frame, and this value can be reused in the (T), (T+1), and (T+2) frames. Further, in the (T) frame, a light shading operation is performed on the b3 pixel, and this value can be reused in the (T+1), (T+2), and (T+3) frames; a new light shading operation on the b3 pixel is then performed in the (T+4) frame.
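The round-robin group update can be sketched as a simple frame-to-pixel mapping; the convention that frame index modulo group size selects the member is an assumption:

```python
def pixel_to_update(frame_index, group_size=4):
    """Which member of a pixel group (b0..b3) receives a fresh light
    shading pass in this frame; the other members reuse their most
    recent results."""
    return frame_index % group_size

# Over eight frames, each of the four grouped pixels is refreshed twice,
# so any pixel's light shading result is at most group_size - 1 frames old.
schedule = [pixel_to_update(t) for t in range(8)]
```

This staggers the cost of light shading evenly across frames instead of paying for the whole group at once.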

FIGS. 9 to 10 are flowcharts illustrating an image processing method according to an embodiment. Referring to FIG. 9, in step 910, the light shader performs light shading for the 3D model at the first resolution. In this light shading, a large amount of processing is performed for many direct lights and/or VPLs, such as checking visibility in consideration of color, brightness, direction, and the like. However, according to one embodiment, at this stage the amount of computation is reduced by lowering the light shading resolution and/or lowering the light shading FPS. Illustratively, the first resolution, which is the light shading resolution, may be half or a quarter of the second resolution, which is the surface shading resolution. In the embodiment where the first resolution and the second resolution differ, the resolution adjustment of step 930 is performed. The difference in resolution is shown in FIG. 3, and the resolution is adjusted as described above with reference to FIGS. 4A to 5.

Then, in step 940, the processor combines the surface shading result generated in step 920 with the resolution-adjusted light shading result to generate a resulting image. This process is as described above with reference to the preceding figures.

Referring to FIG. 10, an embodiment is shown in which the characteristics of the scene are reflected in adjusting the resolution and/or FPS of light shading. At step 1010, the characteristics of the scene are analyzed. In addition to the scene itself, other factors such as the VPL sampling state can be analyzed together. For example, whether the scene is dynamic or static, whether the overall brightness of the scene is dim or bright, and how many VPLs are sampled can be analyzed as such scene characteristics. Then, in step 1020, the first resolution at which to perform light shading is determined. Illustratively, but not exclusively, with respect to the above-described scene characteristics, the first resolution can be determined to be lower in a dynamic scene than in a static scene. Also, the first resolution may be determined to be lower in a brighter scene than in a dark scene. Further, as described above, the first resolution may be determined to be lower in the scene of a frame in which a large number of VPLs are sampled.

In addition to or in place of this resolution adjustment, a light shading FPS adjustment step (not shown) may be performed. For example, according to the scene characteristics analyzed in step 1010, the light shading FPS is adjusted to a smaller value in a dynamic scene than in a static scene, and in a bright scene rather than a dark scene. And the greater the number of sampled VPLs, the smaller the light shading FPS may be determined to be. Further details are as described with reference to FIGS. 1 to 8.

The embodiments described above may be implemented in hardware components, software components, and/or a combination of hardware and software components. For example, the devices, methods, and components described in the embodiments may be implemented using one or more general-purpose or special-purpose computers, such as a processor, a controller, an arithmetic logic unit (ALU), a digital signal processor, a field programmable array, a programmable logic unit (PLU), a microprocessor, or any other device capable of executing and responding to instructions. The processing device may execute an operating system (OS) and one or more software applications running on the operating system. The processing device may also access, store, manipulate, process, and generate data in response to execution of the software. For ease of understanding, the processing device may be described as being used singly, but those skilled in the art will recognize that it may include a plurality of processing elements and/or multiple types of processing elements. For example, the processing device may comprise a plurality of processors, or one processor and one controller. Other processing configurations, such as parallel processors, are also possible.

The software may include a computer program, code, instructions, or a combination of one or more of these, and may configure the processing device to operate as desired, or may command the processing device independently or collectively. The software and/or data may be embodied permanently or temporarily in any type of machine, component, physical device, virtual equipment, computer storage medium or device, or in a transmitted signal wave, so as to be interpreted by the processing device or to provide instructions or data to the processing device. The software may be distributed over networked computer systems and stored or executed in a distributed manner. The software and data may be stored on one or more computer-readable recording media.

The method according to an embodiment may be implemented in the form of program instructions that can be executed through various computer means and recorded on a computer-readable medium. The computer-readable medium may include program instructions, data files, data structures, and the like, alone or in combination. The program instructions recorded on the medium may be those specially designed and configured for the embodiments, or may be those known and available to those skilled in the art of computer software. Examples of computer-readable media include magnetic media such as hard disks, floppy disks, and magnetic tape; optical media such as CD-ROMs and DVDs; magneto-optical media such as floptical disks; and hardware devices specially configured to store and execute program instructions, such as ROM, RAM, and flash memory. Examples of program instructions include machine language code such as that produced by a compiler, as well as high-level language code that can be executed by a computer using an interpreter or the like. The hardware devices described above may be configured to operate as one or more software modules to perform the operations of the embodiments, and vice versa.

While the present invention has been particularly shown and described with reference to exemplary embodiments thereof, the invention is not limited to the disclosed embodiments. For example, suitable results may be achieved even if the described techniques are performed in a different order than the described methods, and/or if components of the described systems, structures, devices, or circuits are combined or coupled in a different form than the described methods, or are replaced or substituted by other components or equivalents. Therefore, other implementations, other embodiments, and equivalents of the claims are also within the scope of the following claims.

Claims (27)

  1. An image processing apparatus comprising:
    a first shader for performing a light shading operation associated with at least one light source for a 3D model at a first resolution;
    a second shader for performing a surface shading operation of the 3D model at a second resolution different from the first resolution; and
    a processor for generating a rendering result by combining the light shading result of the first resolution and the surface shading result of the second resolution.
  2. The image processing apparatus of claim 1,
    wherein the first resolution is lower than the second resolution.
  3. The image processing apparatus of claim 2,
    wherein the processor upscales the light shading result of the first resolution to the second resolution and combines the upscaled light shading result with the surface shading result to generate the rendering result.
  4. The image processing apparatus of claim 2,
    wherein the first resolution corresponds to an interlaced scheme in which the resolution of one frame is half the second resolution.
  5. The image processing apparatus of claim 1,
    wherein the first shader lowers the first resolution further in a dynamic scene than in a static scene.
  6. The image processing apparatus of claim 1,
    wherein the first shader, referring to the brightness of the scene to be rendered, lowers the first resolution further in a scene having a higher brightness than in a scene having a lower brightness.
  7. The image processing apparatus of claim 1,
    wherein the first shader lowers the first resolution further as the number of indirect lights reflected in the rendering increases.
  8. An image processing apparatus comprising:
    a first shader for performing a light shading operation associated with at least one light source for a 3D model;
    a second shader for performing a surface shading operation of the 3D model; and
    a processor for generating a rendering result by combining a first frame light shading result, which is obtained by combining light shading results of a plurality of frames, with a first frame surface shading result.
  9. The image processing apparatus of claim 8, further comprising:
    a buffer for buffering a light shading result corresponding to a predetermined number of frames and providing the result to the processor.
  10. The image processing apparatus of claim 8,
    wherein the first shader performs the light shading operation at a first resolution and the second shader performs the surface shading operation at a second resolution, the first resolution and the second resolution being different from each other.
  11. The image processing apparatus of claim 10,
    wherein the first resolution is lower than the second resolution.
  12. The image processing apparatus of claim 8,
    wherein the first frame light shading result is a weighted average of the light shading results of the plurality of frames.
  13. The image processing apparatus of claim 8,
    wherein the first frame light shading result is obtained by interpolating or extrapolating the light shading results of the plurality of frames.
  14. An image processing apparatus comprising:
    a first shader for performing a light shading operation associated with at least one light source for a 3D model, corresponding to at least one frame before a first frame;
    a second shader for performing a surface shading operation of the 3D model corresponding to the first frame; and
    a processor for generating a rendering result corresponding to the first frame by combining the light shading result corresponding to the at least one frame with the surface shading result corresponding to the first frame.
  15. The image processing apparatus of claim 14,
    wherein the first shader performs the light shading once every two frames.
  16. The image processing apparatus of claim 14,
    wherein the first shader performs the light shading operation at a first resolution and the second shader performs the surface shading operation at a second resolution, the first resolution being lower than the second resolution.
  17. The image processing apparatus of claim 14,
    wherein the first shader alternately performs the light shading operation on a plurality of grouped pixels in different frames.
  18. The image processing apparatus of claim 17,
    wherein, in a second frame in which the light shading operation is performed for a first pixel of the plurality of pixels, the first shader reuses the light shading result from before the second frame for the other pixels except the first pixel.
  19. An image processing apparatus comprising:
    a first shader for alternately performing a light shading operation on a plurality of grouped pixels in different frames, such that in a first frame in which the light shading operation is performed for some of the plurality of pixels, the light shading result of a previous frame of the first frame is reused for the remaining pixels;
    a second shader for performing a surface shading operation of the 3D model in each frame; and
    a processor for generating a rendering result by combining the light shading result and the surface shading result.
  20. An image processing method of an image processing apparatus including at least one processor, the method comprising:
    a light shading step of performing a light shading operation associated with at least one light source for a 3D model at a first resolution;
    a surface shading step of performing a surface shading operation of the 3D model at a second resolution higher than the first resolution; and
    an image generating step of generating a rendering result by combining the light shading result of the first resolution and the surface shading result of the second resolution.
  21. The method of claim 20, further comprising:
    a resolution adjustment step of adjusting the light shading result of the first resolution to be equal to the second resolution,
    wherein the image generating step combines the light shading result adjusted to be equal to the second resolution with the surface shading result to generate the rendering result.
  22. The method of claim 20, further comprising:
    a scene analysis step of analyzing characteristics of the scene to be rendered; and
    a resolution determining step of adaptively adjusting the first resolution according to the analyzed characteristics of the scene.
  23. The method of claim 22,
    wherein the characteristic of the scene is an inter-frame difference, and
    the resolution determining step, referring to the characteristic of the scene, lowers the first resolution further in a dynamic scene than in a static scene.
  24. The method of claim 22,
    wherein the characteristic of the scene is the scene brightness, and
    the resolution determining step, referring to the scene brightness, lowers the first resolution in a scene having a higher brightness than in a scene having a lower brightness.
  25. The method of claim 22,
    wherein the resolution determining step lowers the first resolution further as the number of indirect lights reflected in the rendering increases.
  26. A computer-readable recording medium containing a program for causing a computing device to perform an image processing method, the method comprising:
    a light shading step of performing a light shading operation associated with at least one light source for a 3D model at a first resolution;
    a surface shading step of performing a surface shading operation of the 3D model at a second resolution different from the first resolution; and
    an image generating step of generating a rendering result by combining the light shading result of the first resolution and the surface shading result of the second resolution.
  27. A program stored on a recording medium, the program causing a computing device to perform an image processing method comprising:
    performing a light shading operation associated with at least one light source for a 3D model at a first resolution;
    performing a surface shading operation of the 3D model at a second resolution different from the first resolution; and
    generating a rendering result by combining the light shading result of the first resolution and the surface shading result of the second resolution.
KR1020140162652A 2014-11-20 2014-11-20 Image processing apparatus and method KR20160060404A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020140162652A KR20160060404A (en) 2014-11-20 2014-11-20 Image processing apparatus and method

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
KR1020140162652A KR20160060404A (en) 2014-11-20 2014-11-20 Image processing apparatus and method
JP2015225941A JP2016100017A (en) 2014-11-20 2015-11-18 Image processing apparatus and method
US14/945,694 US10262454B2 (en) 2014-11-20 2015-11-19 Image processing apparatus and method
CN201510809715.6A CN105631926B (en) 2014-11-20 2015-11-20 Image processing equipment and method
EP15195546.5A EP3023942A3 (en) 2014-11-20 2015-11-20 Image processing apparatus and method

Publications (1)

Publication Number Publication Date
KR20160060404A true KR20160060404A (en) 2016-05-30

Family

ID=54782427

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020140162652A KR20160060404A (en) 2014-11-20 2014-11-20 Image processing apparatus and method

Country Status (5)

Country Link
US (1) US10262454B2 (en)
EP (1) EP3023942A3 (en)
JP (1) JP2016100017A (en)
KR (1) KR20160060404A (en)
CN (1) CN105631926B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9965893B2 (en) * 2013-06-25 2018-05-08 Google Llc. Curvature-driven normal interpolation for shading applications
GB2543778B (en) * 2015-10-27 2018-08-08 Imagination Tech Ltd Systems and methods for processing images of objects
US10134174B2 (en) * 2016-06-13 2018-11-20 Microsoft Technology Licensing, Llc Texture mapping with render-baked animation
US10453241B2 (en) * 2017-04-01 2019-10-22 Intel Corporation Multi-resolution image plane rendering within an improved graphics processor microarchitecture
US10235799B2 (en) * 2017-06-30 2019-03-19 Microsoft Technology Licensing, Llc Variable rate deferred passes in graphics rendering

Family Cites Families (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6064393A (en) * 1995-08-04 2000-05-16 Microsoft Corporation Method for measuring the fidelity of warped image layer approximations in a real-time graphics rendering pipeline
US6466185B2 (en) 1998-04-20 2002-10-15 Alan Sullivan Multi-planar volumetric display system and method of operation using psychological vision cues
EP1258837A1 (en) 2001-05-14 2002-11-20 Thomson Licensing S.A. Method to generate mutual photometric effects
US7167176B2 (en) 2003-08-15 2007-01-23 Microsoft Corporation Clustered principal components for precomputed radiance transfer
US7061489B2 (en) 2003-08-15 2006-06-13 Microsoft Corporation Precomputed radiance transfer for rendering objects
US8217970B2 (en) 2004-07-27 2012-07-10 Dolby Laboratories Licensing Corporation Rapid image rendering on dual-modulator displays
US7633503B2 (en) 2005-03-22 2009-12-15 Microsoft Corporation Local, deformable precomputed radiance transfer
CN1878245A (en) 2005-06-07 2006-12-13 明基电通股份有限公司 Method for correcting digital image exposure
US7768523B2 (en) 2006-03-09 2010-08-03 Microsoft Corporation Shading using texture space lighting and non-linearly optimized MIP-maps
EP2005732A1 (en) 2006-03-31 2008-12-24 Philips Electronics N.V. Adaptive rendering of video content based on additional frames of content
JP2009163658A (en) 2008-01-10 2009-07-23 Hitachi Ltd Input/output controller and its firmware update method
KR100949859B1 (en) 2008-08-05 2010-03-25 재단법인서울대학교산학협력재단 Method for accelerating realtime perspective volume rendering using temporal exploitation
KR101104684B1 (en) 2008-12-02 2012-01-16 이화여자대학교 산학협력단 Method for rendering skin material based on brdf and its apparatus
KR20110014795A (en) 2009-08-06 2011-02-14 삼성전자주식회사 Image processing apparatus and method
US8866813B2 (en) 2011-06-30 2014-10-21 Dreamworks Animation Llc Point-based guided importance sampling
KR101334187B1 (en) 2011-07-25 2013-12-02 삼성전자주식회사 Apparatus and method for rendering
US9235921B2 (en) * 2011-11-08 2016-01-12 Imagination Technologies Limited Profiling ray tracing renderers
US8644596B1 (en) * 2012-06-19 2014-02-04 Google Inc. Conversion of monoscopic visual content using image-depth database
US9378582B2 (en) 2012-07-31 2016-06-28 Siemens Product Lifecycle Management Software Inc. Rendering of design data
CN103335630B (en) 2013-07-17 2015-11-18 北京航空航天大学 Low cost laser scanner
CN103701807B (en) 2013-12-26 2016-08-24 华为技术有限公司 The data transmission method and an apparatus environment vdi
GB2543779B (en) * 2015-10-27 2018-07-25 Imagination Tech Ltd Systems and methods for processing images of objects

Also Published As

Publication number Publication date
EP3023942A2 (en) 2016-05-25
JP2016100017A (en) 2016-05-30
US10262454B2 (en) 2019-04-16
CN105631926A (en) 2016-06-01
CN105631926B (en) 2019-09-24
US20160148420A1 (en) 2016-05-26
EP3023942A3 (en) 2016-08-31

Similar Documents

Publication Publication Date Title
US9754407B2 (en) System, method, and computer program product for shading using a dynamic object-space grid
JP5538748B2 (en) Graphics processing system
RU2433477C1 (en) Image dynamic range expansion
KR101286318B1 (en) Displaying a visual representation of performance metrics for rendered graphics elements
US8111264B2 (en) Method of and system for non-uniform image enhancement
Dayal et al. Adaptive frameless rendering
JP6116217B2 (en) Method for processing computer graphics and apparatus for processing computer graphics
CN101803395B (en) Rendering improvement for 3d display
JP2008515058A (en) Flexible anti-aliasing for embedded devices
JP6392370B2 (en) An efficient re-rendering method for objects to change the viewport under various rendering and rasterization parameters
TWI616846B (en) A graphics subsystem, a computer-implemented method and a computing device for enhanced anti-aliasing by varying sample patterns spatially and/or temporally
US8102428B2 (en) Content-aware video stabilization
KR101650999B1 (en) Rendering mode selection in graphics processing units
US7843462B2 (en) System and method for displaying a digital video sequence modified to compensate for perceived blur
TWI578266B (en) Varying effective resolution by screen location in graphics processing by approximating projection of vertices onto curved viewport
US9129443B2 (en) Cache-efficient processor and method of rendering indirect illumination using interleaving and sub-image blur
US20190139269A1 (en) Enhanced anti-aliasing by varying sample patterns spatially and/or temporally
CN105745914B (en) Method and system for inverse tone mapping (ITM)
KR20130029149A (en) Method and apparatus for graphic processing using post shader
US8624894B2 (en) Apparatus and method of early pixel discarding in graphic processing unit
US8035641B1 (en) Fast depth of field simulation
McGuire Ambient occlusion volumes
Zhou et al. Accurate depth of field simulation in real time
JP4938850B2 (en) Graphic processing unit with extended vertex cache
US9495790B2 (en) Gradient adjustment for texture mapping to non-orthonormal grid