WO2023210884A1 - Non-local means-based noise removal device and method - Google Patents

Non-local means-based noise removal device and method

Info

Publication number
WO2023210884A1
Authority
WO
WIPO (PCT)
Prior art keywords
noise removal
pixel
image
reference image
window
Prior art date
Application number
PCT/KR2022/015083
Other languages
English (en)
Korean (ko)
Inventor
최한준
윤형민
권혁주
최우석
Original Assignee
주식회사 실리콘아츠
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 주식회사 실리콘아츠 filed Critical 주식회사 실리콘아츠
Publication of WO2023210884A1 publication Critical patent/WO2023210884A1/fr

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/70 Denoising; Smoothing
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00 3D [Three Dimensional] image rendering
    • G06T15/06 Ray-tracing
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00 3D [Three Dimensional] image rendering
    • G06T15/10 Geometric effects
    • G06T15/20 Perspective computation
    • G06T15/205 Image-based rendering
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/10 Segmentation; Edge detection
    • G06T7/11 Region-based segmentation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/90 Determination of colour characteristics
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/20 Special algorithmic details
    • G06T2207/20024 Filtering details
    • G06T2207/20032 Median filtering

Definitions

  • The present invention relates to noise removal technology, and more specifically to a non-local means-based noise removal device and method that can obtain noise removal results of better image quality while increasing processing speed, despite a low number of samples per pixel in the path tracing process.
  • An embodiment of the present invention seeks to provide a non-local means-based noise removal device and method that can obtain noise removal results of better image quality while increasing processing speed, despite a low number of samples per pixel in the path tracing process.
  • An embodiment of the present invention can denoise the color image by generating an albedo image and a normal image in addition to the color image at a low number of samples per pixel in path tracing, and by generating a reference image from the albedo and normal images and referring to it.
  • A non-local means-based noise removal device includes: a color image generator that generates a color image corresponding to a 3D scene through path tracing of the 3D scene; a reference image generator that generates a reference image corresponding to the color image when the number of samples per pixel (SPP) of the path tracing is less than or equal to a preset threshold; and a noise removal unit that removes noise included in the color image by applying a weight value referenced from the reference image.
  • SPP samples per pixel
  • The reference image generator generates an albedo image and a normal image for the three-dimensional scene during the path tracing process, calculates the per-pixel average of the RGB component values of the normal image, and can calculate each pixel value of the reference image by independently adding the average of the corresponding pixel of the normal image to each RGB component value of that pixel of the albedo image.
  • The reference image generator may calculate a weighted sum of the RGB component values for each pixel of the normal image and calculate each pixel value of the reference image by applying the following equation, which involves the weighted sum, to each RGB component value of the corresponding pixel of the albedo image.
  • Ref_r = (Ar + (p1 × Nr + p2 × Ng + p3 × Nb)) / 2
  • Ref_g = (Ag + (p1 × Nr + p2 × Ng + p3 × Nb)) / 2
  • Ref_b = (Ab + (p1 × Nr + p2 × Ng + p3 × Nb)) / 2
  • Ref_r, Ref_g, and Ref_b are the pixel-wise component values of the reference image
  • Ar, Ag, and Ab are the pixel-wise component values of the albedo image
  • Nr, Ng, and Nb are the pixel-wise component values of the normal image
  • the reference image generator may filter the reference image by applying at least one filter including a median filter.
  • The noise removal unit defines a search area by selecting a specific pixel position of the reference image as the center pixel, determines a first window including the center pixel and at least one second window of the same size included in the search area, determines the weight value according to the similarity between the first window and the at least one second window, and can remove noise at the specific pixel position by accumulating, with the weight values applied, the pixel values at the center pixel positions of the at least one second window on the color image.
  • the noise remover may determine the weight value to have a higher value as the similarity increases.
  • the noise remover may determine the weight value to have a higher value as the distance between the first window and the at least one second window becomes shorter.
  • A non-local means-based noise removal method includes: generating, through a color image generator, a color image corresponding to a 3D scene by performing path tracing for the 3D scene; generating, through a reference image generator, a reference image corresponding to the color image when the number of samples per pixel (SPP) of the path tracing is less than or equal to a preset threshold; and removing, through a noise removal unit, noise included in the color image by applying a weight value referenced from the reference image.
  • SPP samples per pixel
  • the disclosed technology can have the following effects. However, since it does not mean that a specific embodiment must include all of the following effects or only the following effects, the scope of rights of the disclosed technology should not be understood as being limited thereby.
  • The non-local means-based noise removal device and method according to an embodiment of the present invention increase processing speed by applying the existing filter-based NLM instead of AI-based noise removal, and, by referring to the reference image generated from the albedo and normal images, can obtain noise removal results of better image quality than the existing NLM method.
  • The non-local means-based noise removal device and method according to an embodiment of the present invention can achieve high performance with a small size (gate count) and can generate high-quality photorealistic graphics images even at low SPP, thereby maximizing the performance of path tracing.
  • FIG. 1 is a diagram explaining a noise removal system according to the present invention.
  • FIG. 2 is a diagram explaining the system configuration of the noise removal device of FIG. 1.
  • FIG. 3 is a diagram explaining the functional configuration of the noise removal device of FIG. 1.
  • Figure 4 is a flowchart explaining the non-local means-based noise removal method according to the present invention.
  • Figure 7 is a diagram explaining the existing non-local means (NLM) method.
  • Figure 8 is a diagram explaining the non-local means method according to the present invention.
  • Figure 9 is a diagram explaining the process of generating a reference image from an albedo image and a normal image according to the present invention.
  • Figure 10 is a diagram comparing, for each SPP, the results before and after the noise removal (referred NLM) processing according to the present invention.
  • The terms first and second are used to distinguish one component from another component, and the scope of rights should not be limited by these terms.
  • a first component may be named a second component, and similarly, the second component may also be named a first component.
  • Identification codes (e.g., a, b, c, etc.) in each step are used for convenience of description rather than to describe the order of the steps; unless a specific order is clearly stated in context, the steps may occur in an order different from the stated one. That is, the steps may be performed in the stated order, substantially simultaneously, or in the reverse order.
  • the present invention can be implemented as computer-readable code on a computer-readable recording medium
  • the computer-readable recording medium includes all types of recording devices that store data that can be read by a computer system.
  • Examples of computer-readable recording media include ROM, RAM, CD-ROM, magnetic tape, floppy disk, and optical data storage devices. Additionally, the computer-readable recording medium can be distributed across computer systems connected to a network, so that computer-readable code can be stored and executed in a distributed manner.
  • FIG. 1 is a diagram explaining a noise removal system according to the present invention.
  • the noise removal system 100 may include a user terminal 110, a noise removal device 130, and a database 150.
  • the user terminal 110 may correspond to a terminal device operated by a user.
  • A user may be understood as one or more users, and each of the one or more users may correspond to one or more user terminals 110. That is, although a single user terminal 110 is shown in Figure 1, the first user may correspond to a first user terminal, the second user to a second user terminal, ..., and the nth user (where n is a natural number) to an nth user terminal.
  • The user terminal 110 can be implemented as a device constituting the noise removal system 100 according to the present invention, and can be modified and operated in various forms so that the noise removal system 100 can provide users with clear photos or images from which noise has been removed.
  • The user terminal 110 may be implemented as a smartphone, laptop, or desktop computer that operates in connection with the noise removal device 130, but is not necessarily limited thereto, and may also be implemented as a variety of other devices, including a tablet PC.
  • the user terminal 110 may be connected to the noise removal device 130 through a network, and a plurality of user terminals 110 may be connected to the noise removal device 130 at the same time.
  • The noise removal device 130 may be implemented as a server corresponding to a computer or program that performs the non-local means-based noise removal method according to the present invention. Additionally, the noise removal device 130 may be connected to the user terminal 110 through a wired network or a wireless network such as Bluetooth, WiFi, or LTE, and may transmit and receive data with the user terminal 110 through the network. Additionally, the noise removal device 130 may be implemented to operate in connection with an independent external system (not shown in FIG. 1).
  • the noise removal device 130 may be implemented as a cloud server, and in another embodiment, the noise removal device 130 may be implemented as a single graphics processing unit (Graphics Processing Unit, GPU).
  • GPU Graphics Processing Unit
  • the database 150 may correspond to a storage device that stores various information required during the operation of the noise removal device 130.
  • The database 150 may store information about path tracing or about the NLM-based noise removal algorithm, but is not necessarily limited thereto, and may store, in various forms, information collected or processed in the course of the noise removal device 130 performing the non-local means-based noise removal method according to the present invention.
  • The database 150 is shown as a device independent of the noise removal device 130, but is not necessarily limited thereto, and may of course be implemented as a logical storage device included in the noise removal device 130.
  • FIG. 2 is a diagram explaining the system configuration of the noise removal device of FIG. 1.
  • the noise removal device 130 may include a processor 210, a memory 230, a user input/output unit 250, and a network input/output unit 270.
  • The processor 210 can execute the noise removal procedure according to an embodiment of the present invention, manage the memory 230 that is read from or written to in this process, and schedule the synchronization time between the volatile memory and the non-volatile memory in the memory 230. The processor 210 can control the overall operation of the noise removal device 130, and is electrically connected to the memory 230, the user input/output unit 250, and the network input/output unit 270 to control the data flow between them.
  • the processor 210 may be implemented as a Central Processing Unit (CPU) or a Graphics Processing Unit (GPU) of the noise removal device 130.
  • CPU Central Processing Unit
  • GPU Graphics Processing Unit
  • The memory 230 may include an auxiliary memory implemented as non-volatile memory, such as a solid state disk (SSD) or hard disk drive (HDD), used to store the data required by the noise removal device 130, and a main memory implemented as volatile memory such as random access memory (RAM). Additionally, the memory 230 can store a set of instructions that, when executed by the electrically connected processor 210, carries out the noise removal method according to the present invention.
  • SSD solid state disk
  • HDD hard disk drive
  • RAM Random Access Memory
  • The user input/output unit 250 includes an environment for receiving user input and an environment for outputting specific information to the user, and may include, for example, an input device with an adapter such as a touch pad, touch screen, on-screen keyboard, or pointing device, and an output device with an adapter such as a monitor or touch screen. In one embodiment, the user input/output unit 250 may correspond to a computing device connected through a remote connection, in which case the noise removal device 130 may operate as an independent server.
  • The network input/output unit 270 provides a communication environment for connecting to the user terminal 110 through a network and may include an adapter for communication over, for example, a local area network (LAN), metropolitan area network (MAN), wide area network (WAN), or value added network (VAN). Additionally, the network input/output unit 270 may be implemented to provide short-range communication functions such as WiFi and Bluetooth, or wireless communication functions of 4G or higher, for wireless transmission of learning data.
  • LAN local area network
  • MAN metropolitan area network
  • WAN wide area network
  • VAN Value Added Network
  • FIG. 3 is a diagram explaining the functional configuration of the noise removal device of FIG. 1.
  • the noise removal device 130 may include a color image generator 310, a reference image generator 330, a noise removal unit 350, and a control unit 370.
  • The present invention does not need to include all of the above functional components at the same time, and may be implemented by omitting some of the above components or by selectively including some or all of them depending on the embodiment. Additionally, the present invention may be implemented as an independent module that selectively includes some of the above components, and the noise removal method according to the present invention may be performed through interworking between the modules. Hereinafter, the operation of each component will be described in detail.
  • The color image generator 310 may generate a color image corresponding to a 3D scene through path tracing of the 3D scene. That is, the color image generator 310 can create a color image of the 3D scene by tracing rays randomly distributed within each pixel of the camera space and calculating the effect of their interactions, such as collisions or reflections, with objects in the 3D scene. A color image is created by integrating the various sampled pixel values, and the overall rendering quality of the color image can be determined by how many times the sampling operation is repeated for the same pixel (see the sketch below).
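  • As an illustration of the sample-averaging step described above, the following minimal sketch accumulates SPP samples per pixel; scene and trace_ray are hypothetical placeholders standing in for an actual path tracer, which is not specified here.

```python
import numpy as np

def render_color_image(scene, width, height, spp, trace_ray):
    """Average `spp` Monte Carlo samples per pixel into a color image.

    `scene` and `trace_ray(scene, x, y)` are hypothetical placeholders for a
    real path tracer and are not defined by this document.
    """
    color = np.zeros((height, width, 3), dtype=np.float64)
    for _ in range(spp):
        for y in range(height):
            for x in range(width):
                # one RGB radiance sample along a randomly jittered ray;
                # averaging more samples (higher SPP) reduces noise
                color[y, x] += trace_ray(scene, x, y)
    return color / spp
```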
  • the reference image generator 330 may generate a reference image corresponding to a color image when the number of samples per pixel (SPP) of path tracing is less than or equal to a preset threshold.
  • the reference image generator 330 may perform path tracing by setting the number of samples per pixel (SPP, Samples Per Pixel) to a low value below the threshold to increase processing speed.
  • the number of samples per pixel may correspond to the number of repetitions of the sampling operation performed on the same pixel in the color image. In other words, as the number of samples per pixel increases, the quality of the color image may improve, and as the number of samples per pixel decreases, the quality may decrease due to increased noise in the color image.
  • the reference image generator 330 may generate a reference image as an additional image in addition to the color image.
  • the reference image is an image corresponding to a color image and may correspond to an image that does not contain noise generated during the path tracing process. That is, each pixel of the reference image can correspond one-to-one to each pixel of the color image, and can be generated using other noise-free images.
  • The reference image generator 330 generates an albedo image and a normal image for the 3D scene during the path tracing process, calculates the per-pixel average of the RGB component values of the normal image, and can calculate each pixel value of the reference image by independently adding that average to each RGB component value of the corresponding pixel of the albedo image.
  • the reference image generator 330 can generate a noise-free reference image using a noise-free albedo image and a normal image.
  • the albedo image may correspond to an image expressing only the color of the surface of an object excluding optical features.
  • a normal image may correspond to an image expressing a normal vector of a 3D scene.
  • The reference image generator 330 can generate an albedo image and a normal image while generating a color image through the path tracing process, and can determine each pixel value of the reference image using the corresponding pixel values of the albedo image and the normal image.
  • The reference image generator 330 may calculate the average of the RGB component values for each pixel of the normal image, considering that the RGB values of the normal image do not carry color meaning. Thereafter, the reference image generator 330 may calculate each pixel value of the reference image by adding the average calculated from the corresponding pixel of the normal image to each RGB component value of that pixel of the albedo image. That is, the R component value of the reference image can be calculated by adding the average of the RGB component values of the normal image to the R component value of the albedo image, the G component value of the reference image by adding that average to the G component value of the albedo image, and the B component value of the reference image by adding that average to the B component value of the albedo image.
  • The reference image generator 330 may calculate a weighted sum of the RGB component values for each pixel of the normal image and calculate each pixel value of the reference image by applying the following Equation 1, which involves the weighted sum, to each RGB component value of the corresponding pixel of the albedo image.
  • Ref_r = (Ar + (p1 × Nr + p2 × Ng + p3 × Nb)) / 2
  • Ref_g = (Ag + (p1 × Nr + p2 × Ng + p3 × Nb)) / 2
  • Ref_b = (Ab + (p1 × Nr + p2 × Ng + p3 × Nb)) / 2
  • Ref_r, Ref_g, and Ref_b are the pixel-specific component values of the reference image
  • Ar, Ag, and Ab are the pixel-specific component values of the albedo image
  • Nr, Ng, and Nb are the pixel-specific component values of the normal image
  • Here, p1, p2, and p3 are parameters multiplied by the respective components of the normal image, and the reference image can be created by adding the resulting weighted sum to each component of the albedo image (see the sketch below).
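  • A minimal NumPy sketch of Equation 1 follows, assuming the albedo and normal images are (H, W, 3) float arrays; the weights p1, p2, and p3 are left as parameters because their numeric values are not fixed here (with p1 = p2 = p3 = 1/3 the weighted sum becomes the simple per-pixel average described earlier).

```python
import numpy as np

def make_reference_image(albedo, normal, p=(1/3, 1/3, 1/3)):
    """Combine the albedo and normal images into a reference image (Equation 1).

    albedo, normal: (H, W, 3) float arrays in [0, 1].
    p: assumed weights (p1, p2, p3) for the normal's RGB components; their
    numeric values are left open here.
    """
    p1, p2, p3 = p
    # weighted sum of the normal's RGB components, one scalar per pixel
    n_sum = p1 * normal[..., 0] + p2 * normal[..., 1] + p3 * normal[..., 2]
    # add the scalar to every albedo channel, then halve (Equation 1)
    return (albedo + n_sum[..., None]) / 2.0
```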
  • the reference image generator 330 may filter the reference image by applying at least one filter including a median filter. For example, the reference image generator 330 may reduce the effect of aliasing on the reference image through median filtering. Additionally, the reference image generator 330 may additionally update the reference image by selectively applying various filters in addition to the median filter.
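  • For instance, the channel-wise median filtering mentioned above could be done with SciPy as sketched below; the 3 × 3 kernel size is an illustrative assumption rather than a value specified here.

```python
from scipy.ndimage import median_filter

def filter_reference_image(ref, size=3):
    """Median-filter each RGB channel of the reference image independently."""
    # size=(size, size, 1) filters spatially while leaving the channel axis untouched
    return median_filter(ref, size=(size, size, 1))
```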
  • the noise removal unit 350 may remove noise included in the color image by applying a weight value referenced in the reference image.
  • the noise removal unit 350 may use the weight extracted from the noise-free image to remove noise from the image containing noise.
  • The noise removal unit 350 can increase processing speed by applying the existing filter-based NLM instead of AI-based denoising, and, by using the weights referenced from the reference image in this process, can obtain noise removal results of better image quality than noise removal with the existing NLM method.
  • The noise removal unit 350 defines a search area by selecting a specific pixel position of the reference image as the center pixel, determines a first window containing the center pixel and at least one second window of the same size contained in the search area, determines a weight value based on the similarity between the first window and the at least one second window, and removes noise at the specific pixel position by accumulating, with the weight values applied, the pixel values at the center pixel positions of the at least one second window on the color image. This is explained in more detail with reference to FIGS. 7 and 8.
  • The noise removal unit 350 may determine the weight value to be higher as the similarity between the first window and the at least one second window increases. That is, the noise removal unit 350 can calculate a weight value that reflects the degree of similarity between windows. For example, the noise removal unit 350 may compare the pixel values of the center pixels of each window, assigning a higher weight as the difference in pixel values becomes smaller and, conversely, a lower weight as the difference becomes larger. As another example, the noise removal unit 350 may calculate the similarity between windows by comparing all pixels of each window and determine a weight based on that similarity.
  • The noise removal unit 350 may determine the weight value to be higher as the distance between the first window and the at least one second window becomes shorter. For example, the noise removal unit 350 may calculate the distance between the center pixel positions of each window, assigning a higher weight as the distance becomes smaller and a lower weight as the distance becomes larger.
  • the noise remover 350 may determine a weight value based on the similarity between the first window and the second window and perform normalization on the weight value. That is, the weight value can be converted to a value within a predetermined range through normalization. For example, the weight value can have a value between 0 and 1 through normalization.
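  • One common way to turn window similarity into normalized weights is an exponential of the mean squared difference between reference-image patches, as in the sketch below; the filtering strength h is an assumed tuning parameter that is not specified here.

```python
import numpy as np

def window_weights(candidate_patches, center_patch, h=0.1):
    """Turn window similarity into normalized weights.

    candidate_patches: (K, w, w, 3) second-window patches from the reference image.
    center_patch: (w, w, 3) first-window patch.
    h: assumed filtering strength; smaller h makes weights fall off faster.
    """
    # mean squared difference between each second window and the first window
    d2 = np.mean((candidate_patches - center_patch) ** 2, axis=(1, 2, 3))
    w = np.exp(-d2 / (h * h))   # higher similarity (smaller d2) -> larger weight
    return w / w.sum()          # normalized weights lie in [0, 1] and sum to 1
```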
  • The control unit 370 can control the overall operation of the noise removal device 130 and manage the control flow or data flow between the color image generator 310, the reference image generator 330, and the noise removal unit 350.
  • Figure 4 is a flowchart explaining the non-local means-based noise removal method according to the present invention.
  • The noise removal device 130 can generate, through the color image generator 310, a color image corresponding to the 3D scene by performing path tracing for the 3D scene (step S410).
  • The noise removal device 130 can generate, through the reference image generator 330, a reference image corresponding to the color image when the number of samples per pixel (SPP) of the path tracing is less than or equal to a preset threshold (step S430).
  • the noise removal device 130 may remove noise included in the color image by applying the weight value referenced in the reference image through the noise removal unit 350 (step S450).
  • images rendered at low SPP may contain a lot of noise.
  • When NLM-based noise removal is performed to remove the noise, an image of poor quality may be created because the noise is not properly removed, as shown in figure (b).
  • OID-based noise removal can create images with better quality, as shown in Figure (c).
  • OID may correspond to an AI-based method such as Open Image Denoiser or OptiX.
  • Because the noise is greatly reduced, noise removal using the NLM method with only the color image can be applied.
  • Figure 7 is a diagram explaining the existing non-local means (NLM) method.
  • The non-local means-based noise removal algorithm belongs to a family of filters developed from the bilateral filter.
  • Whereas the bilateral filter assigns weights based on pixel-level similarity judgments, the non-local means approach extends this by assigning weights based on the similarity of regional groups composed of a plurality of pixels.
  • To remove noise, the non-local means (NLM) method compares a window of a certain size around the center pixel (i.e., the first window) with windows of the same size within a search range around the center pixel (i.e., the second windows), and the degree of similarity can be determined as a weight value.
  • The non-local means (NLM) method may correspond to an algorithm that removes noise by accumulating the center pixel values of the surrounding windows while taking the weights into account.
  • In the non-local means (NLM) method, higher weights may be assigned to windows that are more similar to each other and lower weights to windows that are less similar, so that the center pixel values of similar neighboring windows contribute to noise removal. However, if the noise is severe, it may be difficult to obtain proper weights by computing similarity, and in that case the noise removal effect may be reduced.
  • For noise removal, a search area can be defined by selecting a specific pixel location in the color image containing noise as the center pixel.
  • The search area may correspond to a square area with a size of 11 × 11 centered on the center pixel.
  • A first window including the center pixel may be defined; in the case of FIG. 7, it corresponds to the red center window with a size of 3 × 3.
  • A corresponding second window may also be defined, and the second window is determined to have the same size as the first window within the search area.
  • A plurality of second windows may be defined within the search area; in the case of FIG. 7, they correspond to the two second windows shown at the upper part of the search area.
  • NLM non-local mean
  • Noise removal using the non-local means (NLM) method can be expressed as Equation 2 below, with the weights normalized so that they sum to 1.
  • C(x, y) = Σ weight(i, j) × P(i, j), summed over the center positions (i, j) of the second windows within the search area
  • C(x, y) is the denoised pixel value of the center pixel of the first window
  • weight(i, j) is the weight assigned to the second window whose center pixel is at (i, j)
  • P(i, j) is the pixel value of the center pixel of that second window.
  • NLM non-local mean
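  • A compact sketch of Equation 2 for a single pixel of the noisy color image follows, using the 11 × 11 search area and 3 × 3 windows of FIG. 7; the patch comparison and the filtering strength h are illustrative assumptions.

```python
import numpy as np

def nlm_pixel(img, x, y, search=11, win=3, h=0.1):
    """Classic NLM estimate of img[y, x] (Equation 2); weights come from img itself.

    Assumes (x, y) lies at least win // 2 pixels away from the image border;
    search, win, and h are illustrative values.
    """
    H, W, _ = img.shape
    r, s = win // 2, search // 2
    first = img[y - r:y + r + 1, x - r:x + r + 1]               # first window
    acc, total = np.zeros(3), 0.0
    for j in range(max(r, y - s), min(H - r, y + s + 1)):
        for i in range(max(r, x - s), min(W - r, x + s + 1)):
            second = img[j - r:j + r + 1, i - r:i + r + 1]      # second window
            w = np.exp(-np.mean((second - first) ** 2) / (h * h))
            acc += w * img[j, i]    # accumulate the weighted center pixel value
            total += w
    return acc / total              # C(x, y), normalized by the sum of weights
```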
  • Figure 8 is a diagram explaining the non-local means method according to the present invention.
  • The noise removal device 130 can effectively remove noise included in a color image by performing the non-local means-based noise removal method according to the present invention.
  • Specifically, by using weights derived from the noise-free albedo and normal images when removing noise in the non-local means manner, the noise removal device 130 accumulates only the accurate pixel values of the surrounding pixels that should actually be reflected, and the noise removal effect can thereby be maximized.
  • the noise removal device 130 may generate a color image containing noise and a reference image without noise in a low SPP state.
  • The noise removal device 130 applies weight values derived from the reference image to the non-local means noise removal process performed on the color image, thereby increasing processing speed and obtaining noise removal results of better image quality. That is, the weight value determined with reference to the reference image can be applied to each pixel value on the color image and used to remove noise at the target pixel location, as sketched below.
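  • The sketch below differs from the classic NLM sketch above in one place: the window similarity, and hence the weight, is measured on the noise-free reference image, while the accumulated center-pixel values still come from the noisy color image. Parameter values remain illustrative assumptions.

```python
import numpy as np

def referred_nlm_pixel(color, ref, x, y, search=11, win=3, h=0.1):
    """Denoise color[y, x] using weights measured on the reference image.

    color: noisy (H, W, 3) image from low-SPP path tracing.
    ref:   noise-free (H, W, 3) reference image built from albedo and normal.
    Assumes (x, y) lies at least win // 2 pixels from the border; parameter
    values are illustrative.
    """
    H, W, _ = color.shape
    r, s = win // 2, search // 2
    first = ref[y - r:y + r + 1, x - r:x + r + 1]               # first window, on the reference
    acc, total = np.zeros(3), 0.0
    for j in range(max(r, y - s), min(H - r, y + s + 1)):
        for i in range(max(r, x - s), min(W - r, x + s + 1)):
            second = ref[j - r:j + r + 1, i - r:i + r + 1]      # second window, on the reference
            w = np.exp(-np.mean((second - first) ** 2) / (h * h))
            acc += w * color[j, i]  # the accumulated pixel value comes from the color image
            total += w
    return acc / total
```

  • Because the reference image carries no Monte Carlo noise, the weights remain reliable even at an SPP of 1, which is what allows the accumulation to suppress the noise in the color image.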
  • Figure 9 is a diagram explaining the process of generating a reference image from an albedo image and a normal image according to the present invention.
  • the noise removal device 130 may generate an albedo image and a normal image in addition to a color image when the SPP is below the threshold during the path tracing process.
  • The noise removal device 130 may generate a reference image using the albedo image and the normal image. More specifically, since the color itself has no meaning in the normal image, each of its components is multiplied by the parameters p1, p2, and p3, the resulting sum is added to each color component of the albedo image, and the result is divided by 2 to calculate each pixel value of the reference image.
  • Figure 10 is a diagram comparing, for each SPP, the results before and after the noise removal (referred NLM) processing according to the present invention.
  • The noise removal device 130 may perform NLM-based noise removal by applying weight values derived from the reference image to the color image containing noise. That is, even in cases where the noise is severe, such as when the SPP is 1, 2, or 4 in the path tracing process for a 3D scene, a high noise removal effect (SPP1 denoised, SPP2 denoised, SPP4 denoised) can be obtained through the noise removal method according to the present invention.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Graphics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Computing Systems (AREA)
  • Geometry (AREA)
  • Image Processing (AREA)
  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)

Abstract

The present invention relates to a non-local means-based noise removal device and method, the device comprising: a color image generation unit for generating a color image corresponding to a 3D scene by performing path tracing for the 3D scene; a reference image generation unit for generating, in the case where the number of samples per pixel (SPP) in the path tracing is less than or equal to a preconfigured threshold value, a reference image corresponding to the color image; and a noise removal unit for removing the noise included in the color image by applying a weight value referenced in the reference image.
PCT/KR2022/015083 2022-04-29 2022-10-07 Non-local means-based noise removal device and method WO2023210884A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020220053586A KR102638038B1 (ko) 2022-04-29 2022-04-29 Non-local means-based noise removal device and method
KR10-2022-0053586 2022-04-29

Publications (1)

Publication Number Publication Date
WO2023210884A1 true WO2023210884A1 (fr) 2023-11-02

Family

ID=88518997

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2022/015083 WO2023210884A1 (fr) 2022-04-29 2022-10-07 Non-local means-based noise removal device and method

Country Status (2)

Country Link
KR (1) KR102638038B1 (fr)
WO (1) WO2023210884A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117665788A (zh) * 2024-02-01 2024-03-08 湖南科技大学 A noise processing method based on microwave measurement data

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20150037369A (ko) * 2013-09-30 2015-04-08 삼성전자주식회사 Method for reducing noise in an image and image processing apparatus using the same

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100989760B1 (ko) 2008-12-29 2010-10-26 엠텍비젼 주식회사 Image processing apparatus, noise removal method of the image processing apparatus, and recording medium on which the noise removal method is recorded

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20150037369A (ko) * 2013-09-30 2015-04-08 삼성전자주식회사 Method for reducing noise in an image and image processing apparatus using the same

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
AYAN CHAKRABARTI; KALYAN SUNKAVALLI: "Single-image RGB Photometric Stereo With Spatially-varying Albedo", ARXIV.ORG, CORNELL UNIVERSITY LIBRARY, 201 OLIN LIBRARY CORNELL UNIVERSITY ITHACA, NY 14853, 14 September 2016 (2016-09-14), 201 Olin Library Cornell University Ithaca, NY 14853 , XP080726486, DOI: 10.1109/3DV.2016.34 *
FUKAO YOSHIKI; KAWAHARA RYO; NOBUHARA SHOHEI; NISHINO KO: "Polarimetric Normal Stereo", 2021 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR), IEEE, 20 June 2021 (2021-06-20), pages 682 - 690, XP034011014, DOI: 10.1109/CVPR46437.2021.00074 *
HANGMING FAN; RUI WANG; YUCHI HUO; HUJUN BAO: "Real-time Monte Carlo Denoising with Weight Sharing Kernel Prediction Network", ARXIV.ORG, CORNELL UNIVERSITY LIBRARY, 201 OLIN LIBRARY CORNELL UNIVERSITY ITHACA, NY 14853, 12 February 2022 (2022-02-12), 201 Olin Library Cornell University Ithaca, NY 14853, XP091159040, DOI: 10.1111/cgf.14338 *
ZUO XINXIN; WANG SEN; ZHENG JIANGBIN; PAN ZHIGENG; YANG RUIGANG: "Detailed Surface Geometry and Albedo Recovery from RGB-D Video under Natural Illumination", IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, IEEE COMPUTER SOCIETY., USA, vol. 42, no. 10, 22 November 2019 (2019-11-22), USA , pages 2720 - 2734, XP011807139, ISSN: 0162-8828, DOI: 10.1109/TPAMI.2019.2955459 *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117665788A (zh) * 2024-02-01 2024-03-08 湖南科技大学 A noise processing method based on microwave measurement data
CN117665788B (zh) * 2024-02-01 2024-04-05 湖南科技大学 A noise processing method based on microwave measurement data

Also Published As

Publication number Publication date
KR102638038B1 (ko) 2024-02-21
KR20230154355A (ko) 2023-11-08

Similar Documents

Publication Publication Date Title
WO2020138680A1 (fr) Appareil de traitement d'image, et procédé de traitement d'image associé
WO2019050360A1 (fr) Dispositif électronique et procédé de segmentation automatique d'être humain dans une image
EP3642802A1 (fr) Appareil d'édition d'image utilisant une carte de profondeur et son procédé
WO2020226317A1 (fr) Appareil de traitement d'image et procédé de traitement d'image associé
WO2011087289A2 (fr) Procédé et système pour réaliser le rendu de vues tridimensionnelles d'une scène
WO2021133001A1 (fr) Procédé et dispositif d'inférence d'image sémantique
WO2022131497A1 (fr) Appareil d'apprentissage et procédé de génération d'image, et appareil et procédé de génération d'image
WO2023210884A1 (fr) Dispositif et procédé d'élimination de bruit basés sur de moyens non locaux
EP2329655A2 (fr) Appareil et procédé pour obtenir une image à haute résolution
WO2021080145A1 (fr) Appareil et procédé de remplissage d'image
WO2020060019A1 (fr) Dispositif, procédé et système de détection de caractère
WO2017213439A1 (fr) Procédé et appareil de génération d'une image à l'aide de multiples autocollants
WO2022197066A1 (fr) Mélange de pixels pour synthétiser des trames vidéo avec gestion d'occlusion et de tatouage numérique
WO2020101434A1 (fr) Dispositif de traitement d'image et procédé de reciblage d'image
WO2024039121A1 (fr) Procédé et appareil de rendu sélectif multi-niveau pour améliorer la performance de rendu
WO2022097766A1 (fr) Procédé et dispositif de restauration de zone masquée
WO2012034469A1 (fr) Système et procédé d'interaction homme-machine à base de gestes et support de stockage informatique
WO2020138630A1 (fr) Dispositif d'affichage et procédé de traitement d'image associé
WO2020050550A1 (fr) Procédés et systèmes de réalisation d'opérations de modification sur un support
WO2020101300A1 (fr) Appareil de traitement d'image et son procédé de fonctionnement
WO2016098943A1 (fr) Procédé et système de traitement d'images pour améliorer la capacité de détection de visages
WO2021107592A1 (fr) Système et procédé de retouche d'image précise pour éliminer un contenu non souhaité d'images numériques
WO2015023106A1 (fr) Appareil et procédé de traitement d'image
WO2023224261A1 (fr) Procédé de dépersonnalisation d'une région liée à la confidentialité dans une image et dispositif de dépersonnalisation l'utilisant
WO2024147483A1 (fr) Interface de filtrage d'image

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22940378

Country of ref document: EP

Kind code of ref document: A1