CN114697571A - Image sensing device and operation method thereof - Google Patents
Image sensing device and operation method thereof
- Publication number
- CN114697571A (application number CN202111386521.1A)
- Authority
- CN
- China
- Prior art keywords
- image
- noise
- value
- generates
- sensing device
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N25/00—Circuitry of solid-state image sensors [SSIS]; Control thereof
- H04N25/60—Noise processing, e.g. detecting, correcting, reducing or removing noise
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/70—Denoising; Smoothing
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/80—Camera processing pipelines; Components thereof
- H04N23/84—Camera processing pipelines; Components thereof for processing colour signals
- H04N23/88—Camera processing pipelines; Components thereof for processing colour signals for colour balance, e.g. white-balance circuits or colour temperature control
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/222—Studio circuitry; Studio devices; Studio equipment
- H04N5/262—Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
- H04N5/268—Signal distribution or switching
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/08—Learning methods
- G06N3/084—Backpropagation, e.g. using gradient descent
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/60—Image enhancement or restoration using machine learning, e.g. neural networks
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/80—Camera processing pipelines; Components thereof
- H04N23/82—Camera processing pipelines; Components thereof for controlling camera response irrespective of the scene brightness, e.g. gamma correction
- H04N23/83—Camera processing pipelines; Components thereof for controlling camera response irrespective of the scene brightness, e.g. gamma correction specially adapted for colour signals
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/80—Camera processing pipelines; Components thereof
- H04N23/84—Camera processing pipelines; Components thereof for processing colour signals
- H04N23/843—Demosaicing, e.g. interpolating colour pixel values
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N25/00—Circuitry of solid-state image sensors [SSIS]; Control thereof
- H04N25/60—Noise processing, e.g. detecting, correcting, reducing or removing noise
- H04N25/61—Noise processing, e.g. detecting, correcting, reducing or removing noise the noise originating only from the lens unit, e.g. flare, shading, vignetting or "cos4"
- H04N25/611—Correction of chromatic aberration
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/64—Circuits for processing colour signals
- H04N9/646—Circuits for processing colour signals for image enhancement, e.g. vertical detail restoration, cross-colour elimination, contour correction, chrominance trapping filters
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10024—Color image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20081—Training; Learning
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20084—Artificial neural networks [ANN]
Abstract
The present application relates to an image sensing device and an operating method thereof. Disclosed is an image sensing device, including: a reverse pipeline adapted to generate an original image based on a source image without real noise; a noise generator adapted to generate a noise image corresponding to the real image by applying a noise value modeling the real noise value for each pixel to the original image; and a pipeline adapted to generate a dataset image corresponding to the source image based on the noise image.
Description
Technical Field
Various embodiments of the present disclosure relate to a semiconductor design technology, and more particularly, to an image sensing apparatus and an operating method thereof.
Background
An image sensing device is a device for capturing an image using the characteristic of a semiconductor that reacts to light. Image sensing devices are generally classified into Charge Coupled Device (CCD) image sensing devices and Complementary Metal Oxide Semiconductor (CMOS) image sensing devices. Recently, CMOS image sensing devices are widely used because they can allow analog and digital control circuits to be directly implemented on a single Integrated Circuit (IC).
Disclosure of Invention
Various embodiments of the present disclosure relate to an image sensing device capable of learning and removing the true noise (rather than Gaussian noise) occurring therein based on a deep-learning technique, and an operating method thereof.
According to an embodiment of the present disclosure, an image sensing apparatus may include: a reverse pipeline adapted to generate an original image based on a source image without real noise; a noise generator adapted to generate a noise image corresponding to the real image by applying a noise value obtained by modeling a real noise value for each pixel to the original image; and a pipeline adapted to generate a dataset image corresponding to the source image based on the noise image.
The noise generator may model the noise values based on each image value included in the original image.
The noise value may be calculated using the square root of each image value.
The reverse pipeline may include: an inverse gamma module adapted to receive a source image and generate a first image prior to applying gamma correction thereto based on an inverse gamma function; a reverse demosaicing module adapted to receive the first image and generate a second image before a demosaicing operation is performed thereon based on the set color pattern; an inverse white balance module adapted to receive the second image and generate a third image before a white balance operation is performed thereon based on a gain value according to the sensitivity; and an inverse correction module adapted to receive the third image and generate an original image before lens shading correction is applied thereto based on the gain value according to the luminance.
The pipeline may include: a correction module adapted to receive a noise image and generate a fourth image to which lens shading correction is applied based on a gain value according to a position of the image; a white balance module adapted to receive the fourth image and generate a fifth image on which a white balance operation is performed based on a gain value according to the sensitivity; a demosaicing module adapted to receive the fifth image and generate a sixth image on which a demosaicing operation is performed; and a gamma module adapted to receive the sixth image and generate a data set image to which gamma correction is applied based on a gamma function.
The image sensing device may further comprise a learning processor adapted to learn true noise based on the dataset image and to remove the true noise from the true image.
According to an embodiment of the present invention, an image sensing apparatus may include: a noise processor adapted to generate a dataset image by applying a noise value modeling a true noise value for each pixel to a source image without true noise; and a learning processor adapted to learn the true noise based on the dataset image and remove the true noise from a true image corresponding to the source image.
The noise processor may convert a source image into an original image having a set color pattern and then model a noise value based on each image value included in the original image.
The noise processor may include: a reverse pipeline adapted to generate an original image based on a source image; a noise generator adapted to generate a noise image corresponding to the real image by applying the noise value to the original image; and a pipeline adapted to generate a dataset image based on the noise image.
The noise generator may model the noise values based on each image value included in the original image.
The noise value may be calculated using the square root of each image value.
The reverse pipeline may include: an inverse gamma module adapted to receive a source image and generate a first image prior to applying gamma correction thereto based on an inverse gamma function; a reverse demosaicing module adapted to receive the first image and generate a second image before a demosaicing operation is performed thereon based on a predetermined color pattern; an inverse white balance module adapted to receive the second image and generate a third image before a white balance operation is performed thereon based on a gain value according to the sensitivity; and an inverse correction module adapted to receive the third image and generate an original image before lens shading correction is applied thereto based on the gain value according to the luminance.
The pipeline may include: a correction module adapted to receive a noise image and generate a fourth image to which lens shading correction is applied based on a gain value according to a position of the image; a white balance module adapted to receive the fourth image and generate a fifth image on which a white balance operation is performed based on a gain value according to the sensitivity; a demosaicing module adapted to receive the fifth image and generate a sixth image on which a demosaicing operation is performed; and a gamma module adapted to receive the sixth image and generate a data set image to which gamma correction is applied based on a gamma function.
According to an embodiment of the present invention, an operating method of an image sensing apparatus may include: generating an original image from the image by operation of an inverse mapping pipeline during a learning mode period; modeling, for each pixel, a true noise value based on an image value included in an original image during a learning mode period; generating, by operation of the pipeline, a dataset image by applying a noise value modeling a true noise value to the original image during the learning mode period; and learning a noise value based on the original image and the dataset image.
The operating method may further include: generating a target image corresponding to the real image through operation of the pipeline during the capture mode period; and generating an output image by denoising, during a capture mode period, true noise applied to the true image from the target image according to a learning result of the noise value.
According to an embodiment of the present invention, an image sensing apparatus may include: a reverse pipeline adapted to convert a source image into an original image comprising image values corresponding to a plurality of pixels; a noise generator adapted to generate a noise image comprising a plurality of noise values for image values of an original image, wherein each noise value is determined on the basis of each of a plurality of pixels; a pipeline adapted to generate a dataset image based on a noisy image; and a learning processor adapted to remove noise from the real image based on the dataset image.
Drawings
Fig. 1 is a block diagram illustrating an image sensing apparatus according to an embodiment of the present disclosure.
Fig. 2 is a block diagram illustrating the image sensor shown in fig. 1 according to an embodiment of the present disclosure.
Fig. 3 is a diagram illustrating an example of the pixel array shown in fig. 2 according to an embodiment of the present disclosure.
Fig. 4 is a block diagram illustrating the image processor shown in fig. 1 according to an embodiment of the present disclosure.
Fig. 5 is a block diagram illustrating the noise processor shown in fig. 4 according to an embodiment of the present disclosure.
Fig. 6 is a block diagram illustrating an example of the reverse pipeline shown in fig. 5, according to an embodiment of the present disclosure.
Fig. 7A and 7B are graphs corresponding to the gamma function and the inverse gamma function, respectively, associated with the gamma module shown in fig. 6, according to an embodiment of the present disclosure.
Fig. 8 is a block diagram illustrating an example of the pipeline shown in fig. 5 according to an embodiment of the present disclosure.
Fig. 9 is a diagram illustrating an operation of the image sensing apparatus shown in fig. 1 according to an embodiment of the present disclosure.
Detailed Description
Various embodiments of the present disclosure are described below with reference to the accompanying drawings in order to describe the present disclosure in detail so that those skilled in the art to which the present disclosure pertains can easily carry out the technical spirit of the present disclosure.
It will be understood that when an element is referred to as being "connected to" or "coupled to" another element, it can be directly connected or coupled to the other element or be electrically connected or coupled to the other element with one or more elements interposed therebetween. In addition, it will be further understood that the terms "comprises," "comprising," "includes" and "including," when used in this specification, do not preclude the presence of one or more other elements, but may further comprise or have one or more other elements, unless otherwise specified. In the description throughout the specification, some components are described in the singular, but the present disclosure is not limited thereto, and it will be understood that these components may be formed in plural.
Fig. 1 is a block diagram illustrating an image sensing apparatus 10 according to an embodiment of the present disclosure.
Referring to fig. 1, the image sensing apparatus 10 may include an image sensor 100 and an image processor 200.
The image sensor 100 may generate a real image IMG from incident light.
The image processor 200 may generate an output image DIMG based on the real image IMG to which the real noise (hereinafter, referred to as "first real noise") is applied and the source image RGB without the real noise (hereinafter, referred to as "second real noise"). For example, the image processor 200 may apply real noise (hereinafter, referred to as "third real noise") to the source image RGB, learn the source image RGB having the third real noise, and generate the output image DIMG by denoising or removing the first real noise applied to the real image IMG according to the learning result. The third true noise may include a noise value that models the true noise value for each pixel.
The first to third true noises can be distinguished from gaussian noise. The first to third true noises may have different intensities depending on the level of the pixel signal, while the gaussian noise has the same intensity regardless of the level of the pixel signal. The real image IMG may be an image captured where the light source is insufficient so that the first real noise occurs, i.e., a low light image. The source image RGB may be an image previously stored in the image sensing device 10 or an image provided by an external device (not shown). For example, the source image RGB may be an image captured where the light source is sufficient so that the second true noise does not occur, i.e., a highlight image.
Fig. 2 is a block diagram illustrating the image sensor 100 shown in fig. 1 according to an embodiment of the present disclosure.
Referring to fig. 2, the image sensor 100 may include a pixel array 110 and a signal converter 120.
The pixel array 110 may include a plurality of pixels arranged in a row direction and a column direction (refer to fig. 3). The pixel array 110 may generate an analog type image value VPX for each row. For example, pixel array 110 may generate image values VPX from pixels arranged in a first row during a first row time and generate image values VPX from pixels arranged in an nth row during an nth row time (where "n" is an integer greater than 2).
The signal converter 120 may convert the analog type image value VPX into the digital type image value DPX. The real image IMG may comprise image values DPX. For example, the signal converter 120 may include an analog-to-digital converter.
Fig. 3 is a diagram illustrating an example of the pixel array 110 shown in fig. 2 according to an embodiment of the present disclosure.
Referring to fig. 3, the pixel array 110 may be arranged in a predetermined color filter pattern. For example, the predetermined color filter pattern may be a Bayer pattern. The Bayer pattern may be composed of repeating units each having 2 × 2 pixels. In each cell, two pixels G and G each having a green filter (hereinafter, referred to as "green") may be disposed to diagonally face each other at corners of the cell, and a pixel B having a blue filter (hereinafter, referred to as "blue") and a pixel R having a red filter (hereinafter, referred to as "red") may be disposed at other corners of the cell. The four pixels G, R, B and G are not necessarily limited to the arrangement shown in fig. 3, but may be disposed in various ways according to the Bayer pattern described above.
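For illustration, the repeating 2 × 2 Bayer unit described above may be sketched as follows; the function name and the particular G/B/R arrangement are illustrative only, since the disclosure notes the four pixels may be disposed in various ways:

```python
import numpy as np

def bayer_pattern(height, width):
    """Build a map of color-filter labels for a Bayer pixel array.

    Each repeating 2x2 unit places green on one diagonal and
    blue/red on the other corners, as described for Fig. 3.
    """
    unit = np.array([["G", "B"],
                     ["R", "G"]])  # one of several valid Bayer arrangements
    return np.tile(unit, (height // 2, width // 2))

pattern = bayer_pattern(4, 4)
```

Half of the pixel sites carry the green filter, matching the two diagonal G pixels per unit.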
Although the present embodiment describes the pixel array 110 as having a Bayer pattern as an example, the present disclosure is not necessarily limited thereto, and may also have various patterns such as a quad pattern (quad pattern).
Fig. 4 is a block diagram illustrating the image processor 200 shown in fig. 1 according to an embodiment of the present disclosure.
Referring to fig. 4, the image processor 200 may include a noise processor 210 and a learning processor 220.
The noise processor 210 may generate a dataset image NRGB2 by applying noise values to the source image RGB. The data set image NRGB2 may be an image separated for each color channel. The noise processor 210 may convert the source image RGB into an original image IIMG having a predetermined color pattern (i.e., a Bayer pattern), and then model a noise value based on each image value included in the original image IIMG. The noise processor 210 may generate the data set image NRGB2 by applying a noise value to the original image IIMG. The noise processor 210 may generate a target image NRGB1 based on the real image IMG. The target image NRGB1 may be a separate image for each color channel.
The learning processor 220 may learn the third true noise based on the data set image NRGB2, and remove the first true noise from the target image NRGB1 corresponding to the true image IMG.
Fig. 5 is a block diagram illustrating the noise processor 210 shown in fig. 4 according to an embodiment of the present disclosure.
Referring to fig. 5, the noise processor 210 may include a reverse pipeline 211, a noise generator 213, and a pipeline 215.
The reverse pipeline 211 may generate an original image IIMG based on the source image RGB. The source image RGB may be an image separated for each color channel, and the original image IIMG may be an image having a Bayer pattern. The reverse pipeline 211 may reverse the operation of the mapping pipeline 215 and generate the original image IIMG.
The noise generator 213 may generate a noise image NIMG corresponding to the real image IMG by applying noise values to the original image IIMG. According to an example, based on "equation 1" described below, the noise generator 213 may generate the output image values included in the noise image NIMG by applying the noise values to each input image value (i.e., pixel) included in the original image IIMG.
[Formula 1]
M = N + √N × RV
Here, "M" may refer to each output image value, "N" may refer to each input image value, "√N" may refer to the noise value modeled corresponding to each input image value, and "RV" may refer to a random value. The random value may refer to any value randomly selected from among values following a standard normal distribution. The probability density function f(RV) from which the random value is selected may be calculated as shown in "Formula 2" below.
[Formula 2]
f(RV) = (1/√(2π)) × e^(−RV²/2)
According to another example, the noise generator 213 may generate the output image values included in the noise image NIMG by applying the noise values to each input image value included in the original image IIMG based on "equation 3" described below.
[Formula 3]
M = N + RV2
Here, "M" may refer to each output image value, "N" may refer to each input image value, and "RV2" may refer to a random value. The random value may refer to any value randomly selected from among values following a normal distribution whose standard deviation is √N. The probability density function f(RV2) from which the random value is selected may be calculated as shown in "Formula 4" below.
[Formula 4]
f(RV2) = (1/√(2πN)) × e^(−RV2²/(2N))
As shown in "Formula 4" above, when the random value is selected from among values following the normal distribution, the square root of each input image value (i.e., √N) may be used as the standard deviation.
As described in "Formula 1" to "Formula 4" above, the noise generator 213 may model the noise value based on each image value (i.e., each input image value) included in the original image IIMG. Because the noise value is calculated from the square root of each input image value, the intensity of the noise varies with the image value.
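The noise model above (an output value M = N + √N × RV, with RV drawn from a standard normal distribution) may be sketched as follows; the function name is illustrative, not part of the disclosure:

```python
import numpy as np

def apply_signal_dependent_noise(raw, rng=None):
    """Apply per-pixel noise whose standard deviation is sqrt(N).

    Equivalent formulations: M = N + sqrt(N) * RV with RV ~ N(0, 1),
    or M = N + RV2 with RV2 ~ N(0, N). Unlike Gaussian noise of fixed
    intensity, the noise here scales with each input image value.
    """
    rng = np.random.default_rng(0) if rng is None else rng
    rv = rng.standard_normal(raw.shape)  # standard normal random values
    return raw + np.sqrt(raw) * rv       # noise intensity grows with N

raw = np.full((2, 2), 100.0)             # flat 100-count raw image
noisy = apply_signal_dependent_noise(raw)
```

Brighter pixels receive proportionally stronger noise, which is the key difference from a Gaussian-noise model.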
The pipeline 215 may generate a data set image NRGB2 based on the noise image NIMG and a target image NRGB1 based on the real image IMG. The noise image NIMG and the real image IMG may be images each having a Bayer pattern, and the data set image NRGB2 and the target image NRGB1 may be images separated for each color channel.
Fig. 6 is a block diagram illustrating an example of the reverse pipeline 211 shown in fig. 5, according to an embodiment of the present disclosure. Fig. 7A is a graph corresponding to a gamma function according to an embodiment of the present disclosure, and fig. 7B is a graph corresponding to an inverse gamma function according to an embodiment of the present disclosure.
Referring to fig. 6, the reverse pipeline 211 may include an inverse gamma module 2111, an inverse demosaic module 2113, an inverse white balance module 2115, and an inverse correction module 2117.
The inverse gamma module 2111 may operate by inverse mapping the operation of the gamma module 2157 in fig. 8 to be described later. For example, the inverse gamma module 2111 may generate the source image RGB as the first image BRGB before gamma correction is applied thereto, based on an inverse gamma function. The inverse gamma function may correspond to an inverse curve of the gamma function (refer to fig. 7B). The gamma function may represent an output brightness value relative to an input brightness value and may correspond to a logarithmic curve (refer to fig. 7A). The inverse gamma module 2111 may generate the first image BRGB by multiplying the inverse logarithmic values by the image values included in the source image RGB, respectively.
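One way to sketch the gamma / inverse-gamma pair is shown below; the patent describes the curve only qualitatively (logarithmic), so a standard power-law gamma of 2.2 is assumed here for illustration:

```python
import numpy as np

GAMMA = 2.2  # assumed value; the patent only specifies a log-like curve

def gamma_correct(linear):
    """Map linear values in [0, 1] to gamma-corrected output (Fig. 7A-like)."""
    return np.power(linear, 1.0 / GAMMA)

def inverse_gamma(corrected):
    """Inverse-map gamma-corrected values back to linear (Fig. 7B-like)."""
    return np.power(corrected, GAMMA)

x = np.linspace(0.0, 1.0, 5)
roundtrip = inverse_gamma(gamma_correct(x))
```

Applying the inverse function after the forward function recovers the original values, which is exactly the inverse-mapping property the module relies on.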
The inverse demosaicing module 2113 may operate by inverse-mapping the operation of the demosaicing module 2155 in fig. 8 to be described later. For example, the inverse demosaicing module 2113 may generate the first image BRGB as the second image CIMG before the demosaicing operation is performed thereon based on a predetermined color pattern. The inverse demosaicing module 2113 may generate a second image CIMG having a Bayer pattern based on the first image BRGB separated for each color channel.
The inverse white balance module 2115 may operate by inverse mapping the operation of the white balance module 2153 in fig. 8 to be described later. For example, the inverse white balance module 2115 may generate the second image CIMG as the third image DIMG before the white balance operation is performed thereon based on the gain value according to the sensitivity. The inverse white balance module 2115 may generate the third image DIMG by dividing the image values included in the second image CIMG by the gain values, respectively. In this case, the inverse white balance module 2115 may generate the third image DIMG in various ways by randomly generating and applying gain values.
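A minimal sketch of the inverse white balance step, assuming a GBRG Bayer layout and per-channel gains (the layout and names are assumptions for illustration; the gains may be drawn randomly, as noted above, to diversify the generated images):

```python
import numpy as np

def inverse_white_balance(bayer, r_gain, b_gain):
    """Undo white balance on a Bayer image by dividing per-channel gains.

    Assumes a GBRG 2x2 unit: R pixels at (odd row, even col),
    B pixels at (even row, odd col); green sites keep unit gain.
    """
    out = bayer.astype(float).copy()
    out[1::2, 0::2] /= r_gain  # red sites
    out[0::2, 1::2] /= b_gain  # blue sites
    return out

img = np.full((4, 4), 120.0)
third = inverse_white_balance(img, r_gain=2.0, b_gain=1.5)
```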
The inverse correction module 2117 may operate by inverse mapping the operation of the correction module 2151 in fig. 8 to be described later. For example, the inverse correction module 2117 may generate the third image DIMG as the original image IIMG before the lens shading correction is applied thereto based on the inverse gain value according to the position of the image. The inverse gain value may comprise a value that is the inverse of the gain value used by the correction module 2151. The inverse correction module 2117 may generate the original image IIMG by applying the inverse gain values to the image values included in the third image DIMG, respectively.
Fig. 8 is a block diagram illustrating an example of the pipeline 215 shown in fig. 5, according to an embodiment of the present disclosure.
Referring to fig. 8, pipeline 215 may include a correction module 2151, a white balance module 2153, a demosaic module 2155, and a gamma module 2157.
The correction module 2151 may generate the noise image NIMG or the real image IMG as a fourth image AIMG to which lens shading correction is applied, based on a gain value according to a position of an image. Lens shading correction is a technique for correcting a phenomenon in which a lens reduces brightness toward the outside of an image. The correction module 2151 may generate the fourth image AIMG by applying the gain values to the image values included in the noise image NIMG, respectively, or by applying the gain values to the image values included in the real image IMG, respectively.
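Lens shading correction can be sketched with a position-dependent gain that grows toward the image periphery; the radial gain profile below is an assumption for illustration, since the patent only states that the gain depends on the position of the image:

```python
import numpy as np

def lens_shading_correct(img, strength=0.5):
    """Apply a position-dependent gain that brightens the image periphery.

    Gain = 1 + strength * (r^2 / r_max^2), a common radial falloff model
    (assumed here; the patent does not fix the exact profile).
    """
    h, w = img.shape
    ys, xs = np.mgrid[0:h, 0:w]
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    r2 = (ys - cy) ** 2 + (xs - cx) ** 2
    gain = 1.0 + strength * r2 / r2.max()
    return img * gain

flat = np.full((5, 5), 100.0)
fourth = lens_shading_correct(flat)
```

The center pixel keeps unit gain while the corners receive the maximum gain, compensating the brightness falloff of the lens.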
The white balance module 2153 may generate the fourth image AIMG as a fifth image BIMG on which a white balance operation is performed, based on a gain value according to the sensitivity. The white balance operation is a technique of correcting sensitivity according to color variation. The white balance module 2153 may generate the fifth image BIMG by multiplying image values included in the fourth image AIMG by gain values, respectively.
The demosaicing module 2155 may generate the fifth image BIMG as a sixth image ARGB on which a demosaicing operation is performed. The demosaicing module 2155 may generate a sixth image ARGB separated for each color channel based on a fifth image BIMG having a Bayer pattern.
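A simple bilinear demosaicing sketch is shown below; it assumes a GBRG layout and averages neighboring samples of each color, which is a common baseline rather than the patent's exact method:

```python
import numpy as np

def demosaic_bilinear(bayer):
    """Split a GBRG Bayer image into three full-resolution color channels.

    Missing samples at each pixel are filled by averaging the available
    samples of that color in a (wrapping) 3x3 neighborhood.
    """
    h, w = bayer.shape
    out = np.zeros((h, w, 3))
    masks = np.zeros((h, w, 3), bool)
    masks[1::2, 0::2, 0] = True                         # R sites
    masks[0::2, 0::2, 1] = masks[1::2, 1::2, 1] = True  # G sites
    masks[0::2, 1::2, 2] = True                         # B sites
    for c in range(3):
        known = np.where(masks[..., c], bayer, 0.0)
        counts = masks[..., c].astype(float)
        num = sum(np.roll(np.roll(known, dy, 0), dx, 1)
                  for dy in (-1, 0, 1) for dx in (-1, 0, 1))
        den = sum(np.roll(np.roll(counts, dy, 0), dx, 1)
                  for dy in (-1, 0, 1) for dx in (-1, 0, 1))
        out[..., c] = num / np.maximum(den, 1.0)
    return out

rgb = demosaic_bilinear(np.full((4, 4), 50.0))
```

On a flat input every interpolated channel reproduces the constant value, as expected of an interpolating demosaic.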
The gamma module 2157 may generate the sixth image ARGB as the data set image NRGB2 or the target image NRGB1 based on a gamma function. The gamma module 2157 may generate the data set image NRGB2 or the target image NRGB1 by multiplying the logarithmic values according to the luminance values by image values included in the sixth image ARGB, respectively.
Hereinafter, the operation of the image sensing device 10 according to the embodiment of the present disclosure having the above-described configuration is described.
Fig. 9 is a diagram illustrating an operation of the image sensing apparatus 10 shown in fig. 1 according to an embodiment of the present disclosure.
Referring to fig. 9, the image processor 200 may apply a third true noise to at least one source image RGB during the learning mode period and learn from the source image RGB to which the third true noise is applied. The source image RGB may be a clean image from which the second true noise is denoised or removed, and the third true noise may comprise noise values modeling the true noise values for each pixel.
More specifically, the noise processor 210 may convert the source image RGB into the original image IIMG having a Bayer pattern during the learning mode period and then model a noise value based on each image value included in the original image IIMG. In this case, the noise processor 210 may generate the original image IIMG having the Bayer pattern through the operation of the inverse mapping pipeline 215. The noise processor 210 may generate the data set image NRGB2 by applying a noise value to the original image IIMG. In this case, the noise processor 210 may generate a separate data set image NRGB2 for each color channel through operation of the pipeline 215. The learning processor 220 may learn the third true noise in a supervised learning manner based on the source image RGB and the data set image NRGB 2.
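The supervised learning step above can be summarized with a deliberately tiny stand-in for the deep-learning model: a per-pixel affine denoiser trained by gradient descent on (noisy, clean) pairs. All names and the model itself are illustrative; the patent does not specify the network architecture:

```python
import numpy as np

def train_denoiser(clean, noisy, epochs=200, lr=1e-3):
    """Learn a per-pixel affine denoiser y = a*x + b with MSE loss.

    The pairs (dataset image NRGB2, source image RGB) supervise the
    mapping from noisy to clean values.
    """
    a, b = 1.0, 0.0
    n = clean.size
    for _ in range(epochs):
        err = a * noisy + b - clean
        a -= lr * 2.0 * np.sum(err * noisy) / n  # dL/da for MSE loss
        b -= lr * 2.0 * np.sum(err) / n          # dL/db
    return a, b

rng = np.random.default_rng(1)
clean = rng.uniform(0.0, 1.0, 1000)
noisy = clean + np.sqrt(clean) * 0.1 * rng.standard_normal(1000)
a, b = train_denoiser(clean, noisy)
denoised = a * noisy + b
```

Even this trivial model reduces the mean squared error relative to the noisy input; the actual device would use a neural network trained with backpropagation.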
The image sensor 100 may generate a real image IMG having a Bayer pattern from incident light during a capture mode period. The image processor 200 may generate an output image DIMG based on the real image IMG during the capture mode period. For example, the image processor 200 may generate a target image NRGB1, which is separated for each color channel, through the operation of the pipeline 215, and may generate the output image DIMG by denoising or removing, from the target image NRGB1, the first true noise applied to the real image IMG, according to the learning result.
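The capture-mode flow above amounts to running the forward pipeline and then the learned denoiser. A schematic sketch, where `pipeline` and `denoiser` are assumed callables standing in for the pipeline 215 and the learned network of the learning processor:

```python
def denoise_capture(real_bayer, pipeline, denoiser):
    """Capture-mode flow: the pipeline converts the Bayer image (IMG)
    into a per-channel target image (NRGB1); the learned denoiser then
    removes the first true noise to produce the output image (DIMG).
    Both callables are placeholders, not components named in the claims.
    """
    target = pipeline(real_bayer)   # target image NRGB1
    output = denoiser(target)       # output image DIMG
    return output
```

Keeping the pipeline and the denoiser as separate stages mirrors the disclosure's split between the mapping pipeline and the learning processor.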
Depending on the performance of the image sensor 100 and/or the image processor 200, the output image DIMG may be generated at a level that does not meet expectations (hereinafter referred to as a "lower-than-expected output image"). In this case, the image processor 200 may perform an additional learning operation (i.e., a fine-tuning operation) and may use the lower-than-expected output image DIMG when performing the additional learning operation. For example, the image processor 200 may generate a plurality of target images NRGB1 corresponding to the lower-than-expected output image DIMG in the same manner, and may generate a plurality of output images DIMG based on the plurality of target images NRGB1. The image processor 200 may perform the additional learning operation by using an average image of the plurality of output images DIMG as the source image RGB and each of the plurality of target images NRGB1 as the data set image NRGB2. The degradation of the output image DIMG caused by the performance of the image sensor 100 and/or the image processor 200 may thus be mitigated by the additional learning operation.
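The fine-tuning step above pairs an averaged output with each individual target image. A minimal sketch, assuming images are NumPy arrays of equal shape; the function name is illustrative:

```python
import numpy as np

def build_finetune_pairs(output_images, target_images):
    """Average several lower-than-expected output images into a
    pseudo-clean source, then pair that source with each target image,
    which plays the role of a dataset image for the additional learning
    operation described above.
    """
    # Averaging attenuates the independent residual noise in each output.
    source = np.mean(np.stack(output_images, axis=0), axis=0)
    return [(source, target) for target in target_images]
```

Averaging works here because independent noise realizations tend to cancel, so the mean image approximates a cleaner source than any single output.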
According to the embodiments of the present disclosure, real noise may be learned and denoised based on a deep learning technique.
According to the embodiments of the present disclosure, true noise, rather than Gaussian noise, may be learned and denoised based on a deep learning technique, thereby obtaining a clean image from which the true noise is removed.
According to the embodiments of the present disclosure, since the dataset image to which the true noise is applied is generated to correspond to the source image, the present disclosure is easily compatible with deep learning networks developed in the related art.
While the present disclosure has been illustrated and described with respect to particular embodiments, the disclosed embodiments are provided for purposes of description and are not intended to be limiting. Further, it is noted that the present disclosure may be implemented in various ways by means of substitutions, changes and modifications falling within the scope of the appended claims, as will be recognized by those skilled in the art in light of the present disclosure.
Cross Reference to Related Applications
The present application claims priority from Korean patent application No. 10-2020-0185889, filed on December 29, 2020, the disclosure of which is incorporated herein by reference in its entirety.
Claims (17)
1. An image sensing device, comprising:
a reverse pipeline that generates an original image based on a source image that is free of real noise;
a noise generator that generates a noise image corresponding to a real image by applying a noise value that models a real noise value for each pixel to the original image; and
a pipeline that generates a dataset image corresponding to the source image based on the noise image.
2. The image sensing device of claim 1, wherein the noise generator models the noise value based on each image value included in the original image.
3. The image sensing device of claim 2, wherein the noise value is calculated using a square root value of each of the image values.
4. The image sensing device according to claim 2, wherein the noise value is defined based on a square root value of each of the image values and a random value, the random value being any value randomly selected among values following a standard normal distribution.
5. The image sensing device of claim 1, wherein the reverse pipeline comprises:
an inverse gamma module that receives the source image and generates a first image before gamma correction is applied thereto based on an inverse gamma function;
a reverse demosaicing module that receives the first image and generates a second image before a demosaicing operation is performed thereon based on a set color pattern;
an inverse white balance module that receives the second image and generates a third image before a white balance operation is performed thereon based on a gain value according to sensitivity; and
an inverse correction module that receives the third image and generates the original image before lens shading correction is applied thereto based on a gain value according to brightness.
6. The image sensing device of claim 1, wherein the pipeline comprises:
a correction module that receives the noise image and generates a fourth image to which lens shading correction is applied based on a gain value according to a position of an image;
a white balance module that receives the fourth image and generates a fifth image on which a white balance operation is performed based on a gain value according to sensitivity;
a demosaicing module that receives the fifth image and generates a sixth image on which a demosaicing operation is performed; and
a gamma module that receives the sixth image and generates the data set image to which gamma correction is applied based on a gamma function.
7. The image sensing device according to claim 1, further comprising a learning processor that learns true noise based on the dataset image and removes the true noise from the true image.
8. An image sensing device, comprising:
a noise processor that generates a dataset image by applying a noise value that models a true noise value for each pixel to a source image that is free of true noise; and
a learning processor that learns true noise based on the dataset image and removes true noise from a true image corresponding to the source image.
9. The image sensing device as claimed in claim 8, wherein the noise processor converts the source image into an original image having a set color pattern and then models the noise value based on each image value included in the original image.
10. The image sensing device of claim 8, wherein the noise processor comprises:
a reverse pipeline that generates an original image based on the source image;
a noise generator that generates a noise image corresponding to the real image by applying the noise value to the original image; and
a pipeline that generates the dataset image based on the noise image.
11. The image sensing device of claim 10, wherein the noise generator models the noise value based on each image value included in the original image.
12. The image sensing device of claim 11, wherein the noise value is calculated using a square root value of each of the image values.
13. The image sensing device according to claim 11, wherein the noise value is defined based on a square root value of each of the image values and a random value, the random value being any value randomly selected among values following a standard normal distribution.
14. The image sensing device of claim 10, wherein the reverse pipeline comprises:
an inverse gamma module that receives the source image and generates a first image before gamma correction is applied thereto based on an inverse gamma function;
a reverse demosaicing module that receives the first image and generates a second image before a demosaicing operation is performed thereon based on a predetermined color pattern;
an inverse white balance module that receives the second image and generates a third image before a white balance operation is performed thereon based on a gain value according to sensitivity; and
an inverse correction module that receives the third image and generates the original image before lens shading correction is applied thereto based on a gain value according to brightness.
15. The image sensing device of claim 10, wherein the pipeline comprises:
a correction module that receives the noise image and generates a fourth image to which lens shading correction is applied based on a gain value according to a position of an image;
a white balance module that receives the fourth image and generates a fifth image on which a white balance operation is performed based on a gain value according to sensitivity;
a demosaicing module that receives the fifth image and generates a sixth image on which a demosaicing operation is performed; and
a gamma module that receives the sixth image and generates the data set image to which gamma correction is applied based on a gamma function.
16. An operating method of an image sensing device, the operating method comprising the steps of:
during a learning mode period, generating an original image from a source image by operation of an inverse mapping pipeline;
modeling, during the learning mode period, a true noise value for each pixel based on an image value included in the original image;
during the learning mode period, generating, by operation of the pipeline, a dataset image by applying a noise value modeling the true noise value to the original image; and
learning the noise value based on the original image and the dataset image.
17. The method of operation of claim 16, further comprising the steps of:
generating, by operation of the pipeline, a target image corresponding to a real image during a capture mode period; and
generating, during the capture mode period, an output image by denoising or removing, from the target image, real noise applied to the real image, according to a learning result of the noise value.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020200185889A KR20220094561A (en) | 2020-12-29 | 2020-12-29 | Image sensing device and method of operating the same |
KR10-2020-0185889 | 2020-12-29 |
Publications (1)
Publication Number | Publication Date |
---|---|
CN114697571A true CN114697571A (en) | 2022-07-01 |
Family
ID=82118238
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202111386521.1A Withdrawn CN114697571A (en) | 2020-12-29 | 2021-11-22 | Image sensing device and operation method thereof |
Country Status (4)
Country | Link |
---|---|
US (1) | US20220210351A1 (en) |
JP (1) | JP2022104795A (en) |
KR (1) | KR20220094561A (en) |
CN (1) | CN114697571A (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20240016599A (en) | 2022-07-29 | 2024-02-06 | 엘지디스플레이 주식회사 | Light emitting display apparatus |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102055916A (en) * | 2009-11-03 | 2011-05-11 | 三星电子株式会社 | Methods of modeling an integrated noise in an image sensor and methods of reducing noise using the same |
CN104853171A (en) * | 2014-02-17 | 2015-08-19 | 索尼公司 | Image processing apparatus, method for image processing, and program |
US20180007333A1 (en) * | 2016-06-30 | 2018-01-04 | Apple Inc. | Per pixel color correction filtering |
CN108280811A (en) * | 2018-01-23 | 2018-07-13 | 哈尔滨工业大学深圳研究生院 | A kind of image de-noising method and system based on neural network |
CN111512344A (en) * | 2017-08-08 | 2020-08-07 | 西门子股份公司 | Generating synthetic depth images from CAD data using enhanced generative antagonistic neural networks |
CN112116539A (en) * | 2020-09-08 | 2020-12-22 | 浙江大学 | Optical aberration fuzzy removal method based on deep learning |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
IT1403150B1 (en) * | 2010-11-24 | 2013-10-04 | St Microelectronics Srl | PROCEDURE AND DEVICE TO CLEAR A DIGITAL VIDEO SIGNAL, ITS RELATED PRODUCT, FROM THE NOISE. |
CN111401411B (en) * | 2020-02-28 | 2023-09-29 | 北京小米松果电子有限公司 | Method and device for acquiring sample image set |
JP7508265B2 (en) * | 2020-05-14 | 2024-07-01 | キヤノン株式会社 | Information processing device, information processing method, and program |
- 2020-12-29: KR application KR1020200185889A filed (published as KR20220094561A)
- 2021-06-17: US application US17/350,773 filed (published as US20220210351A1, abandoned)
- 2021-11-22: CN application CN202111386521.1A filed (published as CN114697571A, withdrawn)
- 2021-11-25: JP application JP2021190907A filed (published as JP2022104795A, pending)
Also Published As
Publication number | Publication date |
---|---|
KR20220094561A (en) | 2022-07-06 |
US20220210351A1 (en) | 2022-06-30 |
JP2022104795A (en) | 2022-07-11 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | |
SE01 | Entry into force of request for substantive examination | |
WW01 | Invention patent application withdrawn after publication | Application publication date: 20220701 |