KR101634904B1 - Method and Apparatus for Color correction in low light environment - Google Patents
- Publication number
- KR101634904B1 (application KR1020150076631A)
- Authority
- KR
- South Korea
- Prior art keywords
- image
- color
- mapping table
- sensor
- illuminance
- Prior art date
Classifications
- H04N9/07
- H04N5/217
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/64—Circuits for processing colour signals
Abstract
Description
The present invention relates to a method and system for correcting image color in low light.
Surveillance cameras such as CCTVs typically have a visible light image sensor, such as a CMOS or CCD, and an IR sensor. In the daytime, when light is sufficient, the visible light sensor operates; at night, when light is insufficient, the IR sensor operates and recognizes objects. However, the IR sensor, which operates mainly at night, cannot recognize the color of an object. In some systems the IR image and the RGB image are fused, but even then the color is not restored. A color correction scheme for low illuminance is therefore required.
An object of the present invention is to provide a method and apparatus for correcting the color of images captured by a surveillance camera, such as a CCTV, that has both a visible light image sensor (e.g., CMOS or CCD) and an IR sensor. In particular, the present invention aims to provide a method and apparatus for correcting the color of an image even at low illuminance by using a mapping table created through learning.
According to an aspect of the present invention, there is provided a method of correcting color at low illuminance, comprising the steps of: sensing an image using an IR sensor and a visible light sensor simultaneously; performing a preprocessing process to remove noise from the sensed image; mapping the image to new colors using a previously stored mapping table; and correcting the color of the image with the mapped new colors.
The color correction method at low illuminance may further include a learning step of generating the previously stored mapping table.
The learning step of generating the previously stored mapping table may include: photographing an image at or above a predetermined illuminance for each of the colors R, G, and B; photographing an image below the predetermined illuminance for each of the colors R, G, and B; and photographing an IR image in the same illuminance environment as the image photographed below the predetermined illuminance. This learning step may be repeated for all predetermined colors.
A mapping table learned differently according to illuminance can be generated using the photographed images.
The previously stored mapping table may be generated for each channel.
The step of correcting the color of the image with the mapped new colors may include, when the current illuminance is input, selecting the mapping table corresponding to the input illuminance and correcting the color of the image with it.
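As a rough illustration only (the function names, the 1-D "images", and the mean-filter preprocessing are hypothetical, not from the patent), the sense, preprocess, map, and correct steps above can be sketched as:

```python
def denoise(img):
    """Toy preprocessing step: a 1-D mean filter standing in for the
    patent's unspecified noise-removal process."""
    out = []
    for i in range(len(img)):
        window = img[max(0, i - 1):i + 2]
        out.append(round(sum(window) / len(window)))
    return out

def correct_low_light(rgb_img, ir_img, mapping_table):
    """Per-pixel correction: look up (low-light value, IR value) in the
    previously stored mapping table; unseen pairs pass through unchanged."""
    rgb = denoise(rgb_img)
    ir = denoise(ir_img)
    return [mapping_table.get((v, w), v) for v, w in zip(rgb, ir)]

# constant toy images so the mean filter acts as an identity here
table = {(10, 100): 200}
print(correct_low_light([10, 10, 10], [100, 100, 100], table))  # [200, 200, 200]
```

The pass-through fallback for unseen (value, IR) pairs is an assumption; the patent does not say how unmapped inputs are handled.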
According to another aspect of the present invention, there is provided a color correction system for low illuminance, which may include: a sensor unit including an IR sensor and a visible light sensor, which senses an image using both sensors simultaneously; a preprocessing unit that performs preprocessing to remove noise from the sensed image; a mapping unit that maps the image to new colors using a previously stored mapping table; and a color correction unit that corrects the color of the image with the mapped new colors.
The mapping unit may perform a learning process for generating the previously stored mapping table.
The mapping unit may photograph images both at or above and below a predetermined illuminance for the colors R, G, and B using the visible light sensor, photograph an IR image in the same illuminance environment using the IR sensor, and repeat the learning step of generating the previously stored mapping table for all predetermined colors.
The mapping unit may generate a mapping table learned differently according to illuminance using the photographed images.
The mapping unit may generate the previously stored mapping table for each channel.
The color correction unit may correct the color of the image by selecting, when the current illuminance is input, the mapping table corresponding to the input illuminance.
According to the embodiments of the present invention, the color of an image can be corrected even at low illuminance using a mapping table created through learning, so the color of objects can be recognized even at night when there is not enough light. It is also possible to generate mapping tables learned differently for each illuminance level and to select the table corresponding to a given illuminance to correct the color of the image.
FIG. 1 is a flowchart for explaining a color correction method at low illuminance according to an embodiment of the present invention.
FIG. 2 is a block diagram illustrating a color correction process at low illuminance according to an exemplary embodiment of the present invention.
FIG. 3 is a flowchart illustrating a learning process for generating a mapping table according to an embodiment of the present invention.
FIG. 4 is an exemplary diagram of a mapping table according to an embodiment of the present invention.
FIG. 5 is an exemplary diagram illustrating a mapping table for each color component according to an embodiment of the present invention.
FIG. 6 is an overall block diagram for explaining a color correction process for each illuminance level according to an embodiment of the present invention.
FIG. 7 is a diagram illustrating the configuration of a color correction system at low illuminance according to an embodiment of the present invention.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS Hereinafter, embodiments of the present invention will be described in detail with reference to the accompanying drawings.
FIG. 1 is a flowchart for explaining a color correction method at low illuminance according to an embodiment of the present invention.
As shown in FIG. 1, the method of correcting color at low illuminance includes the steps of sensing an image using an IR sensor and a visible light sensor simultaneously, performing a preprocessing process to remove noise from the sensed image, mapping the image to new colors using a previously stored mapping table, and correcting the color of the image with the mapped new colors.
The color correction method at low illuminance may further include a learning step of generating the previously stored mapping table.
The learning step of generating the previously stored mapping table may include: photographing an image at or above a predetermined illuminance for each of the colors R, G, and B; photographing an image below the predetermined illuminance for each of the colors R, G, and B; and photographing an IR image in the same illuminance environment as the image photographed below the predetermined illuminance. This learning step may be repeated for all predetermined colors.
A mapping table learned differently according to illuminance can be generated using the photographed images, and the previously stored mapping table may be generated for each channel. These steps will be described in more detail with reference to FIGS. 2 to 6.
FIG. 2 is a block diagram illustrating a color correction process at low illuminance according to an exemplary embodiment of the present invention.
As shown in FIG. 2, an IR image and an RGB image are sensed simultaneously by the IR sensor and the visible light sensor. A preprocessing process is then performed to remove noise from the sensed images. When the preprocessing process is complete, the preprocessed images are passed to the mapping step, where the preprocessed pixel values are denoted IR', R', G', and B'.
The obtained IR image and RGB image are generally not aligned, so a general registration method is applied to align them; in other words, registration is the first step of the mapping process.
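The patent does not specify the registration method ("a general registration method"). As one hedged sketch under the assumption of a pure translation between the two images, the offset can be found by brute-force search for the shift minimizing the mean absolute difference over the overlapping region (illustrative code, not the patent's method):

```python
def best_shift(ref, mov, max_shift=4):
    """Find (dy, dx) such that mov[y+dy][x+dx] best matches ref[y][x],
    by minimizing mean absolute difference over the overlapping region."""
    h, w = len(ref), len(ref[0])
    best = None
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            total, count = 0, 0
            for y in range(h):
                for x in range(w):
                    yy, xx = y + dy, x + dx
                    if 0 <= yy < h and 0 <= xx < w:
                        total += abs(ref[y][x] - mov[yy][xx])
                        count += 1
            score = total / count
            if best is None or score < best[0]:
                best = (score, dy, dx)
    return best[1], best[2]

# toy image pair: mov is ref translated by (2, 1) with zero padding
ref = [[y * 8 + x for x in range(8)] for y in range(8)]
mov = [[ref[y - 2][x - 1] if y >= 2 and x >= 1 else 0 for x in range(8)] for y in range(8)]
print(best_shift(ref, mov))  # (2, 1)
```

Real IR/RGB pairs may also differ in rotation, scale, and modality, which this translation-only sketch ignores.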
FIG. 3 is a flowchart illustrating a learning process for generating a mapping table according to an embodiment of the present invention.
The color correction method at low illuminance may further include a learning step of generating the previously stored mapping table.
The learning step of generating the previously stored mapping table may include a step (310) of photographing an image at or above a predetermined illuminance for each of the colors R, G, and B, a step of photographing an image below the predetermined illuminance for each of the colors R, G, and B, and a step of photographing an IR image in the same illuminance environment as the image photographed below the predetermined illuminance. The learning step may be repeated for all predetermined colors.
A mapping table learned differently according to illuminance can be generated using the photographed images, and the previously stored mapping table may be generated for each channel.
In other words, the mapping table is obtained through learning. Each color is photographed both under bright illumination and under low illumination, and an IR image is obtained in the same low-light environment. Once these shots exist for a given color, the corresponding mapping table entries can be derived: the R, G, and B values captured under bright illumination serve as the ground truth.
FIG. 4 is an exemplary diagram of a mapping table according to an embodiment of the present invention.
FIG. 4 shows an exemplary mapping table on the assumption that there are a total of N cases. When values IR', R', G', and B' are input, the corresponding R", G", and B" are output. In this case, however, the size of the table can increase greatly according to the bit depth of the color values.
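A quick calculation shows why the integrated table grows so fast: with b-bit samples, the joint (IR', R', G', B') table needs (2^b)^4 entries, whereas three per-channel (value, IR) tables need only 3·(2^b)^2 (the entry-count model is an illustration, not taken from the patent):

```python
b = 8  # bit depth per sample
joint_entries = (2 ** b) ** 4            # one entry per (IR', R', G', B') combination
per_channel_entries = 3 * (2 ** b) ** 2  # three tables, each indexed by (channel value, IR)
print(joint_entries)        # 4294967296
print(per_channel_entries)  # 196608
```

At 8 bits the joint table is over four billion entries, which motivates the per-channel tables of FIG. 5.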
FIG. 5 is an exemplary diagram illustrating a mapping table for each color component according to an embodiment of the present invention.
As described with reference to FIG. 4, the size of the table can increase greatly according to the bit depth of the color. To address this problem, as shown in FIG. 5, a table may be provided for each of the three color channels. FIG. 5 shows the mapping table for the R component; tables for the G and B components can be created in the same form.
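A minimal sketch of the per-channel scheme (the function names, the collision handling, and the pass-through fallback for unseen pairs are assumptions, not specified by the patent): each channel's table maps a (low-light value, IR value) pair to the ground-truth value captured under bright illumination.

```python
def learn_channel_table(low_vals, ir_vals, bright_vals):
    """Build one channel's table from aligned training captures:
    (value below the illuminance threshold, IR value) -> bright-light value."""
    table = {}
    for lo, ir, gt in zip(low_vals, ir_vals, bright_vals):
        table[(lo, ir)] = gt  # later observations overwrite earlier ones
    return table

def correct_channel(table, lo, ir):
    """Return the learned corrected value, or pass the input through
    when the (value, IR) pair never occurred during learning."""
    return table.get((lo, ir), lo)

# toy R-channel training data: three pixels from paired captures
r_table = learn_channel_table([10, 20, 30], [100, 100, 120], [55, 70, 90])
print(correct_channel(r_table, 20, 100))  # 70
print(correct_channel(r_table, 99, 99))   # 99 (unseen pair passes through)
```

The G and B tables would be learned the same way from their own channel values paired with the shared IR image.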
Corrected per-pixel R", G", and B" images, as shown in FIG. 2, can be obtained using either the integrated table of FIG. 4 or the per-channel tables of FIG. 5.
FIG. 6 is an overall block diagram for explaining a color correction process for each illumination according to an embodiment of the present invention.
As described above, an IR image and an RGB image are sensed simultaneously, and a preprocessing process is performed to remove noise from the sensed images. When the preprocessing process is complete, the preprocessed images are passed to the mapping step.
As before, the obtained IR image and RGB image are generally not aligned, so a general registration method is applied to align them; in other words, registration is the first step of the mapping process.
The mapping table 630 is obtained through learning. Each color is photographed both under bright illumination and under low illumination, and an IR image is obtained in the same low-light environment. Once these shots exist for a given color, the mapping table can be obtained: the R, G, and B values captured under bright illumination serve as the ground truth.
However, depending on the illuminance, the color values may vary even though the IR value is the same. Accordingly, as shown in FIG. 6, mapping tables 630 learned differently according to illuminance are generated in advance, and when the current illuminance is given as an input, the mapping table matching the given illuminance is selected to perform color correction.
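Selecting the table for the current illuminance can be as simple as a nearest-neighbor choice over the illuminance levels used during learning (a sketch; the patent does not specify the selection rule, and the lux levels here are hypothetical):

```python
def select_table(tables_by_lux, current_lux):
    """Return the mapping table learned at the illuminance level
    closest to the current reading."""
    nearest = min(tables_by_lux, key=lambda lux: abs(lux - current_lux))
    return tables_by_lux[nearest]

# hypothetical tables learned at 1, 5, and 10 lux
tables = {1: "table@1lx", 5: "table@5lx", 10: "table@10lx"}
print(select_table(tables, 4))   # table@5lx
print(select_table(tables, 12))  # table@10lx
```

Interpolating between the two nearest tables would be a natural refinement, but that goes beyond what the patent describes.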
FIG. 7 is a diagram illustrating the configuration of a color correction system at low illuminance according to an embodiment of the present invention.
The color correction system at low illuminance may include a sensor unit, a preprocessing unit, a mapping unit, and a color correction unit. The sensor unit includes an IR sensor and a visible light sensor and senses an image using both simultaneously. The preprocessing unit performs a preprocessing process to remove noise from the sensed image.
The mapping unit maps the image to new colors using the previously stored mapping table. The mapping unit may also perform the learning process that generates this table: it photographs images at or above and below a predetermined illuminance for the colors R, G, and B using the visible light sensor, photographs an IR image in the same illuminance environment using the IR sensor, and repeats the learning step for all predetermined colors. The mapping unit may generate mapping tables learned differently according to illuminance, and may generate the previously stored mapping table for each channel.
The color correction unit corrects the color of the image with the mapped new colors; when the current illuminance is input, it selects the mapping table corresponding to that illuminance.
The apparatus described above may be implemented as hardware components, software components, and/or a combination of hardware and software components. For example, the apparatus and components described in the embodiments may be implemented using one or more general-purpose or special-purpose computers, such as a processor, a controller, an arithmetic logic unit (ALU), a digital signal processor, a microcomputer, a field programmable gate array (FPGA), a programmable logic unit (PLU), a microprocessor, or any other device capable of executing and responding to instructions. The processing device may execute an operating system (OS) and one or more software applications running on the operating system, and may access, store, manipulate, process, and generate data in response to execution of the software. For ease of understanding, the processing device is sometimes described as being used singly, but those skilled in the art will recognize that it may include a plurality of processing elements and/or multiple types of processing elements. For example, the processing device may comprise a plurality of processors, or one processor and one controller. Other processing configurations, such as parallel processors, are also possible.
The software may include a computer program, code, instructions, or a combination thereof, and may configure the processing device to operate as desired or command it independently or collectively. The software and/or data may be embodied in any type of machine, component, physical device, virtual equipment, or computer storage medium, or permanently or temporarily in a transmitted signal wave, so as to be interpreted by or to provide instructions or data to the processing device. The software may be distributed over networked computer systems and stored or executed in a distributed manner. The software and data may be stored on one or more computer-readable recording media.
The method according to an embodiment may be implemented in the form of program instructions that can be executed through various computer means and recorded on a computer-readable medium. The computer-readable medium may include program instructions, data files, data structures, and the like, alone or in combination. The program instructions recorded on the medium may be specially designed and configured for the embodiments, or may be known and available to those skilled in the art of computer software. Examples of computer-readable media include magnetic media such as hard disks, floppy disks, and magnetic tape; optical media such as CD-ROMs and DVDs; magneto-optical media such as floptical disks; and hardware devices specifically configured to store and execute program instructions, such as ROM, RAM, and flash memory. Examples of program instructions include machine code such as that produced by a compiler, as well as high-level language code that can be executed by a computer using an interpreter. The hardware devices described above may be configured to operate as one or more software modules to perform the operations of the embodiments, and vice versa.
While the present invention has been particularly shown and described with reference to exemplary embodiments thereof, the invention is not limited to the disclosed embodiments. For example, adequate results may be achieved even if the described techniques are performed in a different order, and/or if components of the described systems, structures, devices, or circuits are combined in a different form or replaced or supplemented by other components or their equivalents.
Therefore, other implementations, other embodiments, and equivalents to the claims are also within the scope of the following claims.
Claims (12)
Sensing images simultaneously using an IR sensor and a visible light sensor;
Performing a preprocessing process to remove noise of the sensed image;
Mapping the image into a new color using a previously stored mapping table;
Correcting the color of the image with the mapped new colors; and
A learning step of generating the previously stored mapping table,
Wherein the learning step of generating the previously stored mapping table comprises:
Capturing an image at or above a predetermined illuminance for each of the colors R, G, and B;
Capturing an image below the predetermined illuminance for each of the colors R, G, and B; and
Capturing an IR image in the same illuminance environment as the image captured below the predetermined illuminance,
Wherein the learning step of generating the previously stored mapping table is repeated for all predetermined colors.
Wherein a mapping table learned differently according to illuminance is generated using the photographed images.
Wherein the previously stored mapping table is generated for each channel.
Wherein the step of correcting the color of the image with the mapped new colors comprises:
Correcting the color of the image by selecting, when the current illuminance is input, the mapping table corresponding to the input illuminance.
A sensor unit including an IR sensor and a visible light sensor and simultaneously sensing an image using the IR sensor and the visible light sensor;
A preprocessing unit for performing a preprocessing process to remove noise of the sensed image;
A mapping unit for mapping the image into a new color using a previously stored mapping table; And
And a color correction unit for correcting the color of the image with the mapped new colors,
Wherein the mapping unit:
Photographs images at or above and below a predetermined illuminance for the colors R, G, and B using the visible light sensor,
Captures an IR image in the same illuminance environment as the image photographed below the predetermined illuminance using the IR sensor,
And repeatedly performs the learning step of generating the previously stored mapping table for all predetermined colors.
Wherein the mapping unit generates a mapping table learned differently according to illuminance using the photographed images.
Wherein the mapping unit generates the previously stored mapping table for each channel.
Wherein the color correction unit corrects the color of the image by selecting, when the current illuminance is input, the mapping table corresponding to the input illuminance.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020150076631A KR101634904B1 (en) | 2015-05-29 | 2015-05-29 | Method and Apparatus for Color correction in low light environment |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020150076631A KR101634904B1 (en) | 2015-05-29 | 2015-05-29 | Method and Apparatus for Color correction in low light environment |
Publications (1)
Publication Number | Publication Date |
---|---|
KR101634904B1 true KR101634904B1 (en) | 2016-06-29 |
Family
ID=56366013
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
KR1020150076631A KR101634904B1 (en) | 2015-05-29 | 2015-05-29 | Method and Apparatus for Color correction in low light environment |
Country Status (1)
Country | Link |
---|---|
KR (1) | KR101634904B1 (en) |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20000020833A (en) * | 1998-09-24 | 2000-04-15 | 전주범 | Apparatus and method for compensating digital image in real time |
KR100731812B1 (en) | 2006-08-11 | 2007-06-22 | 엠텍비젼 주식회사 | Color deviation compensating apparatus and method, image processor using it, recorded medium |
KR20110023694A (en) * | 2009-08-28 | 2011-03-08 | 한국전자통신연구원 | System for converting color of images cinematograph and controlling method thereof |
KR101141844B1 (en) * | 2010-11-15 | 2012-05-07 | 한양대학교 산학협력단 | Method and system for enhancing image under low illumination |
- 2015-05-29: KR application KR1020150076631A, patent KR101634904B1 (en), active, IP Right Grant
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20000020833A (en) * | 1998-09-24 | 2000-04-15 | 전주범 | Apparatus and method for compensating digital image in real time |
KR100731812B1 (en) | 2006-08-11 | 2007-06-22 | 엠텍비젼 주식회사 | Color deviation compensating apparatus and method, image processor using it, recorded medium |
KR20110023694A (en) * | 2009-08-28 | 2011-03-08 | 한국전자통신연구원 | System for converting color of images cinematograph and controlling method thereof |
KR101141844B1 (en) * | 2010-11-15 | 2012-05-07 | 한양대학교 산학협력단 | Method and system for enhancing image under low illumination |
Non-Patent Citations (1)
Title |
---|
Won-Hee Choi et al. (4 authors), "Low-light color image generation using a visible/near-infrared light-receiving structure," Conference of the Institute of Electronics Engineers of Korea, June 2011, pp. 1664-1667. *
Legal Events
Date | Code | Title | Description |
---|---|---|---|
E90F | Notification of reason for final refusal | ||
E701 | Decision to grant or registration of patent right | ||
GRNT | Written decision to grant | ||
FPAY | Annual fee payment |
Payment date: 20190408 Year of fee payment: 4 |