KR101634904B1 - Method and Apparatus for Color correction in low light environment - Google Patents

Method and Apparatus for Color correction in low light environment

Info

Publication number
KR101634904B1
Authority
KR
South Korea
Prior art keywords
image
color
mapping table
sensor
illuminance
Prior art date
Application number
KR1020150076631A
Other languages
Korean (ko)
Inventor
송병철
Original Assignee
인하대학교 산학협력단
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 인하대학교 산학협력단
Priority to KR1020150076631A
Application granted
Publication of KR101634904B1

Classifications

    • H04N 9/07
    • H04N 5/217
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 9/00: Details of colour television systems
    • H04N 9/64: Circuits for processing colour signals

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Image Processing (AREA)

Abstract

Presented are a method and apparatus for correcting color in a low illuminance environment. The proposed method may comprise the steps of: simultaneously sensing an image using an IR sensor and a visible light sensor; performing a preprocessing process to remove noise from the sensed image; mapping the image to a new color using a previously stored mapping table; and correcting the color of the image using the mapped new color. The method may further comprise the step of performing learning to generate the previously stored mapping table.

Description

BACKGROUND OF THE INVENTION

1. Field of the Invention

[0001] The present invention relates to a color correction method and apparatus for low-light environments.

The present invention relates to a method and system for correcting image color in low light.

Surveillance cameras such as CCTVs typically have both a visible light image sensor, such as a CMOS or CCD, and an IR sensor. In the daytime, when light is sufficient, the visible light sensor operates; at night, when light is insufficient, the IR sensor operates and recognizes objects. However, the IR sensor, which operates mainly at night, cannot recognize the color of an object. In some cases the IR image and the RGB image are fused, but even then the color is not restored. Therefore, a color correction scheme for low illuminance is required.

Korean Patent No. 10-0731812

An object of the present invention is to provide a color correction method and apparatus for surveillance cameras, such as CCTVs, that have both a visible light image sensor (e.g., CMOS or CCD) and an IR sensor. In particular, the present invention aims to provide a method and apparatus for correcting the color of an image even at low illuminance by using a mapping table created through learning.

According to an aspect of the present invention, there is provided a method of correcting color at a low light level, comprising the steps of: sensing an image using an IR sensor and a visible light sensor at the same time; performing a preprocessing process to remove noise from the sensed image; mapping the image to a new color using a previously stored mapping table; and correcting the color of the image with the mapped new color.

The color correction method at a low light level may further include a learning step for generating the previously stored mapping table.

The learning step for generating the previously stored mapping table may include the steps of: photographing an image at or above a predetermined illuminance for each of the colors R, G, and B; photographing an image at or below the predetermined illuminance for each of the colors R, G, and B; and photographing an IR image in the same illuminance environment as the image photographed at or below the predetermined illuminance. The learning step for generating the previously stored mapping table may be repeated for all of the predetermined colors.

A mapping table learned differently according to illuminance can be generated using the photographed images.

The previously stored mapping table may be generated separately for each channel.

The step of correcting the color of the image with the mapped new color may include, when the current illuminance is input, selecting the mapping table corresponding to the input illuminance and correcting the color of the image with it.

According to another aspect of the present invention, there is provided a color correction system for a low light level, which may include: a sensor unit including an IR sensor and a visible light sensor and simultaneously sensing an image using the IR sensor and the visible light sensor; a preprocessing unit for performing preprocessing to remove noise from the sensed image; a mapping unit for mapping the image to a new color using a previously stored mapping table; and a color correction unit for correcting the color of the image with the mapped new color.

The mapping unit may perform a learning process for generating the previously stored mapping table.

The mapping unit may photograph, using the visible light sensor, an image at or above a predetermined illuminance and an image at or below the predetermined illuminance for each of the colors R, G, and B, may photograph an image with the IR sensor in the same illuminance environment as the image photographed at or below the predetermined illuminance, and may repeat the learning step for generating the previously stored mapping table for all of the predetermined colors.

The mapping unit may generate a mapping table learned differently according to illuminance using the photographed images.

The mapping unit may generate the previously stored mapping table for each channel.

The color correction unit may correct the color of the image by selecting the mapping table corresponding to the input illuminance when the current illuminance is input.

According to the embodiments of the present invention, the color of an image can be corrected even at a low light level using a mapping table created through learning. Therefore, the color of objects can be observed even at night, when there is not enough light. It is also possible to generate a mapping table learned differently for each illuminance level and to select the table corresponding to a given illuminance to correct the color of the image.

1 is a flowchart for explaining a color correction method in a low light level according to an embodiment of the present invention.
FIG. 2 is a block diagram illustrating a color correction process in a low light level according to an exemplary embodiment of the present invention.
3 is a flowchart illustrating a learning process for generating a mapping table according to an embodiment of the present invention.
4 is an exemplary diagram of a mapping table according to an embodiment of the present invention.
5 is an exemplary diagram illustrating a mapping table for each color component according to an embodiment of the present invention.
FIG. 6 is an overall block diagram for explaining a color correction process for each illumination according to an embodiment of the present invention.
7 is a diagram illustrating a configuration of a color correction system in a low light level according to an embodiment of the present invention.

DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS Hereinafter, embodiments of the present invention will be described in detail with reference to the accompanying drawings.

1 is a flowchart for explaining a color correction method in a low light level according to an embodiment of the present invention.

The method of correcting color at a low light level includes a step 110 of simultaneously sensing an image using an IR sensor and a visible light sensor, a step 120 of performing a preprocessing process to remove noise from the sensed image, a step 130 of mapping the image to a new color using a previously stored mapping table, and a step 140 of correcting the color of the image using the mapped new color.

In step 110, images can be sensed simultaneously using an IR sensor and a visible light sensor. The IR sensor and the RGB sensor sense the image at the same time, and it is assumed that the two sensors are synchronized. Let the IR image generated at a specific point in time be IR, and the three channel images obtained from the RGB sensor be R, G, and B. The RGB sensor can be a CMOS or a CCD.
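
For illustration only, a minimal sketch of how such a synchronized capture might be represented in code; the container, field names, and sensor read methods are assumptions, not part of the disclosure:

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class FramePair:
    """One synchronized capture: the IR frame and the three RGB channel images."""
    ir: np.ndarray  # (H, W) image from the IR sensor
    r: np.ndarray   # (H, W) red channel from the visible light (RGB) sensor
    g: np.ndarray   # (H, W) green channel
    b: np.ndarray   # (H, W) blue channel

def capture_pair(ir_sensor, rgb_sensor) -> FramePair:
    """Read both sensors at the same instant; the two are assumed synchronized."""
    ir = ir_sensor.read()            # hypothetical sensor interface
    r, g, b = rgb_sensor.read_rgb()  # hypothetical sensor interface
    return FramePair(ir=ir, r=r, g=g, b=b)
```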

In step 120, a preprocessing process for removing noise from the sensed image may be performed. Images obtained directly from the sensors are generally noisy, so a preprocessing process is needed to handle them. The preprocessing process can include various image processing operations, including noise removal. For example, it can include preprocessing of the raw data obtained from the IR sensor and from the visible light sensor that acquires the RGB colors.
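
As an illustration of such preprocessing, a minimal sketch using a median filter for noise suppression; the choice of filter is an assumption, since the description only requires that noise be removed:

```python
import numpy as np
from scipy.ndimage import median_filter

def denoise(frame: np.ndarray, kernel: int = 3) -> np.ndarray:
    """Suppress sensor noise with a simple median filter (IR -> IR', R -> R', ...)."""
    return median_filter(frame, size=kernel)
```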

In step 130, the image may be mapped to a new color using a previously stored mapping table. The IR and RGB images obtained as above are generally not aligned, so a general registration method is applied to align them.

In step 140, the color of the image may be corrected using the mapped new color. The first step in the color correction process is image registration. Then, a new color is mapped per pixel according to the previously stored mapping table. When the current illuminance is input, the color of the image can be corrected by selecting the mapping table corresponding to that illuminance.

The color correction method at a low light level may further include a learning step for generating a previously stored mapping table.

The learning step for generating the previously stored mapping table may include the steps of: photographing an image at or above a predetermined illuminance for each of the colors R, G, and B; photographing an image at or below the predetermined illuminance for each of the colors R, G, and B; and photographing an IR image in the same illuminance environment as the image photographed at or below the predetermined illuminance. The learning step for generating the previously stored mapping table may be repeated for all of the predetermined colors.

A mapping table learned differently according to illuminance can be generated using the photographed images. In addition, the previously stored mapping table can be generated separately for each channel. This will be described in more detail with reference to FIGS. 2 to 6.

FIG. 2 is a block diagram illustrating a color correction process in a low light level according to an exemplary embodiment of the present invention.

As shown in FIG. 2, the process involves an IR sensor 211 and a visible light sensor for acquiring RGB color, that is, an RGB sensor 212, preprocessing processes for handling the acquired raw data, a previously learned mapping table 230, and a color correction step 240.

The IR sensor 211 and the RGB sensor 212 simultaneously sense images, and it is assumed that the two sensors are synchronized. Let the IR image generated at a specific point in time be IR, and the three channel images obtained from the visible light sensor be R, G, and B. For example, the RGB sensor can be a CMOS or a CCD. Images obtained directly from these sensors are generally noisy, so a preprocessing process is required to handle them.

When the preprocessing process for the IR sensor 211 is referred to as preprocessing process 1 (221), it may include various image processing operations, including noise removal. Let its output be IR'.

Let the preprocessing process for the RGB sensor 212 be preprocessing process 2 (222). It includes noise removal and demosaicing; that is, when the RGB sensor output is a Bayer pattern, it must be converted so that each pixel has RGB values. Let R', G', and B' be the images obtained through this preprocessing. Meanwhile, if the resolution of the RGB image from the RGB sensor 212 differs from that of the IR sensor 211, preprocessing process 1 (221) may include a scaler.
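
A minimal sketch of preprocessing processes 1 and 2 as described here, assuming a Bayer BG layout for the RGB sensor and OpenCV for demosaicing and scaling (both assumptions):

```python
import cv2
import numpy as np

def preprocess_rgb_sensor(bayer_raw: np.ndarray):
    """Preprocessing process 2: denoise and demosaic the Bayer output of the RGB sensor."""
    rgb = cv2.cvtColor(bayer_raw, cv2.COLOR_BayerBG2BGR)  # demosaic (layout assumed)
    rgb = cv2.medianBlur(rgb, 3)                          # simple noise removal
    b, g, r = cv2.split(rgb)
    return r, g, b                                        # R', G', B'

def preprocess_ir_sensor(ir_raw: np.ndarray, rgb_shape):
    """Preprocessing process 1: denoise the IR frame and, if the resolutions differ,
    rescale it to the RGB resolution (the scaler mentioned above)."""
    ir = cv2.medianBlur(ir_raw, 3)
    h, w = rgb_shape
    if ir.shape[:2] != (h, w):
        ir = cv2.resize(ir, (w, h), interpolation=cv2.INTER_LINEAR)
    return ir                                             # IR'
```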

The obtained IR image and RGB image are generally not aligned, and a general registration method is applied to align them. In other words, the first step in the color correction 240 is image registration. Then, mapping to a new color is performed on a pixel basis according to the mapping table 230. The learning process for generating the mapping table according to an embodiment of the present invention will be described in more detail with reference to FIG. 3.
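
Purely as a sketch of these two steps, assuming the sensor-to-sensor geometry has been estimated offline as a 3x3 homography H and that the mapping table 230 is stored as a dictionary keyed by the quantized (IR', R', G', B') quadruple; both representations are assumptions:

```python
import cv2
import numpy as np

def align_ir_to_rgb(ir_prime: np.ndarray, H: np.ndarray, rgb_shape) -> np.ndarray:
    """Registration: warp IR' onto the RGB pixel grid using a precomputed homography."""
    h, w = rgb_shape
    return cv2.warpPerspective(ir_prime, H, (w, h))

def map_colors(ir_al, r_p, g_p, b_p, table):
    """Per-pixel lookup (IR', R', G', B') -> (R'', G'', B'') using the stored table."""
    out_r = np.empty_like(r_p)
    out_g = np.empty_like(g_p)
    out_b = np.empty_like(b_p)
    for y in range(r_p.shape[0]):
        for x in range(r_p.shape[1]):
            key = (int(ir_al[y, x]), int(r_p[y, x]), int(g_p[y, x]), int(b_p[y, x]))
            # Fall back to the input color if the table has no entry for this case
            out_r[y, x], out_g[y, x], out_b[y, x] = table.get(
                key, (r_p[y, x], g_p[y, x], b_p[y, x]))
    return out_r, out_g, out_b
```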

3 is a flowchart illustrating a learning process for generating a mapping table according to an embodiment of the present invention.

The color correction method at a low light level may further include a learning step for generating a previously stored mapping table.

The learning step for generating the previously stored mapping table may include a step 310 of photographing an image at or above a predetermined illuminance for each of the colors R, G, and B, a step 320 of photographing an image at or below the predetermined illuminance for each of the colors, and a step 330 of photographing an IR image in the same illuminance environment as the image photographed at or below the predetermined illuminance. The learning step for generating the previously stored mapping table may then be repeated for all of the predetermined colors (step 340).

A mapping table learned differently according to illuminance can be generated using the photographed images. In addition, the previously stored mapping table can be generated separately for each channel.

In other words, the mapping table is obtained through learning. For example, each color is photographed both when it is bright and when the illuminance is low, and an IR image is obtained in the same low-light environment. Once such shots have been taken for a given color, a mapping table entry can be obtained; the RGB values captured under bright illumination serve as the true values.
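
As a sketch of how this learning could populate a table for one channel (here the R component), assuming per-pixel pairing of the bright and low-light shots and averaging of repeated observations; the description does not prescribe a particular aggregation rule:

```python
import numpy as np
from collections import defaultdict

def learn_r_table(ir_low: np.ndarray, r_low: np.ndarray, r_bright: np.ndarray) -> dict:
    """Build an R-component mapping table from one color patch.
    r_bright is the reference shot taken above the illuminance threshold;
    ir_low and r_low are taken below it, of the same scene.
    Returns {(IR', R'): R''} where R'' is the bright (true) value."""
    sums = defaultdict(float)
    counts = defaultdict(int)
    for ir_v, r_v, r_true in zip(ir_low.ravel(), r_low.ravel(), r_bright.ravel()):
        key = (int(ir_v), int(r_v))
        sums[key] += float(r_true)
        counts[key] += 1
    # Average repeated observations of the same (IR', R') pair
    return {k: sums[k] / counts[k] for k in sums}
```

Tables for the G and B components can be built the same way, and the procedure is repeated for every predetermined color and, as discussed with FIG. 6, for each illuminance level.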

4 is an exemplary diagram of a mapping table according to an embodiment of the present invention.

FIG. 4 is an exemplary diagram showing a mapping table on the assumption that there are a total of N cases. When the corresponding values of IR', R', G', and B' are input, the corresponding R", G", and B" are output. In this case, the size of the table can increase greatly according to the bit-depth of the color.
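
For a sense of scale, assuming 8-bit values, each of IR', R', G', and B' can take 256 levels, so the integrated table would have 256^4 (about 4.3 billion) possible input combinations, each storing an (R", G", B") triple; this is the growth with bit-depth referred to above.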

5 is an exemplary diagram illustrating a mapping table for each color component according to an embodiment of the present invention.

As described with reference to FIG. 4, the size of the table can increase greatly according to the bit-depth of the color. To address this problem, as shown in FIG. 5, a table may be provided for each of the three color channels. FIG. 5 shows the mapping table for the R component; tables for the G and B components can be created in the same form.
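
A minimal sketch of the per-channel lookup implied by FIG. 5, assuming each table is stored as a 2D array indexed by (IR' value, channel value); the array representation is an assumption:

```python
import numpy as np

def correct_channel(ir_aligned: np.ndarray, chan: np.ndarray,
                    table: np.ndarray) -> np.ndarray:
    """Look up the corrected value for every pixel of one color channel.
    `table` has shape (levels, levels), e.g. 256 x 256 for 8-bit data,
    with table[IR', C'] = C''."""
    return table[ir_aligned.astype(np.intp), chan.astype(np.intp)]

# Usage: one table per color component, in the same form for R, G, and B.
# r2 = correct_channel(ir_p, r_p, table_r)
# g2 = correct_channel(ir_p, g_p, table_g)
# b2 = correct_channel(ir_p, b_p, table_b)
```

With 8-bit values this reduces each table to 256 x 256 = 65,536 entries per channel.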

G "and B" images corrected for each pixel as shown in FIG. 2 can be obtained by using the integrated table method of FIG. 4 or the color component-based table method of FIG.

FIG. 6 is an overall block diagram for explaining a color correction process for each illumination according to an embodiment of the present invention.

As described above, the process involves an IR sensor 611 and a visible light sensor for acquiring RGB color, that is, an RGB sensor 612, preprocessing processes for handling the acquired raw data, a previously learned mapping table 630, and a color correction step 640.

The IR sensor 611 and the RGB sensor 612 simultaneously sense images, and it is assumed that the two sensors are synchronized. Let the IR image generated at a specific point in time be IR, and the three channel images obtained from the visible light sensor be R, G, and B. For example, the RGB sensor can be a CMOS or a CCD. Images obtained directly from these sensors are generally noisy, so a preprocessing process is required to handle them.

When the preprocessing process for the IR sensor 611 is referred to as preprocessing process 1 (621), it may include various image processing operations, including noise removal. Let its output be IR'.

Let the preprocessing process for the RGB sensor 612 be preprocessing process 2 (622). It includes noise removal and demosaicing; that is, when the RGB sensor output is a Bayer pattern, it must be converted so that each pixel has RGB values. Let R', G', and B' be the images obtained through this preprocessing. Meanwhile, if the image from the IR sensor 611 and the RGB image from the RGB sensor 612 have different resolutions, preprocessing process 1 (621) may include a scaler.

The obtained IR image and RGB image are generally not aligned, and a general registration method is applied to align them. In other words, the first step in the color correction 640 is image registration. Then, mapping to a new color is performed on a pixel basis according to the mapping table 630.

The mapping table 630 is obtained through learning. For example, each color is photographed both when it is bright and when the illuminance is low, and an IR image is obtained in the same low-light environment. Once such shots have been taken for a given color, a mapping table can be obtained; the RGB values captured under bright illumination serve as the true values.

However, depending on the illuminance, the color value may vary even though the IR value is the same. Accordingly, as shown in FIG. 6, mapping tables 630 learned differently according to illuminance can be generated in advance, and when the current illuminance is given as an input, the mapping table matching the given illuminance may be selected to perform the color correction.
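
A sketch of this illuminance-dependent selection, assuming the tables learned at several illuminance levels are kept in a dictionary keyed by the level and that the current measurement is matched to the nearest learned level; the matching rule is an assumption:

```python
def select_tables(tables_by_lux: dict, current_lux: float):
    """Pick the set of mapping tables learned at the illuminance closest to
    the measured one; tables_by_lux maps an illuminance level to its tables."""
    nearest = min(tables_by_lux, key=lambda lux: abs(lux - current_lux))
    return tables_by_lux[nearest]

# tables = select_tables(tables_by_lux, measured_lux)
# The selected tables are then used for the per-pixel mapping described above.
```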

7 is a diagram illustrating a configuration of a color correction system 700 in a low light level according to an embodiment of the present invention.

The color correction system 700 in the low light level according to the present embodiment may include a processor 710, a bus 720, a network interface 730, a memory 740, and a database 750. The memory 740 may include an operating system 741 and a color correction routine 742. The processor 710 may include a sensor unit 711, a preprocessor 712, a mapping unit 713, and a color correction unit 714. In other embodiments, the color correction system 700 may include more components than those shown in FIG. 7; however, most conventional components need not be shown explicitly. For example, the color correction system 700 may include other components such as a display or a transceiver.

The memory 740 is a computer-readable recording medium and may include a random access memory (RAM), a read-only memory (ROM), and a permanent mass storage device such as a disk drive. The memory 740 may also store program code for the operating system 741 and the color correction routine 742. These software components may be loaded from a computer-readable recording medium separate from the memory 740 using a drive mechanism (not shown). Such a separate computer-readable recording medium may include a floppy drive, a disk, a tape, a DVD/CD-ROM drive, or a memory card. In other embodiments, the software components may be loaded into the memory 740 via the network interface 730 rather than from a computer-readable recording medium.

The bus 720 may enable communication and data transfer between components of the color correction system 700. The bus 720 may be configured using a high-speed serial bus, a parallel bus, a Storage Area Network (SAN), and / or other suitable communication technology.

The network interface 730 may be a computer hardware component for connecting the color correction system 700 to a computer network. The network interface 730 may connect the color correction system 700 to a computer network via a wireless or wired connection.

The database 750 may store and maintain all information required for color correction in low light. Although FIG. 7 shows the database 750 built into the color correction system 700, the present invention is not limited thereto; depending on the system implementation method or environment, the database 750 may be omitted, or it may exist as an external database built on another system.

The processor 710 may be configured to process instructions of a computer program by performing basic arithmetic, logic, and input/output operations of the color correction system. The instructions may be provided to the processor 710 by the memory 740 or the network interface 730 via the bus 720. The processor 710 may be configured to execute program code for the sensor unit 711, the preprocessor 712, the mapping unit 713, and the color correction unit 714. Such program code may be stored in a recording device such as the memory 740.

The sensor unit 711, the preprocessor 712, the mapping unit 713, and the color correction unit 714 may be configured to perform the steps 110 to 140 of FIG.

The color correction system 700 may include a sensor unit 711, a preprocessing unit 712, a mapping unit 713, and a color correction unit 714.

The sensor unit 711 includes an IR sensor and a visible light sensor and can simultaneously sense an image using both sensors. The IR sensor and the RGB sensor sense the image at the same time, and it is assumed that the two sensors are synchronized. Let the IR image generated at a specific point in time be IR, and the three channel images obtained from the RGB sensor be R, G, and B. The RGB sensor can be a CMOS or a CCD.

The preprocessing unit 712 may perform a preprocessing process to remove noise from the sensed image. Images obtained directly from the sensors are generally noisy, so a preprocessing process is needed to handle them. The preprocessing process can include various image processing operations, including noise removal. For example, it can include preprocessing of the raw data obtained from the IR sensor and from the visible light sensor that acquires the RGB colors.

The mapping unit 713 can map the image to a new color using a previously stored mapping table. The IR and RGB images obtained as above are generally not aligned; the mapping unit 713 applies a general registration method to align them. The mapping unit 713 may also perform a learning process for generating the previously stored mapping table.

The mapping unit 713 can use the visible light sensor to photograph an image at or above a predetermined illuminance and at or below the predetermined illuminance for each of the colors R, G, and B. The IR sensor may be used to photograph an image in the same illuminance environment as the image taken at or below the predetermined illuminance, and the learning step for generating the previously stored mapping table may be repeated for all of the predetermined colors.

The mapping unit 713 may generate a mapping table differently learned for each illumination using the photographed images. In addition, a mapping table can be generated for each channel of three colors.

The color correction unit 714 may correct the color of the image with the mapped new color. The first step in the color correction process is image registration. Then, a new color is mapped per pixel according to the previously stored mapping table. When the current illuminance is input, the color correction unit 714 may correct the color of the image by selecting the mapping table corresponding to the input illuminance.

The apparatus described above may be implemented as hardware components, software components, and/or a combination of hardware and software components. For example, the apparatus and components described in the embodiments may be implemented within a computer system using, for example, a processor, a controller, an arithmetic logic unit (ALU), a digital signal processor, a microcomputer, a field programmable array (FPA), a programmable logic unit (PLU), a microprocessor, or any other device capable of executing and responding to instructions. The processing device may run an operating system (OS) and one or more software applications on that operating system. The processing device may also access, store, manipulate, process, and generate data in response to the execution of the software. For ease of understanding, the processing device is sometimes described as being used singly, but those skilled in the art will recognize that it may include a plurality of processing elements and/or a plurality of types of processing elements. For example, the processing device may comprise a plurality of processors, or one processor and one controller. Other processing configurations, such as parallel processors, are also possible.

The software may include a computer program, code, instructions, or a combination of one or more of these, and may configure the processing device to operate as desired or may instruct the processing device independently or collectively. The software and/or data may be embodied permanently or temporarily in any type of machine, component, physical device, virtual equipment, computer storage medium or device, or transmitted signal wave, so as to be interpreted by the processing device or to provide instructions or data to the processing device. The software may be distributed over networked computer systems and stored or executed in a distributed manner. The software and data may be stored on one or more computer-readable recording media.

The method according to an embodiment may be implemented in the form of program instructions that can be executed through various computer means and recorded on a computer-readable medium. The computer-readable medium may include program instructions, data files, data structures, and the like, alone or in combination. The program instructions recorded on the medium may be those specially designed and configured for the embodiments or may be known and available to those skilled in computer software. Examples of computer-readable media include magnetic media such as hard disks, floppy disks, and magnetic tape; optical media such as CD-ROMs and DVDs; magneto-optical media such as floptical disks; and hardware devices specifically configured to store and execute program instructions, such as ROM, RAM, and flash memory. Examples of program instructions include machine language code such as that produced by a compiler, as well as high-level language code that can be executed by a computer using an interpreter. The hardware devices described above may be configured to operate as one or more software modules to perform the operations of the embodiments, and vice versa.

While the present invention has been particularly shown and described with reference to exemplary embodiments thereof, the invention is not limited to the disclosed exemplary embodiments. For example, appropriate results may be achieved even if the described techniques are performed in a different order than the described methods, and/or if components of the described systems, structures, devices, or circuits are combined or assembled in a different form, or are replaced or substituted by other components or equivalents.

Therefore, other implementations, other embodiments, and equivalents to the claims are also within the scope of the following claims.

Claims (12)

In the method for correcting color in low light,
Sensing images simultaneously using an IR sensor and a visible light sensor;
Performing a preprocessing process to remove noise of the sensed image;
Mapping the image into a new color using a previously stored mapping table;
Correcting the color of the image with the mapped new color; And
A learning step for generating the previously stored mapping table
Wherein the learning step for generating the previously stored mapping table comprises:
Capturing an image at a predetermined illuminance or more for each color of R, G, and B;
Capturing an image at a predetermined illuminance or less with respect to each color of R, G, and B; And
Capturing an IR image in the same illumination environment as the image photographed at a predetermined illuminance or less;
Wherein the learning step for generating the previously stored mapping table is repeatedly performed for all of the predetermined colors.
Claim 2 (deleted)
Claim 3 (deleted)
The method according to claim 1,
Wherein a mapping table learned differently according to illuminance is generated using the photographed images.
The method according to claim 1,
Wherein the previously stored mapping table is generated for each channel.
The method according to claim 1,
Wherein the step of correcting the color of the image with the mapped new color comprises:
Selecting a mapping table corresponding to the input illuminance and correcting the color of the image with it when the current illuminance is input.
In a low-illuminance color correction system,
A sensor unit including an IR sensor and a visible light sensor and simultaneously sensing an image using the IR sensor and the visible light sensor;
A preprocessing unit for performing a preprocessing process to remove noise of the sensed image;
A mapping unit for mapping the image into a new color using a previously stored mapping table; And
And a color correction unit for correcting the color of the image with the mapped new color,
Wherein the mapping unit:
Photographs an image at or above a predetermined illuminance and at or below the predetermined illuminance for each of the colors R, G, and B using the visible light sensor,
Photographs an image using the IR sensor in the same illuminance environment as the image photographed at or below the predetermined illuminance, and
Repeatedly performs the learning step for generating the previously stored mapping table for all of the predetermined colors.
Claim 8 (deleted)
Claim 9 (deleted)
The system according to claim 7,
Wherein the mapping unit generates a mapping table learned differently according to illuminance using the photographed images.
The system according to claim 7,
Wherein the mapping unit generates the previously stored mapping table on a channel-by-channel basis.
The system according to claim 7,
Wherein the color correction unit corrects the color of the image by selecting a mapping table corresponding to the input illuminance when the current illuminance is input.
KR1020150076631A 2015-05-29 2015-05-29 Method and Apparatus for Color correction in low light environment KR101634904B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020150076631A KR101634904B1 (en) 2015-05-29 2015-05-29 Method and Apparatus for Color correction in low light environment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020150076631A KR101634904B1 (en) 2015-05-29 2015-05-29 Method and Apparatus for Color correction in low light environment

Publications (1)

Publication Number Publication Date
KR101634904B1 (en) 2016-06-29

Family

ID=56366013

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020150076631A KR101634904B1 (en) 2015-05-29 2015-05-29 Method and Apparatus for Color correction in low light environment

Country Status (1)

Country Link
KR (1) KR101634904B1 (en)


Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20000020833A (en) * 1998-09-24 2000-04-15 전주범 Apparatus and method for compensating digital image in real time
KR100731812B1 (en) 2006-08-11 2007-06-22 엠텍비젼 주식회사 Color deviation compensating apparatus and method, image processor using it, recorded medium
KR20110023694A (en) * 2009-08-28 2011-03-08 한국전자통신연구원 System for converting color of images cinematograph and controlling method thereof
KR101141844B1 (en) * 2010-11-15 2012-05-07 한양대학교 산학협력단 Method and system for enhancing image under low illumination

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
최원희 et al., "Low-Illuminance Color Image Generation Using a Visible-Near Infrared Light Receiving Structure," Conference of the Institute of Electronics Engineers of Korea, June 2011, pp. 1664-1667. *


Legal Events

Date Code Title Description
E90F Notification of reason for final refusal
E701 Decision to grant or registration of patent right
GRNT Written decision to grant
FPAY Annual fee payment

Payment date: 20190408

Year of fee payment: 4