CN114143443B - Dual-sensor imaging system and imaging method thereof - Google Patents

Dual-sensor imaging system and imaging method thereof

Info

Publication number
CN114143443B
Authority
CN
China
Prior art keywords
image
color
infrared
scene
sensor
Prior art date
Legal status
Active
Application number
CN202011540274.1A
Other languages
Chinese (zh)
Other versions
CN114143443A (en)
Inventor
彭诗渊
郑书峻
黄旭鍊
李运锦
赖国铭
Current Assignee
Altek Semiconductor Corp
Original Assignee
Altek Semiconductor Corp
Priority date
Filing date
Publication date
Application filed by Altek Semiconductor Corp filed Critical Altek Semiconductor Corp
Publication of CN114143443A
Application granted
Publication of CN114143443B


Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/45Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from two or more image sensors being of different type or operating in different modes, e.g. with a CMOS sensor for moving images in combination with a charge-coupled device [CCD] for still images
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/50Depth or shape recovery
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/64Computer-aided capture of images, e.g. transfer from script file into camera, check of taken image quality, advice or proposal for image composition or decision on when to take image
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70Circuitry for compensating brightness variation in the scene
    • H04N23/743Bracketing, i.e. taking a series of images with varying exposure conditions

Abstract

The invention provides a dual-sensor imaging system and an imaging method thereof. The dual-sensor imaging system includes at least one color sensor, at least one infrared sensor, a storage device, and a processor. The processor is configured to load and execute a computer program stored in the storage device to: identify an imaging scene of the dual-sensor imaging system; control the color sensor and the infrared sensor to acquire a plurality of color images and a plurality of infrared images, respectively, using a plurality of exposure conditions suitable for the imaging scene; adaptively select a combination of the color images and the infrared images that reveals details of the imaging scene; and fuse the selected color image and infrared image to generate a scene image having the details of the imaging scene.

Description

Dual-sensor imaging system and imaging method thereof
Technical Field
The present disclosure relates to an imaging system and method, and more particularly, to a dual-sensor imaging system and method.
Background
The exposure conditions of a camera (including aperture, shutter, and sensitivity) affect the quality of the captured image, so many cameras automatically adjust the exposure conditions while capturing an image to obtain a clear and bright image. However, in a low-light or high-contrast scene such as a backlit scene, the exposure conditions chosen by the camera may result in excessive noise or overexposure in parts of the image, so that satisfactory image quality cannot be achieved in every region.
In view of this, a new image sensor architecture has been adopted in the prior art, which exploits the high photosensitivity of infrared (IR) pixels by interleaving IR pixels among the color pixels of the image sensor to assist in brightness detection. For example, FIG. 1 is a schematic diagram of image acquisition using an existing image sensor. Referring to FIG. 1, in the conventional image sensor 10, red (R), green (G), and blue (B) color pixels are arranged together with interleaved infrared (I) pixels. Thus, the image sensor 10 is able to combine the color information 12 acquired by the R, G, and B color pixels with the luminance information 14 acquired by the I pixels to obtain an image 16 with adequate color and luminance.
However, under the above single-image-sensor architecture, every pixel in the image sensor shares the same exposure condition, so an image can only be acquired with an exposure condition suited to either the color pixels or the infrared pixels. As a result, the characteristics of the two kinds of pixels cannot be effectively exploited to improve the image quality of the acquired image.
Disclosure of Invention
The invention provides a dual-sensor imaging system and an imaging method thereof, which use independently configured color and infrared sensors to acquire a plurality of images under different exposure conditions, select a color image and an infrared image acquired under suitable exposure conditions, and fuse them into a result image, so that the infrared image complements the texture details of the color image and the image quality of the captured image is improved.
The dual-sensor camera system comprises at least one color sensor, at least one infrared sensor, a storage device and a processor coupled with the color sensor, the infrared sensor and the storage device. The processor is configured to load and execute a computer program stored in a storage device to: identifying an imaging scene of a dual-sensor imaging system; controlling a color sensor and an infrared sensor to respectively acquire a plurality of color images and a plurality of infrared images by adopting a plurality of exposure conditions suitable for an imaging scene; adaptively selecting a combination of color images and infrared images that reveals details of the imaging scene; and fusing the selected color image and the infrared image to generate a scene image having details of the captured scene.
The image pickup method of the double-sensor image pickup system is suitable for the double-sensor image pickup system comprising at least one color sensor, at least one infrared sensor and a processor. The method comprises the following steps: identifying an imaging scene of a dual-sensor imaging system; controlling a color sensor and an infrared sensor to respectively acquire a plurality of color images and a plurality of infrared images by adopting a plurality of exposure conditions suitable for an imaging scene; adaptively selecting a combination of color images and infrared images that reveals details of the imaging scene; and fusing the selected color image and the infrared image to generate a scene image having details of the captured scene.
Based on the above, the dual-sensor image capturing system and the image capturing method thereof of the present invention acquire a plurality of images on the color sensor and the infrared sensor which are independently configured, and select a combination of the color image and the infrared image which can reveal details of the image capturing scene for fusion, thereby generating a scene image with details of the image capturing scene and improving the image quality of the captured image.
In order to make the present disclosure more comprehensible, embodiments accompanied with figures are described in detail below.
Drawings
FIG. 1 is a schematic diagram of a prior art image acquisition using an image sensor;
FIG. 2 is a schematic diagram illustrating the use of an image sensor to acquire an image in accordance with an embodiment of the present invention;
FIG. 3 is a block diagram of a dual sensor camera system according to one embodiment of the present invention;
FIG. 4 is a flow chart of an imaging method of a dual sensor imaging system according to an embodiment of the present invention;
FIG. 5 is a flow chart of an imaging method of a dual sensor imaging system according to an embodiment of the present invention;
FIG. 6 is an example of an imaging method of a dual sensor imaging system according to an embodiment of the present invention;
FIG. 7 is a flow chart of an imaging method of a dual sensor imaging system according to an embodiment of the present invention;
FIG. 8 is an example of an imaging method of a dual sensor imaging system according to an embodiment of the present invention;
FIG. 9 is a flow chart of an imaging method of a dual sensor imaging system according to an embodiment of the present invention.
Symbol description
10. 20: image sensor
12: color information
14: luminance information
16: image processing apparatus
22: color sensor
22a, 62, 82: color image
24: infrared sensor
24a, 64, 84: infrared image
26, 66, 86: Scene image
30: dual sensor camera system
32: color sensor
34: infrared sensor
36: storage device
38: processor and method for controlling the same
62a: face region
82a, 84a, 86a: cola tank area
R, G, B, I: pixel arrangement
S402 to S408, S502 to S510, S702 to S708, S902 to S914: Steps
Detailed Description
FIG. 2 is a schematic diagram illustrating capturing an image using an image sensor in accordance with an embodiment of the present invention. Referring to FIG. 2, the image sensor 20 of the embodiment of the present invention adopts a dual-sensor architecture in which a color sensor 22 and an infrared (IR) sensor 24 are independently configured. Using the respective characteristics of the color sensor 22 and the IR sensor 24, a plurality of images are acquired under a plurality of exposure conditions suitable for the current imaging scene, and a color image 22a and an IR image 24a with appropriate exposure conditions are selected. The IR image 24a is then used, by way of image fusion, to complement the texture details lacking in the color image 22a, thereby obtaining a scene image 26 with good color and texture details.
Fig. 3 is a block diagram of a dual sensor camera system according to an embodiment of the present invention. Referring to fig. 3, the dual-sensor camera system 30 of the present embodiment can be configured in an electronic device such as a mobile phone, a tablet computer, a notebook computer, a navigation device, a driving recorder, a digital camera, a digital video camera, etc. for providing a camera function. The dual sensor camera system 30 includes at least one color sensor 32, at least one infrared sensor 34, a memory device 36, and a processor 38, the functions of which are as follows:
the color sensor 32 may, for example, comprise a charge coupled device (Charge Coupled Device, CCD), a complementary metal oxide semiconductor (Complementary Metal-Oxide Semiconductor, CMOS) device, or other type of photosensitive device, and may sense light intensity to produce an image of the camera scene. The color sensor 32 is, for example, a red, green and blue (RGB) image sensor, which includes red (R), green (G) and blue (B) color pixels, and is configured to acquire color information such as red light, green light and blue light in the imaging scene, and combine the color information to generate a color image of the imaging scene.
The infrared sensor 34 includes, for example, a CCD, a CMOS device, or other kind of photosensitive device, which is capable of sensing infrared light by adjusting a wavelength sensing range of the photosensitive device. The infrared sensor 34 acquires infrared light information in the imaging scene using the above-described photosensitive device as a pixel, for example, and synthesizes the infrared light information to generate an infrared image of the imaging scene.
The storage device 36 is, for example, any type of fixed or removable random access memory (RAM), read-only memory (ROM), flash memory, hard disk, or the like, or a combination thereof, and is used to store a computer program executable by the processor 38. In some embodiments, the storage device 36 may also store, for example, the color images acquired by the color sensor 32 and the infrared images acquired by the infrared sensor 34.
The processor 38 is, for example, a central processing unit (CPU), or another programmable general-purpose or special-purpose microprocessor, microcontroller, digital signal processor (DSP), programmable controller, application-specific integrated circuit (ASIC), programmable logic device (PLD), or other similar device, or a combination of these devices, and the invention is not limited in this regard. In this embodiment, the processor 38 may load a computer program from the storage device 36 to execute the image capturing method of the dual sensor image capturing system according to the embodiment of the present invention.
Fig. 4 is a flowchart of an image capturing method of a dual sensor image capturing system according to an embodiment of the present invention. Referring to fig. 3 and fig. 4, the method of the present embodiment is applicable to the dual-sensor imaging system 30, and the detailed steps of the imaging method of the present embodiment are described below with respect to each device of the dual-sensor imaging system 30.
In step S402, the imaging scene of the dual sensor imaging system 30 is identified by the processor 38. In some embodiments, the processor 38, for example, controls at least one of the color sensor 32 and the infrared sensor 34 to capture at least one standard image of the imaging scene using standard exposure conditions, and uses these standard images to identify the imaging scene. The standard exposure conditions include parameters such as aperture, shutter, and sensitivity determined by existing photometry techniques. The processor 38 identifies the imaging scene according to the strength or distribution of image parameters such as hue, brightness (value), chroma, and white balance of the image acquired under these exposure conditions, the identified scene including, for example, the location of the imaging scene (indoor or outdoor), the light source (high or low light source), the contrast (high or low contrast), and the type (object or portrait) or state (dynamic or static) of the imaged subject. In other embodiments, the processor 38 may also use a positioning method to identify the imaging scene, or directly receive a user operation to set the imaging scene, which is not limited herein.
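As one illustrative, non-limiting sketch of this step, the scene could be classified from simple statistics of a standard image; the thresholds, category names, and the identify_scene function below are assumptions for illustration and are not taken from the patent.

```python
import cv2
import numpy as np

def identify_scene(standard_image_bgr):
    """Classify the imaging scene from one standard image.

    The thresholds (60, 55) and the two categories are illustrative
    assumptions, not values specified by the patent.
    """
    hsv = cv2.cvtColor(standard_image_bgr, cv2.COLOR_BGR2HSV)
    value = hsv[:, :, 2].astype(np.float32)   # brightness (Value) channel

    mean_brightness = float(value.mean())     # overall scene brightness
    contrast = float(value.std())             # spread of brightness values

    return {
        "light_source": "low" if mean_brightness < 60 else "high",
        "contrast": "high" if contrast > 55 else "low",
    }
```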
In step S404, the processor 38 controls the color sensor 32 and the infrared sensor 34 to acquire a plurality of color images and a plurality of infrared images, respectively, using a plurality of exposure conditions suitable for the identified imaging scene. In some embodiments, the processor 38 controls the color sensor 32 and the infrared sensor 34 to acquire images with exposure times shorter or longer than the exposure time of the standard exposure conditions, for example, and the difference between the exposure times of the color images is, for example, any exposure value (EV) between -3 and 3, which is not limited herein. For example, if image A is twice as bright as image B, the exposure value of image A is 1 EV higher than that of image B, and so on; the exposure value may also be a fraction (e.g., +0.3 EV), without limitation.
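For illustration only, the following sketch derives a set of bracketed exposure times from a base exposure time using EV offsets; the function name and the particular offsets are assumptions, not parameters specified by the patent.

```python
def bracketed_exposure_times(base_exposure_s, ev_offsets=(-2.0, -1.0, 0.0, 1.0, 2.0)):
    """Derive bracketed exposure times from the standard exposure time.

    +1 EV doubles the exposure and -1 EV halves it; fractional offsets
    (e.g. +0.3 EV) are also allowed. The offsets chosen here are an
    illustrative subset of the -3..3 EV range mentioned above.
    """
    return [base_exposure_s * (2.0 ** ev) for ev in ev_offsets]

# Example: bracketed_exposure_times(1 / 60)
# -> approximately [1/240, 1/120, 1/60, 1/30, 1/15]
```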
In step S406, the processor 38 adaptively selects a combination of the color images and the infrared images that reveals the details of the imaging scene. In some embodiments, the processor 38 may, for example, control the color sensor 32 to acquire color images with an appropriate exposure time, so that at least part of the color details of the imaging scene is preserved and the subsequently fused image is ensured to reveal the color details of the imaging scene. The appropriate exposure time is, for example, an exposure time shorter, by a preset length of time, than the exposure time that would cause overexposure of the acquired image, the preset length being, for example, any value from 0.01 to 1 second, without limitation.
In some embodiments, the processor 38 may, for example, select one of the color images as a reference image based on the color details of each color image, identify at least one defective area in the reference image that lacks texture details, and then, based on the texture details of the regions of the respective infrared images that correspond to the defective area, select one of the infrared images as the image to be fused with the reference image.
In detail, since the color sensor 32 acquires each color image with only a single exposure condition at a time, when the imaging scene has a low light source or high contrast, each color image may contain regions with high noise, overexposure, or underexposure (i.e., the above-mentioned defective regions). At this time, taking advantage of the high photosensitivity of the infrared sensor 34, the processor 38 may select, for a given defective region, an infrared image that has texture details of that defective region from the plurality of previously acquired infrared images, and use it to complement the texture details of the defective region in the color image.
In step S408, the selected color image and infrared image are fused by the processor 38 to generate a scene image having the details of the imaging scene. In some embodiments, the processor 38 directly fuses the entire selected color image and infrared image, for example, by calculating an average or weighted average of the pixel values of corresponding pixels in the two images, or by other image fusion means. In other embodiments, the processor 38 may instead, for only the defective area of the color image, use the corresponding region of the infrared image to fill in or replace the image of that defective area, without limitation.
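As a minimal sketch of the first option (whole-image weighted-average fusion), assuming the two images are already aligned and of equal size; the fixed weight below is an illustrative choice, not a value from the patent.

```python
import numpy as np

def fuse_weighted_average(color_rgb, ir_gray, color_weight=0.6):
    """Fuse an aligned color image and infrared image by a per-pixel
    weighted average; color_weight = 0.6 is an illustrative value."""
    ir_rgb = np.repeat(ir_gray[..., None], 3, axis=2).astype(np.float32)
    fused = color_weight * color_rgb.astype(np.float32) + (1.0 - color_weight) * ir_rgb
    return np.clip(fused, 0, 255).astype(np.uint8)
```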
By the above method, the dual-sensor image capturing system 30 not only can select a color image with better color details, but also can fill or replace an area with insufficient texture details in the color image by using an image of a corresponding area in the infrared image, and finally generate an image which can comprise all details (color and texture details) of the captured scene, thereby improving the image quality of the captured image.
Fig. 5 is a flowchart of an image capturing method of a dual sensor image capturing system according to an embodiment of the present invention. Referring to fig. 3 and 5, the present embodiment further illustrates a detailed implementation of the above embodiment for fusing the whole image. The method of the present embodiment is applicable to the dual-sensor imaging system 30, and the detailed steps of the imaging method of the present embodiment will be described below with respect to each element of the dual-sensor imaging system 30.
In step S502, one of the color images is selected as a reference image by the processor 38 according to the color details of each color image. In one embodiment, the processor 38 selects, for example, the color image with the most color details as the reference image. The amount of color detail may be determined, for example, from the size of the overexposed or underexposed areas in the color image. In detail, the pixels of an overexposed region are closer to white and the pixels of an underexposed region are closer to black, so these regions carry less color detail. Therefore, the more such areas a color image contains, the less color detail it has, and the processor 38 can accordingly determine which color image has the most color detail and use it as the reference image. In other embodiments, the processor 38 may also judge the amount of color detail based on the contrast, saturation, or other image parameters of each color image, without limitation.
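For illustration, the following sketch scores color detail as the fraction of pixels that are neither overexposed nor underexposed and picks the highest-scoring image as the reference; the thresholds and function names are assumptions.

```python
import numpy as np

def color_detail_score(color_rgb, low=10, high=245):
    """Fraction of pixels that are neither underexposed (< low) nor
    overexposed (> high); the thresholds are illustrative."""
    luma = color_rgb.astype(np.float32).mean(axis=2)
    return float(np.logical_and(luma > low, luma < high).mean())

def select_reference_image(color_images):
    """Pick the color image with the most color detail as the reference."""
    return max(color_images, key=color_detail_score)
```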
In step S504, at least one defective area in the reference image lacking texture details is identified by the processor 38. The defect area is, for example, the above overexposed area or underexposed area, or the area with higher noise obtained under the low light source, which is not limited herein.
In step S506, one of the infrared images is selected by the processor 38 according to the texture details of the region corresponding to the defective area in each infrared image. In an embodiment, the processor 38 selects, for example, the infrared image whose region corresponding to the defective area has the most texture details as the image to be fused with the reference image. The processor 38 can judge the amount of texture detail based on, for example, the contrast of each infrared image or other image parameters, which is not limited herein.
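A minimal sketch of this selection is given below, using Laplacian energy inside the defective area as one possible stand-in for the contrast-based texture measure; the metric, threshold-free scoring, and function names are assumptions.

```python
import cv2
import numpy as np

def texture_detail_score(ir_gray, defect_mask):
    """Mean Laplacian energy of the infrared image inside the defective
    area (defect_mask > 0); an illustrative texture-detail metric."""
    lap = cv2.Laplacian(ir_gray.astype(np.float32), cv2.CV_32F)
    return float(np.abs(lap)[defect_mask > 0].mean())

def select_ir_for_defect(ir_images, defect_mask):
    """Pick the infrared image with the most texture detail in the defect area."""
    return max(ir_images, key=lambda ir: texture_detail_score(ir, defect_mask))
```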
In step S508, feature extraction is performed on the selected color image and infrared image by the processor 38 to obtain a plurality of features in the color image and the infrared image, and the color image and the infrared image are aligned according to the correspondence between the extracted features. It should be noted that the above feature extraction and matching are merely examples; in other embodiments, the processor 38 may align the color image and the infrared image using other kinds of image alignment methods, which is not limited herein.
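As one hedged example of such feature-based alignment (the patent does not prescribe a specific method), the sketch below matches ORB features and warps the infrared image onto the color image with a homography.

```python
import cv2
import numpy as np

def align_ir_to_color(color_gray, ir_gray):
    """Align an infrared image to the color image by matching ORB features
    and estimating a homography (one of many possible alignment methods)."""
    orb = cv2.ORB_create(nfeatures=1000)
    kp_c, des_c = orb.detectAndCompute(color_gray, None)
    kp_i, des_i = orb.detectAndCompute(ir_gray, None)

    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des_i, des_c), key=lambda m: m.distance)[:200]

    src = np.float32([kp_i[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp_c[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)

    h, w = color_gray.shape[:2]
    return cv2.warpPerspective(ir_gray, H, (w, h))
```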
At step S510, the aligned infrared image is image fused with a reference image by processor 38 to generate a scene image that complements the texture details of the defect region.
In some embodiments, processor 38 performs image fusion of the infrared image with the reference image, for example, by calculating an average or weighted average of pixel values of corresponding pixels in the color image and the entire image of the infrared image.
In some embodiments, processor 38 converts the color space of the reference image from, for example, an RGB color space to a YUV color space, and replaces the luminance component of the converted reference image with the luminance component of the infrared image, and then converts the color space of the replaced reference image back to the RGB color space to generate the scene image. In other embodiments, the processor 38 may also convert the color space of the reference image to YCbCr, CMYK or other color space, and convert the color space back to the original color space after replacing the luminance component, which is not limited in the embodiment.
In detail, since the luminance component of the infrared image has a better signal-to-noise ratio (SNR) and includes more texture details of the imaging scene, the luminance component of the infrared image is directly substituted for the luminance component of the reference image, so that the texture details in the reference image can be greatly increased.
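Taken together, steps S508 to S510 amount to replacing the reference image's luminance with the aligned infrared luminance. A minimal sketch of this whole-image replacement is given below, assuming the infrared image has already been aligned to and sized like the reference image; OpenCV's YUV conversion is used purely for illustration.

```python
import cv2
import numpy as np

def replace_luminance(reference_rgb, aligned_ir_gray):
    """Replace the Y (luminance) channel of the reference color image with
    the aligned infrared image, then convert back to RGB."""
    yuv = cv2.cvtColor(reference_rgb, cv2.COLOR_RGB2YUV)
    yuv[:, :, 0] = aligned_ir_gray.astype(np.uint8)
    return cv2.cvtColor(yuv, cv2.COLOR_YUV2RGB)
```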
By the above method, the dual-sensor image capturing system 30 can use the infrared image to increase the texture details of the color image, especially for the region with insufficient texture details, so as to improve the image quality of the captured image.
For example, FIG. 6 is an example of an imaging method of a dual sensor imaging system according to an embodiment of the present invention. Referring to FIG. 6, in the present embodiment, by using the image capturing method of FIG. 5 described above, a color image 62 with the greatest color details is selected as a reference image, and for a defect area (e.g., a face area 62a) lacking texture details in the color image 62, an infrared image 64 with the greatest texture details of the defect area is selected from a plurality of infrared images obtained under different exposure conditions and is used for image fusion with the color image 62, so as to obtain a scene image 66 with more color details and texture details.
Fig. 7 is a flowchart of an image capturing method of a dual sensor image capturing system according to an embodiment of the present invention. Referring to fig. 3 and fig. 7, the present embodiment further illustrates a detailed implementation of the above embodiment for fusing defective areas. The method of the present embodiment is applicable to the dual-sensor imaging system 30, and the detailed steps of the imaging method of the present embodiment will be described below with respect to each element of the dual-sensor imaging system 30.
In step S702, one of the color images is selected as a reference image by the processor 38 according to the color details of each color image. In step S704, at least one defective area in the reference image lacking texture details is identified by the processor 38. In step S706, one of the infrared images is selected by the processor 38 based on the texture details of the image corresponding to the defective area among the infrared images. The implementation manners of the steps S702 to S706 are the same as or similar to the steps S502 to S506 of the previous embodiment, respectively, so details thereof are not repeated here.
Unlike the previous embodiment, in step S708 the processor 38 replaces the luminance component of the image of the defective area in the reference image with the luminance component of the region of the selected infrared image corresponding to the defective area, so as to generate a scene image that complements the texture details of the defective area.
In some embodiments, processor 38 converts the color space of the reference image from, for example, an RGB color space to a YUV color space, and replaces the luminance component of the image of the defective area of the converted reference image with the luminance component of the infrared image corresponding to the defective area, and then converts the color space of the replaced reference image back to the RGB color space to generate the scene image. In other embodiments, the processor 38 may also convert the color space of the reference image to YCbCr, CMYK or other color space, and convert the color space back to the original color space after replacing the luminance component, which is not limited in the embodiment.
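The following sketch restricts the luminance replacement of the earlier example to the defective area only, using a binary mask of that area; the mask representation is an assumption for illustration.

```python
import cv2
import numpy as np

def replace_luminance_in_defect(reference_rgb, aligned_ir_gray, defect_mask):
    """Replace luminance only where defect_mask > 0, leaving the rest of
    the reference image untouched, then convert back to RGB."""
    yuv = cv2.cvtColor(reference_rgb, cv2.COLOR_RGB2YUV)
    mask = defect_mask > 0
    yuv[:, :, 0][mask] = aligned_ir_gray.astype(yuv.dtype)[mask]
    return cv2.cvtColor(yuv, cv2.COLOR_YUV2RGB)
```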
By the above method, the dual-sensor image capturing system 30 can use the infrared image to complement the area with insufficient texture details in the color image, thereby improving the image quality of the captured image.
For example, FIG. 8 is an example of an imaging method of a dual sensor imaging system according to an embodiment of the present invention. Referring to FIG. 8, in the present embodiment, by using the image capturing method of FIG. 7, a color image 82 with the greatest color details is selected as a reference image, and for a defective area (for example, a cola can area 82a) lacking texture details in the color image 82, an infrared image 84 with the greatest texture details of the defective area is selected from a plurality of infrared images obtained under different exposure conditions, and the luminance component of the cola can area 82a is replaced with the luminance component of the corresponding cola can area 84a in the infrared image 84, so as to obtain a scene image 86 with more texture details in the cola can area 86a.
It should be noted that in some embodiments, the texture details of certain defective areas in the color image may not be enhanced or complemented by the infrared image due to specific factors, such as parallax between the color sensor 32 and the infrared sensor 34, which may cause part of the scene to be occluded from the view of the infrared sensor 34. In this case, embodiments of the present invention provide an alternative way to increase the texture detail of the defective area so as to maximize the image quality of the captured image.
Fig. 9 is a flowchart of an image capturing method of a dual sensor image capturing system according to an embodiment of the present invention. Referring to fig. 3 and 9, the method of the present embodiment is applicable to the dual-sensor imaging system 30, and the detailed steps of the imaging method of the present embodiment are described below with respect to each device of the dual-sensor imaging system 30.
In step S902, at least one of the color sensor 32 and the infrared sensor 34 is controlled by the processor 38 to acquire at least one standard image of the imaging scene using standard exposure conditions, and to identify the imaging scene using these standard images. The definition of the standard exposure condition and the identification manner of the imaging scene are as described in the foregoing embodiments, and are not described herein.
In step S904, the processor 38 controls the color sensor 32 and the infrared sensor 34 to acquire a plurality of color images and a plurality of infrared images, respectively, using a plurality of exposure conditions suitable for the recognized imaging scene. In step S906, one of the color images is selected as a reference image by the processor 38 according to the color details of each color image. In step S908, at least one defective area in the reference image lacking texture detail is identified by the processor 38. The implementation manners of the steps S904 to S908 are the same as or similar to the steps S404, S702 to S704 of the previous embodiment, respectively, so details thereof are not repeated here.
Unlike the previous embodiment, in step S910 the processor 38 determines whether any of the above-mentioned plurality of infrared images includes the texture details of the defective area in the reference image. The processor 38, for example, examines the regions of the infrared images corresponding to the defective area to determine whether the view of the infrared sensor 34 is occluded and whether the infrared images can be used to complement the texture details of the defective area in the reference image.
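As an illustrative sketch of this check (the patent does not specify a particular criterion), a Laplacian-energy measure inside the defective area could be thresholded to decide whether any infrared image carries usable texture there; the threshold and function name are assumptions.

```python
import cv2
import numpy as np

def ir_covers_defect(ir_images, defect_mask, min_score=2.0):
    """Return True if any infrared image shows usable texture detail inside
    the defective area (defect_mask > 0); min_score is illustrative."""
    for ir_gray in ir_images:
        lap = cv2.Laplacian(ir_gray.astype(np.float32), cv2.CV_32F)
        if float(np.abs(lap)[defect_mask > 0].mean()) >= min_score:
            return True
    return False
```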
If an infrared image includes the texture details of the defective area, then in step S912 the processor 38 replaces the luminance component of the image of the defective area in the reference image with the luminance component of the region of that infrared image corresponding to the defective area, so as to generate a scene image that complements the texture details of the defective area. The implementation of step S912 is the same as or similar to step S708 in the previous embodiment, so the details thereof are not described here again.
If no infrared image includes the texture details of the defective area, then in step S914 the processor 38 controls the color sensor 32 to acquire multiple color images with exposure times longer or shorter than the exposure time of the reference image and performs high dynamic range (HDR) processing to generate a scene image with the texture details of the defective area.
In some embodiments, based on the exposure time of the selected reference image, the processor 38 controls the color sensor 32 to acquire a color image with a shorter exposure time and a color image with a longer exposure time, and performs the HDR processing on these together with the color image acquired using the original exposure time. That is, by selecting the regions with good color and texture details from the three color images to complement the regions lacking details in the other color images, an HDR image with good detail in both bright and dark portions is obtained as the final output scene image.
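As a hedged stand-in for this HDR step (the patent does not name a specific algorithm), the sketch below merges the three bracketed color images with OpenCV's Mertens exposure fusion, which requires no exposure-time metadata.

```python
import cv2
import numpy as np

def merge_bracketed_exposures(color_images):
    """Merge short/normal/long exposures into one well-exposed image using
    Mertens exposure fusion; one possible realization of the HDR processing."""
    fused = cv2.createMergeMertens().process(color_images)  # float result, roughly in [0, 1]
    return np.clip(fused * 255, 0, 255).astype(np.uint8)
```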
In some embodiments, the processor 38 may perform Noise Reduction (NR) processing, such as two-dimensional spatial noise reduction (2D spatial denoise), on the HDR image to reduce noise in the HDR image and improve the image quality of the final output image.
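One illustrative choice for such a 2D spatial noise reduction step is a non-local-means filter, as sketched below; the filter and its parameters are assumptions, not requirements of the patent.

```python
import cv2

def denoise_hdr(hdr_rgb):
    """Apply a 2D spatial denoiser (non-local means) to the HDR result;
    the strength and window parameters are illustrative."""
    return cv2.fastNlMeansDenoisingColored(hdr_rgb, None, 5, 5, 7, 21)
```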
In some embodiments, the processor 38 may combine the processing manners of steps S912 and S914 to individually select an appropriate processing manner for the plurality of defect areas in the reference image, so as to maximize the details of the reference image, thereby improving the image quality of the captured image.
In summary, the dual-sensor image capturing system and the image capturing method thereof according to the present invention independently configure the color sensor and the infrared sensor, acquire a plurality of images using a plurality of exposure conditions suitable for the current imaging scene, and select a color image and an infrared image with appropriate exposure conditions for fusion, so that the infrared image fills in or increases the texture details lacking in the color image, thereby generating a scene image with the details of the imaging scene and improving the image quality of the captured image.
While the present disclosure has been described with reference to the exemplary embodiments, it should be understood that the invention is not limited thereto, but may be modified or altered somewhat by persons skilled in the art without departing from the spirit and scope of the present disclosure.

Claims (18)

1. A dual sensor camera system comprising:
at least one color sensor;
at least one infrared sensor;
a storage device storing a computer program; and
a processor, coupled to the at least one color sensor, the at least one infrared sensor, and the storage device, configured to load and execute the computer program to:
identifying an imaging scene of the dual-sensor imaging system;
controlling the at least one color sensor and the at least one infrared sensor to acquire a plurality of color images and a plurality of infrared images respectively by adopting a plurality of exposure conditions suitable for the shooting scene;
adaptively selecting a combination of the color images and the infrared images that reveals details of the camera scene; and
fusing the selected color image and the infrared image to generate a scene image having the details of the camera scene, wherein the processor comprises:
selecting one of the color images as a reference image according to the color details of each color image;
identifying at least one defective area in the reference image lacking texture detail; and
and selecting one of the infrared images according to the texture details of the image corresponding to the at least one defect area in each infrared image, and fusing the infrared image with the reference image.
2. The dual sensor camera system of claim 1, wherein said processor comprises:
at least one of the at least one color sensor and the at least one infrared sensor is controlled to acquire at least one standard image of the camera scene using standard exposure conditions, and the scene is identified using the at least one standard image.
3. The dual sensor camera system of claim 1, wherein said processor comprises:
selecting the color image with the greatest color details as the reference image; and
the infrared image with the most texture details corresponding to the image of the at least one defect area is selected and used for fusion with the reference image.
4. The dual sensor camera system of claim 1, wherein said processor comprises:
replacing a luminance component of an image of the at least one defective area in the reference image with a luminance component of an image of the infrared image corresponding to the at least one defective area to generate the scene image complementing the texture detail of the at least one defective area.
5. The dual sensor camera system of claim 1, wherein said processor comprises:
image fusion is performed on the selected infrared image and the reference image to generate the scene image complementing the texture details of the defective area.
6. The dual sensor camera system of claim 1, wherein said processor further comprises:
judging whether each infrared image comprises the texture details of the at least one defect area; and
and when none of the infrared images includes the texture details, controlling the at least one color sensor to acquire a plurality of color images and perform high dynamic range processing by adopting a plurality of exposure times longer or shorter than the exposure time of the reference image so as to generate the scene image with the texture details of the at least one defect area.
7. The dual sensor camera system of claim 1, wherein said processor comprises:
converting the color space of the selected color image from an RGB color space to a YUV color space;
replacing the brightness component of the image of the at least one defective area of the converted color image with the brightness component of the image of the selected infrared image corresponding to the at least one defective area; and
and converting the color space of the replaced color image back to the RGB color space to generate the scene image.
8. The dual sensor camera system of claim 1, wherein said processor comprises:
converting the color space of the selected color image from an RGB color space to a YUV color space;
replacing the brightness component of the converted color image with the brightness component of the selected infrared image; and
and converting the color space of the replaced color image back to the RGB color space to generate the scene image.
9. The dual sensor camera system of claim 1, wherein said processor comprises:
and acquiring a plurality of characteristics in the selected color image and the selected infrared image, and aligning the color image and the infrared image according to the acquired corresponding relation between the characteristics.
10. An image capturing method of a dual sensor image capturing system including at least one color sensor, at least one infrared sensor, and a processor, the method comprising the steps of:
identifying an imaging scene of the dual-sensor imaging system;
controlling the at least one color sensor and the at least one infrared sensor to acquire a plurality of color images and a plurality of infrared images respectively by adopting a plurality of exposure conditions suitable for the shooting scene;
adaptively selecting a combination of the color images and the infrared images that reveals details of the camera scene; and
fusing the selected color image and the infrared image to generate a scene image having the details of the imaging scene, comprising:
selecting one of the color images as a reference image according to the color details of each color image;
identifying at least one defective area in the reference image lacking texture detail; and
and selecting one of the infrared images according to the texture details of the image corresponding to the at least one defect area in each infrared image.
11. The method of claim 10, wherein identifying the camera scene of the dual sensor camera system comprises:
at least one of the at least one color sensor and the at least one infrared sensor is controlled to acquire at least one standard image of the camera scene using standard exposure conditions, and the scene is identified using the at least one standard image.
12. The method of claim 10, wherein adaptively selecting the combination of the color image and the infrared image that reveals details of the camera scene comprises:
selecting the color image with the greatest color details as the reference image; and
the infrared image with the most texture details corresponding to the image of the at least one defect area is selected and used for fusion with the reference image.
13. The method of claim 10, wherein fusing the selected color image and the infrared image to generate the scene image with the details of the camera scene comprises:
replacing a luminance component of an image of the at least one defective area in the reference image with a luminance component of an image of the infrared image corresponding to the at least one defective area to generate the scene image complementing the texture detail of the at least one defective area.
14. The method of claim 10, wherein fusing the selected color image and the infrared image to generate the scene image with the details of the camera scene comprises:
image fusion is performed on the selected infrared image and the reference image to generate the scene image complementing the texture details of the defective area.
15. The method of claim 10, wherein prior to the step of fusing the selected color image and the infrared image to generate a scene image with the details of the camera scene, the method further comprises:
judging whether each infrared image comprises the texture details of the at least one defect area; and
and when none of the infrared images includes the texture details, controlling the at least one color sensor to acquire a plurality of color images and perform high dynamic range processing by adopting a plurality of exposure times longer or shorter than the exposure time of the reference image so as to generate the scene image with the texture details of the at least one defect area.
16. The method of claim 10, wherein fusing the selected color image and the infrared image to generate the scene image with the details of the camera scene comprises:
converting the color space of the selected color image from an RGB color space to a YUV color space;
replacing the brightness component of the image of the at least one defective area of the converted color image with the brightness component of the image of the selected infrared image corresponding to the at least one defective area; and
and converting the color space of the replaced color image back to the RGB color space to generate the scene image.
17. The method of claim 10, wherein fusing the selected color image and the infrared image to generate the scene image with the details of the camera scene comprises:
converting the color space of the selected color image from an RGB color space to a YUV color space;
replacing the brightness component of the converted color image with the brightness component of the selected infrared image; and
and converting the color space of the replaced color image back to the RGB color space to generate the scene image.
18. The method of claim 10, wherein prior to the step of fusing the selected color image and the infrared image to generate a scene image with the details of the camera scene, the method further comprises:
and acquiring a plurality of characteristics in the selected color image and the selected infrared image, and aligning the color image and the infrared image according to the acquired corresponding relation between the characteristics.
CN202011540274.1A 2020-09-04 2020-12-23 Dual-sensor imaging system and imaging method thereof Active CN114143443B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US202063074477P 2020-09-04 2020-09-04
US63/074,477 2020-09-04

Publications (2)

Publication Number Publication Date
CN114143443A (en) 2022-03-04
CN114143443B (en) 2024-04-05

Family

Family ID: 80438521

Family Applications (5)

Application Number Title Priority Date Filing Date
CN202011541300.2A Active CN114143418B (en) 2020-09-04 2020-12-23 Dual-sensor imaging system and imaging method thereof
CN202011540274.1A Active CN114143443B (en) 2020-09-04 2020-12-23 Dual-sensor imaging system and imaging method thereof
CN202011625515.2A Pending CN114143420A (en) 2020-09-04 2020-12-30 Double-sensor camera system and privacy protection camera method thereof
CN202011622478.XA Active CN114143419B (en) 2020-09-04 2020-12-30 Dual-sensor camera system and depth map calculation method thereof
CN202011625552.3A Active CN114143421B (en) 2020-09-04 2020-12-30 Dual-sensor camera system and calibration method thereof

Family Applications Before (1)

Application Number Title Priority Date Filing Date
CN202011541300.2A Active CN114143418B (en) 2020-09-04 2020-12-23 Dual-sensor imaging system and imaging method thereof

Family Applications After (3)

Application Number Title Priority Date Filing Date
CN202011625515.2A Pending CN114143420A (en) 2020-09-04 2020-12-30 Double-sensor camera system and privacy protection camera method thereof
CN202011622478.XA Active CN114143419B (en) 2020-09-04 2020-12-30 Dual-sensor camera system and depth map calculation method thereof
CN202011625552.3A Active CN114143421B (en) 2020-09-04 2020-12-30 Dual-sensor camera system and calibration method thereof

Country Status (2)

Country Link
CN (5) CN114143418B (en)
TW (5) TWI767468B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116091341B (en) * 2022-12-15 2024-04-02 南京信息工程大学 Exposure difference enhancement method and device for low-light image

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108965654A (en) * 2018-02-11 2018-12-07 浙江宇视科技有限公司 Double spectrum camera systems and image processing method based on single-sensor
CN109474770A (en) * 2017-09-07 2019-03-15 华为技术有限公司 A kind of imaging device and imaging method
CN110136183A (en) * 2018-02-09 2019-08-16 华为技术有限公司 A kind of method and relevant device of image procossing
CN111586314A (en) * 2020-05-25 2020-08-25 浙江大华技术股份有限公司 Image fusion method and device and computer storage medium
WO2020168465A1 (en) * 2019-02-19 2020-08-27 华为技术有限公司 Image processing device and method

Family Cites Families (65)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004246252A (en) * 2003-02-17 2004-09-02 Takenaka Komuten Co Ltd Apparatus and method for collecting image information
JP2005091434A (en) * 2003-09-12 2005-04-07 Noritsu Koki Co Ltd Position adjusting method and image reader with damage compensation function using the same
JP4244018B2 (en) * 2004-03-25 2009-03-25 ノーリツ鋼機株式会社 Defective pixel correction method, program, and defective pixel correction system for implementing the method
JP4341680B2 (en) * 2007-01-22 2009-10-07 セイコーエプソン株式会社 projector
US9307212B2 (en) * 2007-03-05 2016-04-05 Fotonation Limited Tone mapping for low-light video frame enhancement
EP2289235A4 (en) * 2008-05-20 2011-12-28 Pelican Imaging Corp Capturing and processing of images using monolithic camera array with hetergeneous imagers
US8866920B2 (en) * 2008-05-20 2014-10-21 Pelican Imaging Corporation Capturing and processing of images using monolithic camera array with heterogeneous imagers
CN101404060B (en) * 2008-11-10 2010-06-30 北京航空航天大学 Human face recognition method based on visible light and near-infrared Gabor information amalgamation
US8749635B2 (en) * 2009-06-03 2014-06-10 Flir Systems, Inc. Infrared camera systems and methods for dual sensor applications
US8681216B2 (en) * 2009-03-12 2014-03-25 Hewlett-Packard Development Company, L.P. Depth-sensing camera system
US20120154596A1 (en) * 2009-08-25 2012-06-21 Andrew Augustine Wajs Reducing noise in a color image
US8478123B2 (en) * 2011-01-25 2013-07-02 Aptina Imaging Corporation Imaging devices having arrays of image sensors and lenses with multiple aperture sizes
JP2013115679A (en) * 2011-11-30 2013-06-10 Fujitsu General Ltd Imaging apparatus
US10848731B2 (en) * 2012-02-24 2020-11-24 Matterport, Inc. Capturing and aligning panoramic image and depth data
TW201401186A (en) * 2012-06-25 2014-01-01 Psp Security Co Ltd System and method for identifying human face
CN104662897A (en) * 2012-09-25 2015-05-27 日本电信电话株式会社 Image encoding method, image decoding method, image encoding device, image decoding device, image encoding program, image decoding program, and recording medium
KR102086509B1 (en) * 2012-11-23 2020-03-09 엘지전자 주식회사 Apparatus and method for obtaining 3d image
WO2014100787A1 (en) * 2012-12-21 2014-06-26 Flir Systems, Inc. Compact multi-spectrum imaging with fusion
TWM458748U (en) * 2012-12-26 2013-08-01 Chunghwa Telecom Co Ltd Image type depth information retrieval device
JP6055681B2 (en) * 2013-01-10 2016-12-27 株式会社 日立産業制御ソリューションズ Imaging device
CN104661008B (en) * 2013-11-18 2017-10-31 深圳中兴力维技术有限公司 The treating method and apparatus that color image quality is lifted under low light conditions
CN104021548A (en) * 2014-05-16 2014-09-03 中国科学院西安光学精密机械研究所 Method for acquiring 4D scene information
US9516295B2 (en) * 2014-06-30 2016-12-06 Aquifi, Inc. Systems and methods for multi-channel imaging based on multiple exposure settings
JP6450107B2 (en) * 2014-08-05 2019-01-09 キヤノン株式会社 Image processing apparatus, image processing method, program, and storage medium
WO2016093070A1 (en) * 2014-12-10 2016-06-16 ソニー株式会社 Image capture device, image capture method, program, and image processing device
CN107431760B (en) * 2015-03-31 2018-08-28 富士胶片株式会社 The image processing method and storage medium of photographic device, photographic device
WO2016192437A1 (en) * 2015-06-05 2016-12-08 深圳奥比中光科技有限公司 3d image capturing apparatus and capturing method, and 3d image system
JP2017011634A (en) * 2015-06-26 2017-01-12 キヤノン株式会社 Imaging device, control method for the same and program
CN105049829B (en) * 2015-07-10 2018-12-25 上海图漾信息科技有限公司 Optical filter, imaging sensor, imaging device and 3-D imaging system
CN105069768B (en) * 2015-08-05 2017-12-29 武汉高德红外股份有限公司 A kind of visible images and infrared image fusion processing system and fusion method
US10523855B2 (en) * 2015-09-24 2019-12-31 Intel Corporation Infrared and visible light dual sensor imaging system
TW201721269A (en) * 2015-12-11 2017-06-16 宏碁股份有限公司 Automatic exposure system and auto exposure method thereof
JP2017112401A (en) * 2015-12-14 2017-06-22 ソニー株式会社 Imaging device, apparatus and method for image processing, and program
CN206117865U (en) * 2016-01-16 2017-04-19 上海图漾信息科技有限公司 Range data monitoring device
JP2017163297A (en) * 2016-03-09 2017-09-14 キヤノン株式会社 Imaging apparatus
KR101747603B1 (en) * 2016-05-11 2017-06-16 재단법인 다차원 스마트 아이티 융합시스템 연구단 Color night vision system and operation method thereof
CN106815826A (en) * 2016-12-27 2017-06-09 上海交通大学 Night vision image Color Fusion based on scene Recognition
CN108280807A (en) * 2017-01-05 2018-07-13 浙江舜宇智能光学技术有限公司 Monocular depth image collecting device and system and its image processing method
ES2747387B1 (en) * 2017-02-06 2021-07-27 Photonic Sensors & Algorithms S L DEVICE AND METHOD TO OBTAIN DEPTH INFORMATION FROM A SCENE.
CN111988587B (en) * 2017-02-10 2023-02-07 杭州海康威视数字技术股份有限公司 Image fusion apparatus and image fusion method
CN109712102B (en) * 2017-10-25 2020-11-27 杭州海康威视数字技术股份有限公司 Image fusion method and device and image acquisition equipment
CN107846537B (en) * 2017-11-08 2019-11-26 维沃移动通信有限公司 A kind of CCD camera assembly, image acquiring method and mobile terminal
CN112788249B (en) * 2017-12-20 2022-12-06 杭州海康威视数字技术股份有限公司 Image fusion method and device, electronic equipment and computer readable storage medium
US10748247B2 (en) * 2017-12-26 2020-08-18 Facebook, Inc. Computing high-resolution depth images using machine learning techniques
US10757320B2 (en) * 2017-12-28 2020-08-25 Waymo Llc Multiple operating modes to expand dynamic range
TWI661726B (en) * 2018-01-09 2019-06-01 呂官諭 Image sensor with enhanced image recognition and application
CN110572583A (en) * 2018-05-18 2019-12-13 杭州海康威视数字技术股份有限公司 method for shooting image and camera
CN108961195B (en) * 2018-06-06 2021-03-23 Oppo广东移动通信有限公司 Image processing method and device, image acquisition device, readable storage medium and computer equipment
JP6574878B2 (en) * 2018-07-19 2019-09-11 キヤノン株式会社 Image processing apparatus, image processing method, imaging apparatus, program, and storage medium
JP7254461B2 (en) * 2018-08-01 2023-04-10 キヤノン株式会社 IMAGING DEVICE, CONTROL METHOD, RECORDING MEDIUM, AND INFORMATION PROCESSING DEVICE
CN109035193A (en) * 2018-08-29 2018-12-18 成都臻识科技发展有限公司 A kind of image processing method and imaging processing system based on binocular solid camera
CN112602316B (en) * 2018-09-14 2022-06-24 浙江宇视科技有限公司 Automatic exposure method and device for double-light image, double-light image camera and machine storage medium
JP2020052001A (en) * 2018-09-28 2020-04-02 パナソニックIpマネジメント株式会社 Depth acquisition device, depth acquisition method, and program
US11176694B2 (en) * 2018-10-19 2021-11-16 Samsung Electronics Co., Ltd Method and apparatus for active depth sensing and calibration method thereof
CN109636732B (en) * 2018-10-24 2023-06-23 深圳先进技术研究院 Hole repairing method of depth image and image processing device
CN110248105B (en) * 2018-12-10 2020-12-08 浙江大华技术股份有限公司 Image processing method, camera and computer storage medium
US11120536B2 (en) * 2018-12-12 2021-09-14 Samsung Electronics Co., Ltd Apparatus and method for determining image sharpness
US10972649B2 (en) * 2019-02-27 2021-04-06 X Development Llc Infrared and visible imaging system for device identification and tracking
JP7316809B2 (en) * 2019-03-11 2023-07-28 キヤノン株式会社 Image processing device, image processing device control method, system, and program
CN110349117B (en) * 2019-06-28 2023-02-28 重庆工商大学 Infrared image and visible light image fusion method and device and storage medium
CN110706178B (en) * 2019-09-30 2023-01-06 杭州海康威视数字技术股份有限公司 Image fusion device, method, equipment and storage medium
CN111524175A (en) * 2020-04-16 2020-08-11 东莞市东全智能科技有限公司 Depth reconstruction and eye movement tracking method and system for asymmetric multiple cameras
CN111540003A (en) * 2020-04-27 2020-08-14 浙江光珀智能科技有限公司 Depth image generation method and device
CN111383206B (en) * 2020-06-01 2020-09-29 浙江大华技术股份有限公司 Image processing method and device, electronic equipment and storage medium
IN202021032940A (en) * 2020-07-31 2020-08-28 .Us Priyadarsan

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109474770A (en) * 2017-09-07 2019-03-15 华为技术有限公司 A kind of imaging device and imaging method
CN110136183A (en) * 2018-02-09 2019-08-16 华为技术有限公司 A kind of method and relevant device of image procossing
CN108965654A (en) * 2018-02-11 2018-12-07 浙江宇视科技有限公司 Double spectrum camera systems and image processing method based on single-sensor
WO2020168465A1 (en) * 2019-02-19 2020-08-27 华为技术有限公司 Image processing device and method
CN111586314A (en) * 2020-05-25 2020-08-25 浙江大华技术股份有限公司 Image fusion method and device and computer storage medium

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Block-matching-based fusion method for low-light image pairs; 王光霞; 冯华君; 徐之海; 李奇; 陈跃庭; 光子学报 (Acta Photonica Sinica), No. 04; full text *

Also Published As

Publication number Publication date
TWI797528B (en) 2023-04-01
CN114143443A (en) 2022-03-04
TWI764484B (en) 2022-05-11
CN114143418A (en) 2022-03-04
TW202211160A (en) 2022-03-16
TWI778476B (en) 2022-09-21
CN114143421A (en) 2022-03-04
CN114143419A (en) 2022-03-04
TWI767468B (en) 2022-06-11
TWI767484B (en) 2022-06-11
CN114143418B (en) 2023-12-01
CN114143421B (en) 2024-04-05
CN114143420A (en) 2022-03-04
TW202211165A (en) 2022-03-16
TW202211674A (en) 2022-03-16
CN114143419B (en) 2023-12-26
TW202211673A (en) 2022-03-16
TW202211161A (en) 2022-03-16

Similar Documents

Publication Publication Date Title
US10574961B2 (en) Image processing apparatus and image processing method thereof
JP5739001B2 (en) Target area extraction
US8803994B2 (en) Adaptive spatial sampling using an imaging assembly having a tunable spectral response
US8614751B2 (en) Image processing apparatus and image processing method
US10325354B2 (en) Depth assisted auto white balance
US9036046B2 (en) Image processing apparatus and method with white balance correction
US11689822B2 (en) Dual sensor imaging system and privacy protection imaging method thereof
US20200228770A1 (en) Lens rolloff assisted auto white balance
US11496694B2 (en) Dual sensor imaging system and imaging method thereof
CN114143443B (en) Dual-sensor imaging system and imaging method thereof
WO2019111659A1 (en) Image processing device, imaging device, image processing method, and program
US20200228769A1 (en) Lens rolloff assisted auto white balance
US7397968B2 (en) System and method for tone composition
US11496660B2 (en) Dual sensor imaging system and depth map calculation method thereof
Brown Color processing for digital cameras
CN109447925B (en) Image processing method and device, storage medium and electronic equipment
Wueller Low light performance of digital still cameras
US11568526B2 (en) Dual sensor imaging system and imaging method thereof
US20230017498A1 (en) Flexible region of interest color processing for cameras
JP2017034536A (en) Image processing apparatus, image processing method, and program
JP2020205631A (en) Image processing device and image processing program
Koskiranta Improving Automatic Imaging Algorithms with Dual Camera System
CN117581557A (en) Flexible region of interest color processing for cameras
JP2014130442A (en) Image processing device, imaging device, and image processing program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant