CN114143443A - Dual-sensor camera system and camera method thereof - Google Patents

Dual-sensor camera system and camera method thereof

Info

Publication number
CN114143443A
Authority
CN
China
Prior art keywords
image
color
infrared
scene
sensor
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202011540274.1A
Other languages
Chinese (zh)
Other versions
CN114143443B (en)
Inventor
彭诗渊
郑书峻
黄旭鍊
李运锦
赖国铭
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Altek Semiconductor Corp
Original Assignee
Altek Semiconductor Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Altek Semiconductor Corp filed Critical Altek Semiconductor Corp
Publication of CN114143443A
Application granted
Publication of CN114143443B
Active legal status
Anticipated expiration

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/45Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from two or more image sensors being of different type or operating in different modes, e.g. with a CMOS sensor for moving images in combination with a charge-coupled device [CCD] for still images
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/50Depth or shape recovery
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/64Computer-aided capture of images, e.g. transfer from script file into camera, check of taken image quality, advice or proposal for image composition or decision on when to take image
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70Circuitry for compensating brightness variation in the scene
    • H04N23/743Bracketing, i.e. taking a series of images with varying exposure conditions

Abstract

The invention provides a dual-sensor camera system and an imaging method thereof. The dual-sensor camera system includes at least one color sensor, at least one infrared sensor, a storage device, and a processor. The processor is configured to load and execute a computer program stored in the storage device to: identify an imaging scene of the dual-sensor camera system; control the color sensor and the infrared sensor to acquire a plurality of color images and a plurality of infrared images, respectively, under a plurality of exposure conditions suitable for the imaging scene; adaptively select a combination of a color image and an infrared image that reveals the details of the imaging scene; and fuse the selected color image and infrared image to generate a scene image having the details of the imaging scene.

Description

Dual-sensor camera system and camera method thereof
Technical Field
The present disclosure relates to a camera system and a method thereof, and more particularly, to a dual sensor camera system and a camera method thereof.
Background
The exposure conditions of a camera (including aperture, shutter, and brightness settings) affect the quality of the captured image, so many cameras automatically adjust the exposure conditions while capturing images in order to obtain clear and bright images. However, in high-contrast scenes such as low-light or backlit scenes, the exposure conditions chosen by the camera may result in excessive noise or overexposure in some regions, so that the image quality of all regions cannot be maintained at the same time.
In view of the above, a new image sensor architecture has been adopted in the prior art, which exploits the high light sensitivity of infrared (IR) pixels by interleaving them among the color pixels of the image sensor to assist brightness detection. For example, fig. 1 is a schematic diagram of a prior-art image sensor acquiring an image. Referring to fig. 1, in the conventional image sensor 10, infrared (I) pixels are interposed among the red (R), green (G), and blue (B) color pixels. The image sensor 10 can thus combine the color information 12 obtained by the R, G, and B color pixels with the luminance information 14 obtained by the I pixels to obtain an image 16 with moderate color and luminance.
However, in this single-image-sensor architecture, every pixel in the image sensor shares the same exposure condition, so only an exposure condition suited to either the color pixels or the infrared pixels can be selected when acquiring an image. As a result, the characteristics of the two kinds of pixels cannot both be exploited to improve the image quality of the acquired image.
Disclosure of Invention
The invention provides a dual-sensor camera system and an imaging method thereof, which use independently configured color and infrared sensors to acquire a plurality of images under different exposure conditions, and select a color image and an infrared image with appropriate exposure conditions to be fused into a result image, so that the texture details of the color image can be complemented and the image quality of the captured image can be improved.
The dual-sensor camera system of the present invention includes at least one color sensor, at least one infrared sensor, a storage device, and a processor coupled to the color sensor, the infrared sensor, and the storage device. The processor is configured to load and execute a computer program stored in the storage device to: identify an imaging scene of the dual-sensor camera system; control the color sensor and the infrared sensor to acquire a plurality of color images and a plurality of infrared images, respectively, under a plurality of exposure conditions suitable for the imaging scene; adaptively select a combination of a color image and an infrared image that reveals the details of the imaging scene; and fuse the selected color image and infrared image to generate a scene image having the details of the imaging scene.
The imaging method of the dual-sensor camera system of the present invention is applicable to a dual-sensor camera system including at least one color sensor, at least one infrared sensor, and a processor. The method includes the following steps: identifying an imaging scene of the dual-sensor camera system; controlling the color sensor and the infrared sensor to acquire a plurality of color images and a plurality of infrared images, respectively, under a plurality of exposure conditions suitable for the imaging scene; adaptively selecting a combination of a color image and an infrared image that reveals the details of the imaging scene; and fusing the selected color image and infrared image to generate a scene image having the details of the imaging scene.
Based on the above, the dual-sensor camera system and the imaging method thereof of the present invention acquire a plurality of images with the independently configured color sensor and infrared sensor, using different exposure conditions suitable for the current imaging scene, and select a combination of a color image and an infrared image that reveals the details of the imaging scene for fusion, thereby generating a scene image having those details and improving the image quality of the captured image.
In order to make the disclosure more comprehensible, embodiments accompanied with figures are described in detail below.
Drawings
FIG. 1 is a schematic diagram of a prior-art image sensor capturing an image;
FIG. 2 is a schematic diagram illustrating the acquisition of an image using an image sensor in accordance with an embodiment of the invention;
FIG. 3 is a block diagram of a dual-sensor camera system according to an embodiment of the invention;
FIG. 4 is a flowchart of an imaging method of a dual-sensor imaging system according to an embodiment of the invention;
FIG. 5 is a flowchart of an imaging method of a dual-sensor imaging system according to an embodiment of the invention;
FIG. 6 is an example of an imaging method of a dual-sensor imaging system according to an embodiment of the invention;
FIG. 7 is a flowchart of an imaging method of a dual-sensor imaging system according to an embodiment of the invention;
FIG. 8 is an example of an imaging method of a dual-sensor imaging system according to an embodiment of the invention;
FIG. 9 is a flowchart of an imaging method of a dual-sensor imaging system according to an embodiment of the invention.
Description of the symbols
10. 20: image sensor with a plurality of pixels
12: color information
14: luminance information
16: image of a person
22: color sensor
22a, 62, 82: color image
24: infrared sensor
24a, 64, 84: infrared image
26. 66, 86: scene image
30: dual-sensor camera system
32: color sensor
34: infrared sensor
36: storage device
38: processor with a memory having a plurality of memory cells
62 a: face region
82a, 84a, 86 a: cola can area
R, G, B, I: pixel
S402 to S408, S502 to S510, S702 to S708, S902 to S914: step (ii) of
Detailed Description
Fig. 2 is a schematic diagram illustrating the acquisition of an image using an image sensor according to an embodiment of the present invention. Referring to fig. 2, the image sensor 20 of the embodiment adopts a dual-sensor architecture in which a color sensor 22 and an infrared (IR) sensor 24 are independently configured. Using the respective characteristics of the color sensor 22 and the infrared sensor 24, a plurality of images are acquired under a plurality of exposure conditions suitable for the current imaging scene, and a color image 22a and an infrared image 24a with appropriate exposure conditions are selected. The infrared image 24a is then used, by way of image fusion, to complement the texture details lacking in the color image 22a, so as to obtain a scene image 26 with good color and texture details.
Fig. 3 is a block diagram of a dual-sensor camera system according to an embodiment of the invention. Referring to fig. 3, the dual-sensor camera system 30 of the present embodiment can be configured in an electronic device, such as a mobile phone, a tablet computer, a notebook computer, a navigation device, a driving recorder, a digital camera, a digital video camera, etc., for providing a camera function. The dual sensor camera system 30 includes at least one color sensor 32, at least one infrared sensor 34, a storage device 36, and a processor 38, and functions as follows:
the color sensor 32 may include, for example, a Charge Coupled Device (CCD), a Complementary Metal-Oxide Semiconductor (CMOS) Device, or other types of photosensitive devices, and may sense the light intensity to generate an image of the photographic scene. The color sensor 32 is, for example, a red-green-blue (RGB) image sensor, which includes red (R), green (G), and blue (B) color pixels, and is configured to acquire color information of red light, green light, and blue light in the imaging scene and combine the color information to generate a color image of the imaging scene.
The infrared sensor 34 includes, for example, a CCD, a CMOS device, or other kinds of light-sensing devices, which are capable of sensing infrared light by adjusting a wavelength sensing range of the light-sensing devices. The infrared sensor 34 acquires infrared light information in an imaging scene with the above-described light-sensing devices as pixels, for example, and synthesizes the infrared light information to generate an infrared image of the imaging scene.
The storage device 36 is, for example, any type of fixed or removable Random Access Memory (RAM), Read-Only Memory (ROM), Flash Memory (Flash Memory), hard disk, or the like, or a combination thereof, and is used for storing computer programs executable by the processor 38. In some embodiments, storage device 36 may also store, for example, color images acquired by color sensor 32 and infrared images acquired by infrared sensor 34.
The processor 38 is, for example, a central processing unit (CPU), or another programmable general-purpose or special-purpose microprocessor, microcontroller, digital signal processor (DSP), programmable controller, application-specific integrated circuit (ASIC), programmable logic device (PLD), or other similar device, or a combination thereof, and the invention is not limited thereto. In the present embodiment, the processor 38 may load a computer program from the storage device 36 to execute the image capturing method of the dual-sensor image capturing system of the embodiment of the present invention.
Fig. 4 is a flowchart illustrating an image capturing method of a dual sensor image capturing system according to an embodiment of the present invention. Referring to fig. 3 and fig. 4, the method of the present embodiment is applied to the dual-sensor camera system 30, and the detailed steps of the camera method of the present embodiment will be described below with reference to the components of the dual-sensor camera system 30.
In step S402, an imaging scene of the dual-sensor camera system 30 is identified by the processor 38. In some embodiments, the processor 38, for example, controls at least one of the color sensor 32 and the infrared sensor 34 to acquire at least one standard image of the imaging scene using standard exposure conditions, and identifies the imaging scene using these standard images. The standard exposure conditions include, for example, parameters such as aperture, shutter, and brightness determined by existing photometric techniques, and the processor 38 identifies the imaging scene, including its position (indoor or outdoor), light source (high or low light), contrast (high or low contrast), type (object or portrait), or state (dynamic or static), according to the strength or distribution of image parameters such as hue, value (lightness), chroma, and white balance of the image obtained under these exposure conditions. In other embodiments, the processor 38 may also use a positioning method to identify the imaging scene, or directly receive a user operation to set the imaging scene, which is not limited herein.
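As a non-limiting illustration of step S402, the following Python sketch classifies an imaging scene from a single standard-exposure image using only its mean brightness and contrast; the function name, thresholds, and labels are illustrative assumptions and are not prescribed by this disclosure.

```python
import cv2
import numpy as np

def identify_scene(standard_image_bgr):
    """Classify the imaging scene from one standard-exposure image.

    Minimal sketch: only the light level and contrast are estimated;
    the thresholds and labels below are illustrative assumptions.
    """
    gray = cv2.cvtColor(standard_image_bgr, cv2.COLOR_BGR2GRAY)
    mean_luma = float(gray.mean())   # overall light level
    contrast = float(gray.std())     # rough contrast measure

    light = "low_light" if mean_luma < 60 else "normal_light"
    dynamic_range = "high_contrast" if contrast > 60 else "low_contrast"
    return {"light_source": light, "contrast": dynamic_range}
```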
In step S404, the processor 38 controls the color sensor 32 and the infrared sensor 34 to acquire a plurality of color images and a plurality of infrared images, respectively, under a plurality of exposure conditions suitable for the identified imaging scene. In some embodiments, the processor 38 controls the color sensor 32 and the infrared sensor 34 to acquire images with shorter or longer exposure times based on the exposure time in the standard exposure condition, and the difference between the exposure times of these images is, for example, any value between -3 and 3 exposure values (EV), which is not limited herein. For example, if image A is twice as bright as image B, the EV of image B may be increased by 1, and so on; the exposure value may also be a fractional value (e.g., +0.3 EV), which is not limited herein.
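As an illustrative sketch of how exposure conditions may be bracketed around the standard exposure in step S404, assuming that one EV step doubles or halves the exposure time, the helper below derives a set of exposure times; the chosen EV offsets are an assumed example within the -3 EV to +3 EV range mentioned above.

```python
def bracketed_exposure_times(base_exposure_s, ev_offsets=(-2, -1, 0, 1, 2)):
    """Derive exposure times around the standard exposure.

    Sketch only: one EV step doubles or halves the exposure time
    (t = t0 * 2**EV); the offsets are an assumed example.
    """
    return [base_exposure_s * (2.0 ** ev) for ev in ev_offsets]

# e.g. bracketed_exposure_times(1/100) -> [1/400, 1/200, 1/100, 1/50, 1/25]
```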
In step S406, a combination of a color image and an infrared image that reveals the details of the imaging scene is adaptively selected by the processor 38. In some embodiments, the processor 38 may, for example, control the color sensor 32 to acquire color images with an appropriate exposure time, so that at least some of the color details of the imaging scene are preserved and the subsequently fused image can reveal those color details. The appropriate exposure time is, for example, shorter by a predetermined length of time than the exposure time that would cause overexposure of the acquired image, and the predetermined length of time is, for example, any value from 0.01 to 1 second, which is not limited thereto.
In some embodiments, the processor 38 may select one of the color images as a reference image according to color details of the color images, identify at least one defective area in the reference image lacking texture details, and select one of the infrared images as a fused image with the reference image according to texture details of images corresponding to the defective area in the infrared images.
In detail, since the color sensor 32 can only acquire a color image under a single exposure condition at a time, in a low-light or high-contrast imaging scene each color image may contain regions with high noise, overexposure, or underexposure (i.e., the above-mentioned defective regions). In this case, the processor 38 can exploit the high light sensitivity of the infrared sensor 34 and, for each defective region, select from the previously acquired infrared images an infrared image that contains the texture details of that region, so as to complement the texture details of the defective region in the color image.
In step S408, the selected color image and infrared image are fused by the processor 38 to generate a scene image having the details of the imaging scene. In some embodiments, the processor 38 directly fuses the entire selected color image and infrared image, for example by calculating an average or weighted average of the pixel values of corresponding pixels in the two images, or by other image fusion methods. In some embodiments, the processor 38 may instead, for each defective region in the color image, use the image of the corresponding region in the infrared image to fill in or replace the image of that defective region, which is not limited herein.
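The following sketch illustrates the whole-image weighted-average fusion mentioned above, assuming the infrared image has already been aligned to the color image and that a single global weight is used; the weight value is an illustrative assumption, and per-region fusion is not shown here.

```python
import numpy as np

def fuse_weighted(color_rgb, ir_gray, weight_ir=0.4):
    """Fuse a color image with an aligned infrared image by weighted averaging.

    Sketch under simple assumptions: both images have the same size,
    the IR image is single channel, and one global weight is used.
    """
    color = color_rgb.astype(np.float32)
    ir = ir_gray.astype(np.float32)[..., None]   # broadcast over the RGB channels
    fused = (1.0 - weight_ir) * color + weight_ir * ir
    return np.clip(fused, 0, 255).astype(np.uint8)
```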
By the above method, the dual-sensor camera system 30 can select a color image with better color details, and can fill or replace an area with insufficient texture details in the color image with an image of a corresponding area in the infrared image, so as to finally generate an image including all details (color and texture details) of the camera scene, thereby improving the image quality of the camera image.
Fig. 5 is a flowchart illustrating an image capturing method of a dual sensor image capturing system according to an embodiment of the present invention. Referring to fig. 3 and fig. 5, this embodiment further illustrates a detailed implementation of the embodiment for fusing the whole image. The method of the present embodiment is applied to the dual-sensor imaging system 30, and the detailed steps of the imaging method of the present embodiment will be described below with reference to the components of the dual-sensor imaging system 30.
In step S502, one of the color images is selected as a reference image by the processor 38 according to the color details of the respective color images. In one embodiment, the processor 38 selects, for example, the color image with the most color detail as the reference image. The amount of color detail can be determined, for example, by the size of the overexposed or underexposed areas in the color image. In detail, the pixels in an overexposed region are close to white and the pixels in an underexposed region are close to black, so these regions contain little color detail. Therefore, a color image that contains more such regions has less color detail, and the processor 38 can accordingly determine which color image has the most color detail and use it as the reference image. In other embodiments, the processor 38 can also judge the amount of color detail according to the contrast, saturation, or other image parameters of each color image, which is not limited herein.
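A minimal sketch of this reference-image selection is shown below, assuming color detail is scored simply as the fraction of pixels that are neither near-black nor near-white; the thresholds are illustrative assumptions rather than values from this disclosure.

```python
import cv2
import numpy as np

def pick_reference(color_images_bgr, low=16, high=239):
    """Pick the color image with the most color detail as the reference.

    Sketch: color detail is scored as the fraction of pixels that are
    neither under-exposed (near black) nor over-exposed (near white).
    """
    def detail_score(img):
        gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
        usable = np.logical_and(gray > low, gray < high)
        return usable.mean()

    scores = [detail_score(img) for img in color_images_bgr]
    return int(np.argmax(scores)), scores
```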
In step S504, at least one defective area in the reference image lacking texture detail is identified by the processor 38. The defect area is, for example, the above-mentioned overexposed area or underexposed area, or an area with higher noise obtained under a low light source, which is not limited herein.
In step S506, one of the infrared images is selected by the processor 38 according to the texture details of the image corresponding to the defective area in each infrared image. In one embodiment, the processor 38 selects, for example, the infrared image whose image corresponding to the defective region has the most texture details as the image to be fused with the reference image. The processor 38 can judge the amount of texture detail according to, for example, the contrast or other image parameters of each infrared image, which is not limited thereto.
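The sketch below illustrates one possible way to rank the infrared images by texture detail inside the defective region, using the variance of the Laplacian as an assumed sharpness proxy; the disclosure does not prescribe this particular measure.

```python
import cv2
import numpy as np

def pick_infrared(ir_images, defect_mask):
    """Select the infrared image with the most texture detail in the defective region.

    Sketch: texture detail is approximated by the Laplacian variance
    inside the binary mask, which is an assumed sharpness proxy.
    """
    def texture_score(ir_gray):
        lap = cv2.Laplacian(ir_gray, cv2.CV_64F)
        return float(lap[defect_mask > 0].var())

    scores = [texture_score(ir) for ir in ir_images]
    return int(np.argmax(scores)), scores
```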
In step S508, feature extraction is performed on the selected color image and infrared image by the processor 38 to acquire a plurality of features in the color image and the infrared image, and the color image and the infrared image are aligned according to the correspondence between the extracted features. It should be noted that the above manner of feature extraction and matching is only an example; in other embodiments, the processor 38 may also use other image alignment methods to align the color image and the infrared image, which is not limited herein.
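As one assumed realization of the feature extraction and alignment of step S508, the sketch below matches ORB features between the two images and warps the infrared image with a RANSAC-estimated homography; the detector, matcher, and transform model are illustrative choices, not requirements of this disclosure.

```python
import cv2
import numpy as np

def align_ir_to_color(color_bgr, ir_gray):
    """Align the infrared image to the color image by feature matching.

    Sketch using ORB features and a RANSAC homography; assumes both
    images contain enough matchable features.
    """
    gray = cv2.cvtColor(color_bgr, cv2.COLOR_BGR2GRAY)
    orb = cv2.ORB_create(1000)
    kp1, des1 = orb.detectAndCompute(gray, None)
    kp2, des2 = orb.detectAndCompute(ir_gray, None)

    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)[:200]

    dst = np.float32([kp1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    src = np.float32([kp2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)

    h, w = gray.shape
    return cv2.warpPerspective(ir_gray, H, (w, h))
```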
In step S510, the aligned ir image is image fused with the reference image by the processor 38 to generate a scene image that complements the texture details of the defective region.
In some embodiments, the processor 38 performs image fusion of the infrared image with the reference image by, for example, calculating an average or weighted average of pixel values of corresponding pixels in the entire color image and the infrared image.
In some embodiments, processor 38 converts the color space of the reference image from the RGB color space to the YUV color space, for example, and replaces the luminance component of the converted reference image with the luminance component of the infrared image, and then converts the color space of the replaced reference image back to the RGB color space to generate the scene image. In other embodiments, the processor 38 may also convert the color space of the reference image into YCbCr, CMYK or other color spaces, and convert the color space back to the original color space after replacing the luminance component.
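A minimal sketch of this RGB-to-YUV luminance replacement is given below, assuming the infrared image is already aligned to the reference image and scaled to the same 8-bit range.

```python
import cv2

def replace_luminance(color_rgb, aligned_ir_gray):
    """Replace the luminance of the reference color image with the IR luminance.

    Sketch of the RGB -> YUV -> RGB round trip described in the text;
    assumes the IR image is aligned and 8-bit.
    """
    yuv = cv2.cvtColor(color_rgb, cv2.COLOR_RGB2YUV)
    yuv[:, :, 0] = aligned_ir_gray        # Y channel <- IR luminance
    return cv2.cvtColor(yuv, cv2.COLOR_YUV2RGB)
```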
In detail, since the luminance component of the infrared image has a better signal-to-noise ratio (SNR) and includes more texture details of the camera scene, the luminance component of the infrared image directly replaces the luminance component of the reference image, so that the texture details in the reference image can be greatly increased.
By the method, the dual-sensor camera system 30 can increase the texture details of the color image by using the infrared image, especially for the area with insufficient texture details, thereby improving the image quality of the shot image.
For example, fig. 6 is an example of the image capturing method of the dual-sensor image capturing system according to an embodiment of the invention. Referring to fig. 6, in the present embodiment, the color image 62 with the most color details is selected as the reference image by the image capturing method of fig. 5 described above. For a defective area lacking texture details in the color image 62 (e.g., the face region 62a), an infrared image 64 with the most texture details in that area is selected from the plurality of infrared images obtained under different exposure conditions and fused with the color image 62, so as to obtain a scene image 66 with both rich color details and rich texture details.
Fig. 7 is a flowchart illustrating an image capturing method of a dual sensor image capturing system according to an embodiment of the present invention. Referring to fig. 3 and fig. 7, the present embodiment further illustrates a detailed implementation of the embodiment of fusing the defect regions. The method of the present embodiment is applied to the dual-sensor imaging system 30, and the detailed steps of the imaging method of the present embodiment will be described below with reference to the components of the dual-sensor imaging system 30.
In step S702, one of the color images is selected as a reference image by the processor 38 according to the color details of the respective color images. In step S704, at least one defective area lacking texture detail is identified in the reference image by the processor 38. In step S706, one of the infrared images is selected by the processor 38 according to the texture details of the image corresponding to the defective area in each infrared image. The implementations of steps S702 to S706 are the same as or similar to those of steps S502 to S506 of the previous embodiment, and therefore the details thereof are not repeated herein.
Unlike the previous embodiment, in step S708, the processor 38 replaces the luminance component of the image of the defective region in the reference image with the luminance component corresponding to the defective region in the infrared image to generate a scene image that complements the texture details of the defective region.
In some embodiments, processor 38 converts the color space of the reference image from the RGB color space to the YUV color space, for example, and replaces the luminance component of the image of the defective region of the converted reference image with the luminance component of the infrared image corresponding to the defective region, and then converts the color space of the replaced reference image back to the RGB color space to generate the scene image. In other embodiments, the processor 38 may also convert the color space of the reference image into YCbCr, CMYK or other color spaces, and convert the color space back to the original color space after replacing the luminance component.
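The sketch below illustrates this region-limited variant of the luminance replacement, assuming the defective area is given as a binary mask of the same size as the reference image; only masked pixels take the infrared luminance.

```python
import cv2
import numpy as np

def replace_luminance_in_region(color_rgb, aligned_ir_gray, defect_mask):
    """Replace the luminance component only inside the defective region.

    Sketch: defect_mask is assumed to be a binary uint8 mask with the
    same size as the reference image; pixels outside the mask keep the
    original luminance of the color image.
    """
    yuv = cv2.cvtColor(color_rgb, cv2.COLOR_RGB2YUV)
    y = yuv[:, :, 0]
    yuv[:, :, 0] = np.where(defect_mask > 0, aligned_ir_gray, y)
    return cv2.cvtColor(yuv, cv2.COLOR_YUV2RGB)
```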
By the above method, the dual-sensor camera system 30 can complement the area with insufficient texture details in the color image by using the infrared image, thereby improving the image quality of the photographed image.
For example, fig. 8 is an example of the image capturing method of the dual-sensor image capturing system according to an embodiment of the invention. Referring to fig. 8, in the present embodiment, the color image 82 with the most color details is selected as the reference image by the image capturing method of fig. 7 described above. For a defective area lacking texture details in the color image 82 (e.g., the cola can area 82a), an infrared image 84 with the most texture details in that area is selected from the plurality of infrared images obtained under different exposure conditions, and the luminance component of the cola can area 82a is replaced by the luminance component of the corresponding cola can area 84a in the infrared image 84, so as to obtain a scene image 86 in which the cola can area 86a has richer texture details.
It should be noted that, in some embodiments, the texture details of some defective areas in the color image may not be enhanced or complemented by the infrared image because of certain factors, for example parallax between the color sensor 32 and the infrared sensor 34, which may leave part of the scene occluded from the infrared sensor 34. For such cases, the embodiment of the present invention provides an alternative way to increase the texture detail of the defective area, so as to maximize the image quality of the captured image.
Fig. 9 is a flowchart illustrating an image capturing method of a dual sensor image capturing system according to an embodiment of the present invention. Referring to fig. 3 and fig. 9, the method of the present embodiment is applied to the dual-sensor camera system 30, and the detailed steps of the camera method of the present embodiment will be described below with reference to the components of the dual-sensor camera system 30.
In step S902, at least one standard image of the imaging scene is acquired by the processor 38 controlling at least one of the color sensor 32 and the infrared sensor 34 using standard exposure conditions, and the imaging scene is recognized using these standard images. The definition of the standard exposure condition and the recognition method of the shooting scene are as described in the foregoing embodiments, and are not described herein again.
In step S904, the processor 38 controls the color sensor 32 and the infrared sensor 34 to acquire a plurality of color images and a plurality of infrared images, respectively, using a plurality of exposure conditions suitable for the identified imaging scene. In step S906, one of the color images is selected as a reference image by the processor 38 according to the color details of the respective color images. In step S908, at least one defective area in the reference image lacking texture detail is identified by the processor 38. The implementation of the steps S904 to S908 is the same as or similar to the steps S404 and S702 to S704 of the previous embodiment, and therefore the details thereof are not repeated herein.
Unlike the previous embodiment, in step S910, the processor 38 determines whether any of the infrared images includes the texture details of the defective area in the reference image. For example, the processor 38 may check whether each infrared image contains image content in the area corresponding to the defective area, so as to determine whether that part of the scene is occluded from the infrared sensor 34 and whether the infrared image can be used to fill in the texture details of the defective area in the reference image.
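One assumed way to implement the check of step S910 is sketched below: the infrared image is considered to contain usable texture details for the defective area only if the Laplacian variance inside the corresponding region exceeds a small threshold; both the measure and the threshold are illustrative assumptions.

```python
import cv2

def ir_covers_defect(ir_gray, defect_mask, min_texture=10.0):
    """Decide whether an infrared image has usable texture in the defective region.

    Sketch: an occluded or featureless region is assumed to show very
    low Laplacian variance inside the mask; the threshold is illustrative.
    """
    lap = cv2.Laplacian(ir_gray, cv2.CV_64F)
    region = lap[defect_mask > 0]
    return region.size > 0 and float(region.var()) >= min_texture
```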
If the ir image includes the texture details of the defective area, in step S912, the processor 38 replaces the luminance component of the image of the defective area in the reference image with the luminance component of the ir image corresponding to the defective area to generate a scene image for complementing the texture details of the defective area. The implementation of step S912 is the same as or similar to step S708 of the previous embodiment, and therefore the details thereof are not repeated herein.
If none of the infrared images includes the texture details of the defective region, in step S914 the processor 38 controls the color sensor 32 to acquire a plurality of color images using a plurality of exposure times longer or shorter than the exposure time of the reference image, and performs high dynamic range (HDR) processing on them to generate a scene image having the texture details of the defective area.
In some embodiments, the processor 38 controls the color sensor 32 to acquire, based on the exposure time of the selected reference image, a color image with a shorter exposure time and a color image with a longer exposure time, and then performs HDR processing on these together with the color image acquired using the original exposure time. That is, regions with better color and texture details are selected from the three color images to complement the regions with less detail in the other color images, so as to obtain an HDR image with good detail in both the bright and dark parts as the finally output scene image.
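As an illustrative stand-in for the HDR processing of step S914, the sketch below blends the bracketed color images with Mertens exposure fusion; the actual HDR algorithm used by the system is not specified by this disclosure.

```python
import cv2

def hdr_from_brackets(color_images_bgr):
    """Blend bracketed color images into one image with bright- and dark-area detail.

    Sketch using Mertens exposure fusion as an assumed example of HDR
    processing; input is a list of 8-bit BGR images of the same size.
    """
    mertens = cv2.createMergeMertens()
    fused = mertens.process(color_images_bgr)    # float result, roughly in [0, 1]
    return cv2.convertScaleAbs(fused * 255.0)    # back to 8-bit
```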
In some embodiments, the processor 38 may further perform noise reduction (NR) processing, such as two-dimensional spatial noise reduction (2DNR), on the HDR image to reduce the noise in the HDR image and improve the image quality of the finally output image.
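A minimal sketch of such a noise-reduction step is shown below, using non-local means denoising as one example of a 2D spatial filter; the filter choice and strength values are assumptions, not values taken from this disclosure.

```python
import cv2

def denoise_hdr(hdr_bgr):
    """Apply 2D spatial noise reduction to the fused HDR image.

    Sketch: non-local means denoising stands in for the 2DNR step;
    the parameters are illustrative.
    """
    return cv2.fastNlMeansDenoisingColored(hdr_bgr, None, 5, 5, 7, 21)
```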
In some embodiments, the processor 38 may combine the processing manners of steps S912 and S914 to individually select an appropriate processing manner for the plurality of defective areas in the reference image, so as to increase the details of the reference image to the maximum extent, thereby improving the image quality of the captured image.
In summary, the dual-sensor imaging system and the imaging method thereof according to the present invention separately configure the color sensor and the infrared sensor, and respectively acquire a plurality of images under a plurality of exposure conditions suitable for a current imaging scene, and select the color image and the infrared image with appropriate exposure conditions for fusion, so as to fill up or increase texture details lacking in the color image by using the infrared image, thereby generating a scene image with imaging scene details, and improving the image quality of the captured image.
Therefore, it is intended that the present disclosure not be limited to the particular embodiment disclosed, but that the present disclosure will include all modifications and equivalents falling within the spirit and scope of the present disclosure.

Claims (20)

1. A dual sensor camera system, comprising:
at least one color sensor;
at least one infrared sensor;
a storage device storing a computer program; and
a processor, coupled to the at least one color sensor, the at least one infrared sensor, and the storage device, configured to load and execute the computer program to:
identifying a camera scene of the dual-sensor camera system;
controlling the at least one color sensor and the at least one infrared sensor to respectively acquire a plurality of color images and a plurality of infrared images by adopting a plurality of exposure conditions suitable for the shooting scene;
adaptively selecting a combination of the color image and the infrared image that reveals details of the photographic scene; and
fusing the selected color image and the infrared image to generate a scene image having the details of the photographic scene.
2. The dual sensor camera system of claim 1, wherein the processor comprises:
and controlling at least one of the at least one color sensor and the at least one infrared sensor to acquire at least one standard image of the shooting scene by adopting standard exposure conditions, and identifying the scene by using the at least one standard image.
3. The dual sensor camera system of claim 1, wherein the processor comprises:
selecting one of the color images as a reference image according to the color details of each color image;
identifying at least one defective region in the reference image lacking texture detail; and
and selecting one of the infrared images according to the texture details of the image corresponding to the at least one defective area in each infrared image, and fusing the selected infrared image with the reference image.
4. The dual sensor camera system of claim 3, wherein the processor comprises:
selecting the color image with the most color details as the reference image; and
and selecting the infrared image with the most texture details of the image corresponding to the at least one defect area for fusion with the reference image.
5. The dual sensor camera system of claim 3, wherein the processor comprises:
replacing a luminance component of an image of the at least one defective region in the reference image with an image of the infrared image corresponding to the at least one defective region to generate the scene image that complements the texture details of the at least one defective region.
6. The dual sensor camera system of claim 3, wherein the processor comprises:
image fusing the selected infrared image with the reference image to generate the scene image complementing the texture details of the defective region.
7. The dual sensor camera system of claim 3, wherein the processor further comprises:
judging whether each infrared image comprises the texture details of the at least one defect area; and
and when none of the infrared images comprises the texture details, controlling the at least one color sensor to acquire a plurality of color images and execute high dynamic range processing when a plurality of exposures which are longer or shorter than the exposure time of the reference image are adopted so as to generate the scene image with the texture details of the at least one defect area.
8. The dual sensor camera system of claim 3, wherein the processor comprises:
converting the color space of the selected color image from an RGB color space to a YUV color space;
replacing the brightness component of the image of the at least one defect area of the color image after conversion with the selected image of the infrared image corresponding to the at least one defect area; and
converting the color space of the color image after the replacing back to the RGB color space to generate the scene image.
9. The dual sensor camera system of claim 1, wherein the processor comprises:
converting the color space of the selected color image from an RGB color space to a YUV color space;
replacing the luminance component of the converted color image with the selected luminance component of the infrared image; and
converting the color space of the color image after the replacing back to the RGB color space to generate the scene image.
10. The dual sensor camera system of claim 1, wherein the processor comprises:
and acquiring a plurality of characteristics in the selected color image and the selected infrared image, and aligning the color image and the infrared image according to the acquired corresponding relation between the characteristics.
11. A method of imaging in a dual sensor imaging system, the dual sensor imaging system including at least one color sensor, at least one infrared sensor, and a processor, the method comprising the steps of:
identifying a camera scene of the dual-sensor camera system;
controlling the at least one color sensor and the at least one infrared sensor to respectively acquire a plurality of color images and a plurality of infrared images by adopting a plurality of exposure conditions suitable for the shooting scene;
adaptively selecting a combination of the color image and the infrared image that reveals details of the photographic scene; and
fusing the selected color image and the infrared image to generate a scene image having the details of the photographic scene.
12. The method of claim 11, wherein identifying the camera scene of the dual sensor camera system comprises:
and controlling at least one of the at least one color sensor and the at least one infrared sensor to acquire at least one standard image of the shooting scene by adopting standard exposure conditions, and identifying the scene by using the at least one standard image.
13. The method of claim 11, wherein adaptively selecting the combination of the color image and the infrared image that reveals details of the camera scene comprises:
selecting one of the color images as a reference image according to the color details of each color image;
identifying at least one defective region in the reference image lacking texture detail; and
and selecting one of the infrared images according to the texture details of the image corresponding to the at least one defect area in each infrared image.
14. The method of claim 13, wherein adaptively selecting the combination of the color image and the infrared image that reveals details of the camera scene comprises:
selecting the color image with the most color details as the reference image; and
and selecting the infrared image with the most texture details of the image corresponding to the at least one defect area for fusion with the reference image.
15. The method of claim 13, wherein fusing the selected color image and the infrared image to generate the scene image with the details of the camera scene comprises:
replacing a luminance component of an image of the at least one defective region in the reference image with an image of the infrared image corresponding to the at least one defective region to generate the scene image that complements the texture details of the at least one defective region.
16. The method of claim 13, wherein fusing the selected color image and the infrared image to generate the scene image with the details of the camera scene comprises:
image fusing the selected infrared image with the reference image to generate the scene image complementing the texture details of the defective region.
17. The method of claim 13, wherein prior to the step of fusing the selected color image and the infrared image to generate a scene image with the details of the camera scene, the method further comprises:
judging whether each infrared image comprises the texture details of the at least one defect area; and
and when none of the infrared images comprises the texture details, controlling the at least one color sensor to acquire a plurality of color images and execute high dynamic range processing when a plurality of exposures which are longer or shorter than the exposure time of the reference image are adopted so as to generate the scene image with the texture details of the at least one defect area.
18. The method of claim 13, wherein fusing the selected color image and the infrared image to generate the scene image with the details of the camera scene comprises:
converting the color space of the selected color image from an RGB color space to a YUV color space;
replacing the brightness component of the image of the at least one defect area of the color image after conversion with the selected image of the infrared image corresponding to the at least one defect area; and
converting the color space of the color image after the replacing back to the RGB color space to generate the scene image.
19. The method of claim 11, wherein fusing the selected color image and the infrared image to generate the scene image with the details of the camera scene comprises:
converting the color space of the selected color image from an RGB color space to a YUV color space;
replacing the luminance component of the converted color image with the selected luminance component of the infrared image; and
converting the color space of the color image after the replacing back to the RGB color space to generate the scene image.
20. The method of claim 11, wherein prior to the step of fusing the selected color image and the infrared image to generate a scene image with the details of the camera scene, the method further comprises:
and acquiring a plurality of characteristics in the selected color image and the selected infrared image, and aligning the color image and the infrared image according to the acquired corresponding relation between the characteristics.
CN202011540274.1A 2020-09-04 2020-12-23 Dual-sensor imaging system and imaging method thereof Active CN114143443B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US202063074477P 2020-09-04 2020-09-04
US63/074,477 2020-09-04

Publications (2)

Publication Number Publication Date
CN114143443A true CN114143443A (en) 2022-03-04
CN114143443B CN114143443B (en) 2024-04-05

Family

ID=80438521

Family Applications (5)

Application Number Title Priority Date Filing Date
CN202011540274.1A Active CN114143443B (en) 2020-09-04 2020-12-23 Dual-sensor imaging system and imaging method thereof
CN202011541300.2A Active CN114143418B (en) 2020-09-04 2020-12-23 Dual-sensor imaging system and imaging method thereof
CN202011622478.XA Active CN114143419B (en) 2020-09-04 2020-12-30 Dual-sensor camera system and depth map calculation method thereof
CN202011625552.3A Active CN114143421B (en) 2020-09-04 2020-12-30 Dual-sensor camera system and calibration method thereof
CN202011625515.2A Pending CN114143420A (en) 2020-09-04 2020-12-30 Double-sensor camera system and privacy protection camera method thereof

Family Applications After (4)

Application Number Title Priority Date Filing Date
CN202011541300.2A Active CN114143418B (en) 2020-09-04 2020-12-23 Dual-sensor imaging system and imaging method thereof
CN202011622478.XA Active CN114143419B (en) 2020-09-04 2020-12-30 Dual-sensor camera system and depth map calculation method thereof
CN202011625552.3A Active CN114143421B (en) 2020-09-04 2020-12-30 Dual-sensor camera system and calibration method thereof
CN202011625515.2A Pending CN114143420A (en) 2020-09-04 2020-12-30 Double-sensor camera system and privacy protection camera method thereof

Country Status (2)

Country Link
CN (5) CN114143443B (en)
TW (5) TWI767468B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116091341B (en) * 2022-12-15 2024-04-02 南京信息工程大学 Exposure difference enhancement method and device for low-light image

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120189293A1 (en) * 2011-01-25 2012-07-26 Dongqing Cao Imaging devices having arrays of image sensors and lenses with multiple aperture sizes
US20170318238A1 (en) * 2014-12-10 2017-11-02 Sony Corporation Image pickup apparatus, image pickup method, program, and image processing apparatus
CN108965654A (en) * 2018-02-11 2018-12-07 浙江宇视科技有限公司 Double spectrum camera systems and image processing method based on single-sensor
CN109474770A (en) * 2017-09-07 2019-03-15 华为技术有限公司 A kind of imaging device and imaging method
CN110136183A (en) * 2018-02-09 2019-08-16 华为技术有限公司 A kind of method and relevant device of image procossing
CN111586314A (en) * 2020-05-25 2020-08-25 浙江大华技术股份有限公司 Image fusion method and device and computer storage medium
WO2020168465A1 (en) * 2019-02-19 2020-08-27 华为技术有限公司 Image processing device and method

Family Cites Families (63)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004246252A (en) * 2003-02-17 2004-09-02 Takenaka Komuten Co Ltd Apparatus and method for collecting image information
JP2005091434A (en) * 2003-09-12 2005-04-07 Noritsu Koki Co Ltd Position adjusting method and image reader with damage compensation function using the same
JP4244018B2 (en) * 2004-03-25 2009-03-25 ノーリツ鋼機株式会社 Defective pixel correction method, program, and defective pixel correction system for implementing the method
JP4341680B2 (en) * 2007-01-22 2009-10-07 セイコーエプソン株式会社 projector
US9307212B2 (en) * 2007-03-05 2016-04-05 Fotonation Limited Tone mapping for low-light video frame enhancement
US8902321B2 (en) * 2008-05-20 2014-12-02 Pelican Imaging Corporation Capturing and processing of images using monolithic camera array with heterogeneous imagers
US8866920B2 (en) * 2008-05-20 2014-10-21 Pelican Imaging Corporation Capturing and processing of images using monolithic camera array with heterogeneous imagers
CN101404060B (en) * 2008-11-10 2010-06-30 北京航空航天大学 Human face recognition method based on visible light and near-infrared Gabor information amalgamation
US8749635B2 (en) * 2009-06-03 2014-06-10 Flir Systems, Inc. Infrared camera systems and methods for dual sensor applications
WO2010104490A1 (en) * 2009-03-12 2010-09-16 Hewlett-Packard Development Company, L.P. Depth-sensing camera system
US20120154596A1 (en) * 2009-08-25 2012-06-21 Andrew Augustine Wajs Reducing noise in a color image
JP2013115679A (en) * 2011-11-30 2013-06-10 Fujitsu General Ltd Imaging apparatus
US10848731B2 (en) * 2012-02-24 2020-11-24 Matterport, Inc. Capturing and aligning panoramic image and depth data
TW201401186A (en) * 2012-06-25 2014-01-01 Psp Security Co Ltd System and method for identifying human face
CN104662897A (en) * 2012-09-25 2015-05-27 日本电信电话株式会社 Image encoding method, image decoding method, image encoding device, image decoding device, image encoding program, image decoding program, and recording medium
KR102070778B1 (en) * 2012-11-23 2020-03-02 엘지전자 주식회사 Rgb-ir sensor with pixels array and apparatus and method for obtaining 3d image using the same
CN105009568B (en) * 2012-12-21 2018-12-18 菲力尔系统公司 For handling the system of visible spectrum image and infrared image, method and the readable medium of non-volatile machine
TWM458748U (en) * 2012-12-26 2013-08-01 Chunghwa Telecom Co Ltd Image type depth information retrieval device
JP6055681B2 (en) * 2013-01-10 2016-12-27 株式会社 日立産業制御ソリューションズ Imaging device
CN104661008B (en) * 2013-11-18 2017-10-31 深圳中兴力维技术有限公司 The treating method and apparatus that color image quality is lifted under low light conditions
CN104021548A (en) * 2014-05-16 2014-09-03 中国科学院西安光学精密机械研究所 Method for acquiring 4D scene information
US9516295B2 (en) * 2014-06-30 2016-12-06 Aquifi, Inc. Systems and methods for multi-channel imaging based on multiple exposure settings
JP6450107B2 (en) * 2014-08-05 2019-01-09 キヤノン株式会社 Image processing apparatus, image processing method, program, and storage medium
CN107431760B (en) * 2015-03-31 2018-08-28 富士胶片株式会社 The image processing method and storage medium of photographic device, photographic device
WO2016192437A1 (en) * 2015-06-05 2016-12-08 深圳奥比中光科技有限公司 3d image capturing apparatus and capturing method, and 3d image system
JP2017011634A (en) * 2015-06-26 2017-01-12 キヤノン株式会社 Imaging device, control method for the same and program
CN105049829B (en) * 2015-07-10 2018-12-25 上海图漾信息科技有限公司 Optical filter, imaging sensor, imaging device and 3-D imaging system
CN105069768B (en) * 2015-08-05 2017-12-29 武汉高德红外股份有限公司 A kind of visible images and infrared image fusion processing system and fusion method
US10523855B2 (en) * 2015-09-24 2019-12-31 Intel Corporation Infrared and visible light dual sensor imaging system
TW201721269A (en) * 2015-12-11 2017-06-16 宏碁股份有限公司 Automatic exposure system and auto exposure method thereof
JP2017112401A (en) * 2015-12-14 2017-06-22 ソニー株式会社 Imaging device, apparatus and method for image processing, and program
CN206117865U (en) * 2016-01-16 2017-04-19 上海图漾信息科技有限公司 Range data monitoring device
JP2017163297A (en) * 2016-03-09 2017-09-14 キヤノン株式会社 Imaging apparatus
KR101747603B1 (en) * 2016-05-11 2017-06-16 재단법인 다차원 스마트 아이티 융합시스템 연구단 Color night vision system and operation method thereof
CN106815826A (en) * 2016-12-27 2017-06-09 上海交通大学 Night vision image Color Fusion based on scene Recognition
CN108280807A (en) * 2017-01-05 2018-07-13 浙江舜宇智能光学技术有限公司 Monocular depth image collecting device and system and its image processing method
JP6974873B2 (en) * 2017-02-06 2021-12-01 フォトニック センサーズ アンド アルゴリズムス,エセ.エレ. Devices and methods for retrieving depth information from the scene
CN108419062B (en) * 2017-02-10 2020-10-02 杭州海康威视数字技术股份有限公司 Image fusion apparatus and image fusion method
CN109712102B (en) * 2017-10-25 2020-11-27 杭州海康威视数字技术股份有限公司 Image fusion method and device and image acquisition equipment
CN107846537B (en) * 2017-11-08 2019-11-26 维沃移动通信有限公司 A kind of CCD camera assembly, image acquiring method and mobile terminal
CN112788249B (en) * 2017-12-20 2022-12-06 杭州海康威视数字技术股份有限公司 Image fusion method and device, electronic equipment and computer readable storage medium
US10748247B2 (en) * 2017-12-26 2020-08-18 Facebook, Inc. Computing high-resolution depth images using machine learning techniques
US10757320B2 (en) * 2017-12-28 2020-08-25 Waymo Llc Multiple operating modes to expand dynamic range
TWI661726B (en) * 2018-01-09 2019-06-01 呂官諭 Image sensor with enhanced image recognition and application
CN110572583A (en) * 2018-05-18 2019-12-13 杭州海康威视数字技术股份有限公司 method for shooting image and camera
CN108961195B (en) * 2018-06-06 2021-03-23 Oppo广东移动通信有限公司 Image processing method and device, image acquisition device, readable storage medium and computer equipment
JP6574878B2 (en) * 2018-07-19 2019-09-11 キヤノン株式会社 Image processing apparatus, image processing method, imaging apparatus, program, and storage medium
JP7254461B2 (en) * 2018-08-01 2023-04-10 キヤノン株式会社 IMAGING DEVICE, CONTROL METHOD, RECORDING MEDIUM, AND INFORMATION PROCESSING DEVICE
CN109035193A (en) * 2018-08-29 2018-12-18 成都臻识科技发展有限公司 A kind of image processing method and imaging processing system based on binocular solid camera
US11689785B2 (en) * 2018-09-14 2023-06-27 Zhejiang Uniview Technologies Co., Ltd. Dual-spectrum image automatic exposure method and apparatus, and dual-spectrum image camera and machine storage medium
JP2020052001A (en) * 2018-09-28 2020-04-02 パナソニックIpマネジメント株式会社 Depth acquisition device, depth acquisition method, and program
US11176694B2 (en) * 2018-10-19 2021-11-16 Samsung Electronics Co., Ltd Method and apparatus for active depth sensing and calibration method thereof
CN109636732B (en) * 2018-10-24 2023-06-23 深圳先进技术研究院 Hole repairing method of depth image and image processing device
CN110248105B (en) * 2018-12-10 2020-12-08 浙江大华技术股份有限公司 Image processing method, camera and computer storage medium
US11120536B2 (en) * 2018-12-12 2021-09-14 Samsung Electronics Co., Ltd Apparatus and method for determining image sharpness
US10972649B2 (en) * 2019-02-27 2021-04-06 X Development Llc Infrared and visible imaging system for device identification and tracking
JP7316809B2 (en) * 2019-03-11 2023-07-28 キヤノン株式会社 Image processing device, image processing device control method, system, and program
CN110349117B (en) * 2019-06-28 2023-02-28 重庆工商大学 Infrared image and visible light image fusion method and device and storage medium
CN110706178B (en) * 2019-09-30 2023-01-06 杭州海康威视数字技术股份有限公司 Image fusion device, method, equipment and storage medium
CN111524175A (en) * 2020-04-16 2020-08-11 东莞市东全智能科技有限公司 Depth reconstruction and eye movement tracking method and system for asymmetric multiple cameras
CN111540003A (en) * 2020-04-27 2020-08-14 浙江光珀智能科技有限公司 Depth image generation method and device
CN111383206B (en) * 2020-06-01 2020-09-29 浙江大华技术股份有限公司 Image processing method and device, electronic equipment and storage medium
IN202021032940A (en) * 2020-07-31 2020-08-28 .Us Priyadarsan

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120189293A1 (en) * 2011-01-25 2012-07-26 Dongqing Cao Imaging devices having arrays of image sensors and lenses with multiple aperture sizes
US20170318238A1 (en) * 2014-12-10 2017-11-02 Sony Corporation Image pickup apparatus, image pickup method, program, and image processing apparatus
CN109474770A (en) * 2017-09-07 2019-03-15 华为技术有限公司 A kind of imaging device and imaging method
CN110136183A (en) * 2018-02-09 2019-08-16 华为技术有限公司 A kind of method and relevant device of image procossing
CN108965654A (en) * 2018-02-11 2018-12-07 浙江宇视科技有限公司 Double spectrum camera systems and image processing method based on single-sensor
WO2020168465A1 (en) * 2019-02-19 2020-08-27 华为技术有限公司 Image processing device and method
CN111586314A (en) * 2020-05-25 2020-08-25 浙江大华技术股份有限公司 Image fusion method and device and computer storage medium

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
王光霞; 冯华君; 徐之海; 李奇; 陈跃庭: "基于块匹配的低光度图像对融合方法" (A block-matching-based fusion method for low-light image pairs), 光子学报 (Acta Photonica Sinica), no. 04 *

Also Published As

Publication number Publication date
CN114143418B (en) 2023-12-01
TWI767484B (en) 2022-06-11
TWI797528B (en) 2023-04-01
TW202211674A (en) 2022-03-16
TWI767468B (en) 2022-06-11
TWI778476B (en) 2022-09-21
CN114143418A (en) 2022-03-04
CN114143443B (en) 2024-04-05
CN114143421A (en) 2022-03-04
TW202211673A (en) 2022-03-16
CN114143421B (en) 2024-04-05
TW202211165A (en) 2022-03-16
CN114143419B (en) 2023-12-26
CN114143420A (en) 2022-03-04
CN114143419A (en) 2022-03-04
TW202211160A (en) 2022-03-16
TW202211161A (en) 2022-03-16
TWI764484B (en) 2022-05-11

Similar Documents

Publication Publication Date Title
US10574961B2 (en) Image processing apparatus and image processing method thereof
US8106965B2 (en) Image capturing device which corrects a target luminance, based on which an exposure condition is determined
KR101360543B1 (en) Automatic backlight detection
US8614751B2 (en) Image processing apparatus and image processing method
US11689822B2 (en) Dual sensor imaging system and privacy protection imaging method thereof
US9036046B2 (en) Image processing apparatus and method with white balance correction
US20200228770A1 (en) Lens rolloff assisted auto white balance
US11838648B2 (en) Image processing device, imaging apparatus, image processing method, and program for determining a condition for high dynamic range processing
US11496694B2 (en) Dual sensor imaging system and imaging method thereof
CN113691795A (en) Image processing apparatus, image processing method, and storage medium
CN114143443B (en) Dual-sensor imaging system and imaging method thereof
CN110677558B (en) Image processing method and electronic device
US20200228769A1 (en) Lens rolloff assisted auto white balance
US11496660B2 (en) Dual sensor imaging system and depth map calculation method thereof
US11568526B2 (en) Dual sensor imaging system and imaging method thereof
KR101710629B1 (en) Image pickup apparatus and image pickup method
JP2017034536A (en) Image processing apparatus, image processing method, and program
KR102068164B1 (en) Imaging device imaging method and imaging program thereof
Koskiranta Improving Automatic Imaging Algorithms with Dual Camera System

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant