CN118264914A - Image pickup device - Google Patents

Image pickup device

Info

Publication number
CN118264914A
Authority
CN
China
Prior art keywords
image
image sensor
resolution
optical signal
lens
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202410417378.5A
Other languages
Chinese (zh)
Inventor
黄进新
汪鹏程
刘军
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd filed Critical Huawei Technologies Co Ltd
Publication of CN118264914A publication Critical patent/CN118264914A/en
Pending legal-status Critical Current

Abstract

The embodiment of the application discloses an imaging device using an asymmetric pair of image sensors with different resolutions. A low-resolution image sensor generates a color image, while a high-resolution image sensor generates a color image or a grayscale image. The color information and brightness information of the images generated by the two image sensors are acquired and fused to generate a target image, improving the imaging quality of the camera in low-illumination environments.

Description

Image pickup device
The present application is a divisional application. The application number of the original application is 202080000505.1, the original filing date is March 20, 2020, and the entire content of the original application is incorporated herein by reference.
Technical Field
The embodiment of the application relates to the field of security monitoring, in particular to an image pickup device.
Background
A low-light scene (low illumination scene / low light scene) is a scene with insufficient light, such as outdoors at night or a poorly lit interior. In the security monitoring field, to obtain sharp, richly colored images in low-illumination scenes, a camera is often paired with a visible or infrared fill light. However, a visible fill light tends to cause light pollution and works against covert monitoring, while an infrared fill light images clearly but cannot record color. In recent years, the industry has widely adopted a dual-light fusion architecture: the camera uses two sensors to image infrared light and visible light separately, then fuses the infrared image with the visible-light image, improving the camera's imaging capability under low illumination.
Specifically, a beam splitting prism is provided in the video camera, which splits incident light into visible light and infrared light according to spectrum. The camera then directs the visible light and the infrared light to two identical image sensors. The sensor receiving visible light outputs a color image, the sensor receiving infrared light outputs a grayscale image, and the two images have the same size and resolution. The camera fuses the color image and the grayscale image into a target image whose details and textures come mainly from the grayscale image and whose color information comes from the color image.
In the foregoing scheme, when the ambient illuminance falls below the imaging signal-to-noise lower limit of the visible-light image sensor, the color information of the color image is swamped by noise, so the fused output has low color saturation and may even degrade to a grayscale image. Under the current dual-light fusion architecture, it is therefore important to further reduce the lower limit of the camera's working illuminance.
Disclosure of Invention
The embodiment of the application provides an image pickup device which is used for outputting higher-quality images in a low-illumination environment and reducing the lower limit of the working illumination of the image pickup device.
In a first aspect, an embodiment of the present application provides an image capturing apparatus including: an optical assembly, a first image sensor, a second image sensor, and an image processor. The resolution of the first image sensor is less than the resolution of the second image sensor. The optical component is used for receiving the incident light signal and processing the incident light signal into a first light signal and a second light signal. The first image sensor is used for generating a first image by sensing the first optical signal; meanwhile, the second image sensor is used for generating a second image by sensing the second optical signal. Wherein the image information of the first image includes first color information and first luminance information; the image information of the second image includes second luminance information. In addition, the resolution of the first image is less than the resolution of the second image. The image processor is used for generating a target image based on the first image and the second image. The color and brightness of the target image are determined by the image information of the first image and the image information of the second image.
In the embodiment of the application, the image pickup device adopts two image sensors with different resolutions. For a fixed sensor area, image resolution is inversely related to color sensitivity and positively related to image sharpness. Therefore, the first image output by the low-resolution first image sensor has higher color sensitivity, which helps ensure faithful color in the first image. The high-resolution second image sensor has more pixels, so the second image it outputs has higher definition and can present rich details and textures. The target image generated from the two images retains the advantages of both, allowing the image pickup apparatus to operate in environments of lower illumination intensity.
In a first implementation manner of the first aspect of the embodiment of the present application, according to the first aspect, the image processor is specifically configured to: adjusting the resolution of the first image to be the same as the resolution of the second image to obtain a third image; then, the third image is fused with the second image to obtain a target image. The third image carries the first color information and the first brightness information.
In this embodiment, since the resolution of the first image is smaller than the resolution of the second image, the resolution of the first image is adjusted to be the same as the resolution of the second image, and it can be understood that the resolution of the first image is increased to the resolution of the second image to obtain the third image. Since the first image has a high color sensitivity, a true color and a high brightness, the third image also has a true color and a high brightness. Therefore, the target image determined based on the aforementioned third image and second image can retain the advantages of the third image and second image, and therefore, the image pickup apparatus can be operated in an environment of lower illumination intensity.
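The resolution-adjustment step that produces the third image can be sketched as follows. This is an illustrative stand-in only: the embodiment does not specify the algorithm, and a practical implementation would more likely use bicubic upsampling or a super-resolution method.

```python
import numpy as np

def upscale_nearest(img: np.ndarray, factor: int) -> np.ndarray:
    """Upscale an H x W x C image by an integer factor with
    nearest-neighbour replication, a simple stand-in for the
    unspecified upsampling step that yields the third image."""
    return img.repeat(factor, axis=0).repeat(factor, axis=1)

# A 2x2 low-resolution "first image" brought to the 4x4 grid of the
# "second image"; its colour values are carried over unchanged.
first = np.array([[[255, 0, 0], [0, 255, 0]],
                  [[0, 0, 255], [255, 255, 0]]], dtype=np.uint8)
third = upscale_nearest(first, 2)
```

The resulting third image matches the second image's pixel grid while keeping the first image's color and brightness, which is the property the fusion step relies on.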
In a second implementation manner of the first aspect according to the first aspect or the first implementation manner of the first aspect, the first optical signal includes visible light, the second optical signal includes visible light and infrared light, energy of the visible light in the first optical signal is greater than energy of the visible light in the second optical signal, and a frequency band of the visible light in the first optical signal is the same as a frequency band of the visible light in the second optical signal.
The ratio between the energy of the visible light in the first optical signal and the energy of the visible light in the second optical signal can be flexibly controlled according to practical application requirements.
In the present embodiment, the processing of the incident optical signal by the aforementioned optical component includes not only frequency (spectral) division but also energy division. Energy division means splitting the visible light in the incident signal so that the intensity of visible light in the first optical signal differs from that in the second optical signal. Because the two energies differ, the two image sensors obtain images of different brightness when the illumination intensity is above a preset value (i.e., in daytime), and determining the target image from two images of different brightness helps improve the dynamic range.
In a third implementation manner of the first aspect of the embodiment of the present application, according to the second implementation manner of the first aspect, the first image sensor is a color image sensor, and the second image sensor is a black-and-white image sensor.
In the present embodiment, a black-and-white image sensor is used as the second image sensor. Its filter layer transmits more light than the color filter matrix of a color image sensor of the same specification, giving it higher photoelectric conversion efficiency. The brightness of the second image output by the second image sensor is therefore improved, which improves the quality of the target image and allows the imaging device to operate in environments with lower illumination intensity.
According to the third implementation manner of the first aspect, in a fourth implementation manner of the first aspect of the embodiment of the present application, the image capturing device further includes an infrared cut-off filter. The image capturing device is further configured to enable the infrared cut-off filter when the illumination intensity is above a preset value; the infrared cut-off filter is located between the optical component and the second image sensor and filters out the infrared light in the second optical signal. The second image sensor is specifically configured to generate the second image by sensing the visible light in the second optical signal. The image processor is specifically configured to combine the first color information of the first image with the second brightness information of the second image to obtain the target image, where the color of the target image is determined by the first color information and the brightness of the target image is determined by the second brightness information.
In this embodiment, the infrared cut-off filter filters out the infrared light in the second optical signal under high illumination intensity, removing the influence of infrared light when the second image sensor senses visible light. Because the second image sensor is a black-and-white image sensor, the second image, generated by sensing only the visible light in the second optical signal, exhibits brightness but no color. Although the resolutions of the first and second images differ, the first image provides rich color (the first color information) and the second image provides bright texture detail (the second brightness information). Thus, in some cases, part or all of the first color information in the first image and part or all of the second brightness information in the second image may be combined to obtain the target image, for example when the brightness indicated by the first brightness information equals the brightness indicated by the second brightness information.
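In a YUV representation, "color from the first image, brightness from the second" reduces to a plane swap. The sketch below assumes the upscaled color image (the third image) and the monochrome second image already share the same resolution; it illustrates the idea only and is not the combination algorithm of the embodiment.

```python
import numpy as np

def combine_color_and_luma(third_yuv: np.ndarray, second_y: np.ndarray) -> np.ndarray:
    """Build the target image by taking the Y (luma) plane from the
    high-resolution monochrome second image and the U/V (chroma)
    planes from the upscaled colour third image. Both inputs must
    already share the same spatial resolution."""
    target = third_yuv.copy()
    target[..., 0] = second_y     # brightness from the second image
    return target                 # colour (U/V) kept from the third image

# Toy 4x4 frames: flat chroma from the colour path, uniform bright
# luma from the monochrome path.
third_yuv = np.zeros((4, 4, 3), dtype=np.uint8)
third_yuv[..., 1] = 100          # U plane (colour)
third_yuv[..., 2] = 200          # V plane (colour)
second_y = np.full((4, 4), 180, dtype=np.uint8)
target = combine_color_and_luma(third_yuv, second_y)
```

This exploits the fact, noted later in the description, that human vision is more sensitive to luminance detail than to chroma resolution.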
According to a third implementation manner of the first aspect, in a fifth implementation manner of the first aspect of the embodiment of the present application, the second image sensor is specifically configured to sense visible light and infrared light in the second optical signal and generate the second image when the illumination intensity is lower than a preset value. The image processor is specifically configured to fuse the third image with the second image to obtain the target image.
In the present embodiment, the second image sensor senses infrared light and part of the visible light simultaneously when the illumination intensity is low. Because the second image sensor is a black-and-white image sensor, the second image it outputs has only brightness. However, since the sensor receives visible light in addition to infrared light, the resulting second image is brighter than one produced by sensing infrared light alone. This improves the quality of the second image and, in turn, the quality of the target image.
According to a second implementation manner of the first aspect, in a sixth implementation manner of the first aspect of the embodiment of the present application, the first image sensor and the second image sensor are both color image sensors.
According to the sixth implementation manner of the first aspect, in a seventh implementation manner of the first aspect of the embodiment of the present application, the image capturing device further includes an infrared cut-off filter. The image capturing device is further configured to enable the infrared cut-off filter when the illumination intensity is above a preset value; the infrared cut-off filter is located between the optical component and the second image sensor and filters out the infrared light in the second optical signal. The second image sensor is specifically configured to generate the second image by sensing the visible light in the second optical signal, and the image information of the second image further includes second color information. The image processor is specifically configured to fuse the third image with the second image to obtain the target image.
In this embodiment, the infrared cut-off filter filters out the infrared light in the second optical signal under high illumination intensity, removing the influence of infrared light when the second image sensor senses visible light. Because the second image sensor is a color image sensor, the second image generated from the visible light includes not only the second brightness information but also the second color information. Further, since the energies of the visible light sensed by the first and second image sensors differ, the third image and the second image have different brightness. Determining the target image from two images of different brightness improves both the quality of the target image and the dynamic range.
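Fusing two frames of different brightness to extend dynamic range can be sketched with a per-pixel "well-exposedness" weight, in the spirit of classic exposure fusion. This is an illustrative sketch with arbitrary parameters, not the fusion algorithm of the embodiment.

```python
import numpy as np

def fuse_exposures(bright: np.ndarray, dark: np.ndarray) -> np.ndarray:
    """Per-pixel weighted average of two differently exposed frames.
    Pixels near mid-grey receive the larger weight (a crude
    well-exposedness measure); sigma = 0.2 is an arbitrary choice."""
    b = bright.astype(np.float64) / 255.0
    d = dark.astype(np.float64) / 255.0
    sigma = 0.2
    wb = np.exp(-((b - 0.5) ** 2) / (2 * sigma ** 2))
    wd = np.exp(-((d - 0.5) ** 2) / (2 * sigma ** 2))
    fused = (wb * b + wd * d) / (wb + wd)
    return np.clip(fused * 255.0, 0, 255).astype(np.uint8)

# Where the bright frame is blown out (255) but the dark frame is
# well exposed (128), the fused pixel stays close to the dark frame.
bright = np.full((2, 2), 255, dtype=np.uint8)
dark = np.full((2, 2), 128, dtype=np.uint8)
fused = fuse_exposures(bright, dark)
```

In a real pipeline the weighting would typically be applied per channel or on the luma plane, with smoothing across pixels to avoid seams.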
According to a sixth implementation manner of the first aspect, in an eighth implementation manner of the first aspect of the embodiment of the present application, the image capturing device further includes a visible light cut filter. The camera device is further used for starting the visible light cut-off filter when the illumination intensity is lower than a preset value, the visible light cut-off filter is located between the optical assembly and the second image sensor, and the visible light cut-off filter is used for filtering visible light in the second light signal. The second image sensor is specifically configured to generate the second image by sensing infrared light in the second optical signal. The image processor is specifically configured to fuse the third image with the second image to obtain the target image.
In this embodiment, the visible light cut-off filter filters out the visible light in the second optical signal when the illumination intensity is low, reducing the influence of visible light when the second image sensor senses infrared light. Although the second image sensor is a color image sensor, the filtered second optical signal contains only infrared light, so the second image contains only the second brightness information and no color information.
In a ninth implementation form of the first aspect as such or according to any of the first implementation form of the first aspect to the eighth implementation form of the first aspect, the optical assembly comprises a lens and a beam splitting prism, the beam splitting prism being located between the lens and the image sensor. The lens is used for receiving the incident light signal. The beam splitter prism is used for dividing the incident light signal received by the lens into the first light signal and the second light signal.
According to a ninth implementation manner of the first aspect, in a tenth implementation manner of the first aspect of the embodiment of the present application, the lens is an infrared confocal lens.
According to the first aspect, any one of the first implementation manner to the eighth implementation manner of the first aspect, in an eleventh implementation manner of the first aspect, the image capturing device further includes an infrared cut filter. The optical assembly comprises a first lens and a second lens, wherein the first lens and the second lens are used for receiving the incident light signals together, the focal length of the first lens is the same as that of the second lens, the aperture of the first lens is larger than that of the second lens, an infrared cut-off filter is arranged between the first lens and the first image sensor, and the second lens is an infrared confocal lens. The first lens is used for receiving a part of the incident light signals and transmitting the received light signals to the infrared cut-off filter. The infrared cut-off filter is used for filtering infrared light in the optical signal from the first lens to obtain the first optical signal and transmitting the first optical signal to the first image sensor. The second lens is used for receiving the rest part of the incident light signals and transmitting the received light signals to the second image sensor as second light signals.
In the present embodiment, the energy of the optical signals reaching the different image sensors is made different by using two lenses with different apertures. Because a larger aperture admits more light, the energy of the visible light in the first optical signal output by the first lens is greater than that in the second optical signal output by the second lens. In addition, the infrared cut-off filter arranged between the first lens and the first image sensor ensures that the first optical signal contains only visible light and no infrared light.
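The aperture-to-light relationship follows from standard optics: for two lenses of equal focal length, gathered light scales with aperture area, so the ratio is (D1/D2)^2 = (N2/N1)^2, where N is the f-number (focal length divided by aperture diameter). The f-numbers below are illustrative values, not taken from the patent.

```python
def flux_ratio(f_number_1: float, f_number_2: float) -> float:
    """How many times more light lens 1 gathers than lens 2 per unit
    exposure, for equal focal lengths (area scales as 1 / N^2)."""
    return (f_number_2 / f_number_1) ** 2

# A hypothetical fast f/1.4 first lens versus an f/2.8 second lens:
# the first lens admits four times the light.
ratio = flux_ratio(1.4, 2.8)
```

This quantifies why the visible-light energy in the first optical signal exceeds that in the second when the first lens has the larger aperture.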
In a twelfth implementation manner of the first aspect of the embodiment of the present application, according to the first aspect or any one of the foregoing implementation manners, the resolution of the second image is equal to the resolution of the target image.
In a second aspect, an embodiment of the present application provides an image processor coupled to a memory in an image capture device. The memory is used for storing data or programs processed by the processor, such as the first image, the second image and the third image; the image processor is used for calling the program in the memory to perform image processing on the first image, the second image and the third image.
From the above technical solutions, the embodiment of the present application has the following advantages:
In the embodiment of the application, the image pickup device adopts two image sensors with different resolutions. For a fixed sensor area, image resolution is inversely related to color sensitivity and positively related to image sharpness. Therefore, the first image output by the low-resolution first image sensor has higher color sensitivity, which helps ensure faithful color in the first image. The high-resolution second image sensor has more pixels, so the second image it outputs has higher definition and can present rich details and textures. The target image determined from the two images retains the advantages of both, allowing the image pickup apparatus to operate in environments of lower illumination intensity.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are needed in the description of the embodiments will be briefly introduced below, and it is obvious that the drawings in the description below are only some embodiments of the present application.
FIG. 1A is a schematic diagram of an embodiment of an image capturing apparatus according to an embodiment of the present application;
fig. 1B is a schematic diagram of another embodiment of an image capturing apparatus according to the present application;
fig. 1C is a schematic diagram of another embodiment of an image capturing apparatus according to the present application;
fig. 2A is a schematic diagram of another embodiment of an image capturing apparatus according to the present application;
FIG. 2B is a schematic diagram of an embodiment of an image processing procedure according to an embodiment of the present application;
FIG. 2C is a schematic diagram of another embodiment of an image processing procedure according to an embodiment of the present application;
FIG. 3A is a schematic diagram of another embodiment of an image capturing apparatus according to an embodiment of the present application;
FIG. 3B is a schematic diagram of another embodiment of an image processing procedure according to an embodiment of the present application;
fig. 3C is a schematic diagram of another embodiment of an image processing procedure according to an embodiment of the present application.
Detailed Description
The following description of the embodiments of the present application will be made clearly and completely with reference to the accompanying drawings, in which it is apparent that the embodiments described are only some embodiments of the present application, but not all embodiments.
The terms "first," "second," "third," "fourth" and the like in the description and in the claims and in the above drawings, if any, are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate such that the embodiments described herein may be implemented in other sequences than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
The embodiment of the application provides an image pickup device which is used for outputting higher-quality images in a low-illumination environment and reducing the lower limit of the working illumination of the image pickup device.
For easy understanding, the application scenario of the image capturing apparatus provided by the embodiment of the present application is described below:
the image pickup device provided by the embodiment of the application can be applied to imaging in a low-illumination (low light) environment. Specifically, a low-illuminance environment is one in which the illumination intensity is below a certain value, usually measured as the energy of visible light received per unit area of the image sensor, in lux (Lux, abbreviated Lx). Generally, an illumination environment above 0 Lux and below 1 Lux may be called a low-illuminance environment. For example, the low-light environment may be a dim outdoor street, such as a street at night or on an overcast, rainy day, or an indoor space with only weak light, such as a store or warehouse; the application is not limited in this regard.
In a low-illuminance scene, when the light-sensitive area of the image sensor is fixed, the color sensitivity of the output image is inversely related to its resolution. Considering that the image sensor can provide higher spatial resolution for grayscale images generated by sensing infrared light, and that the human visual system is more sensitive to brightness information than to the spatial resolution of color, some spatial resolution of the color image can be sacrificed to improve color sensitivity, thereby lowering the working-illuminance limit of the image pickup device. For example, if the original camera requires at least 1 Lux and can hardly obtain an acceptable color image below that, the scheme provided by the embodiment of the application can reduce the lower limit of the working illuminance to 0.1 Lux or even 0.01 Lux, without limitation in specific applications.
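The resolution-versus-sensitivity trade-off can be illustrated with a simple shot-noise model: aggregating a 2x2 block of photosites into one lower-resolution pixel quadruples the collected signal, and for Poisson-distributed photon noise the signal-to-noise ratio (mean over standard deviation) roughly doubles. The photon count below is an arbitrary illustrative value, not a figure from the patent.

```python
import numpy as np

rng = np.random.default_rng(0)
lam = 5.0                                       # mean photons per photosite (dim scene)
hi_res = rng.poisson(lam, size=(512, 512)).astype(np.float64)

# One low-resolution pixel aggregates a 2x2 block of photosites,
# trading spatial resolution for signal.
lo_res = hi_res.reshape(256, 2, 256, 2).sum(axis=(1, 3))

snr_hi = hi_res.mean() / hi_res.std()           # ~ sqrt(5)
snr_lo = lo_res.mean() / lo_res.std()           # ~ sqrt(20), about double
```

This is the statistical intuition behind pairing a low-resolution sensor (better color sensitivity per pixel) with a high-resolution sensor (better detail).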
An image pickup apparatus according to an embodiment of the present application is described below. As shown in fig. 1A, the image pickup apparatus 10 (for example, a video camera) includes: a first image sensor 101, a second image sensor 102, an optical assembly 103, and an image processor 104. The optical assembly 103 is configured to receive an incident light signal, emitted by the object photographed by the image pickup apparatus 10, and process it into a first light signal and a second light signal. The optical assembly 103 is also configured to direct the first optical signal to the first image sensor 101 and the second optical signal to the second image sensor 102. The first image sensor 101 is configured to generate a first image by sensing the first light signal; the first image is a color image whose image information includes first color information and first luminance information. The second image sensor 102 is configured to generate a second image by sensing the second light signal; the second image is a color image or a grayscale image whose image information includes second luminance information (and, when the second image is a color image, second color information). Further, the resolution of the first image sensor 101 is smaller than the resolution of the second image sensor 102. The image processor 104 is configured to generate a target image based on the first image and the second image, the color and brightness of the target image being determined by the image information of the two images. For example, the image processor 104 may be a system on chip (SoC).
It should be appreciated that, since the resolution of the first image sensor 101 is less than that of the second image sensor 102, the resolution of the first image is less than that of the second image. It should also be appreciated that the first image sensor 101 is a color image sensor, since the first image contains first color information, i.e., can present color. The second image sensor 102 may be a color image sensor or a black-and-white image sensor, which is not limited here.
In the present embodiment, the image pickup apparatus 10 employs two image sensors having different resolutions. For a fixed sensor area, image resolution is inversely related to color sensitivity and positively related to image sharpness. The first image output by the low-resolution first image sensor 101 therefore has high color sensitivity, which helps ensure faithful color in the first image; the high-resolution second image sensor 102 has more pixels, so the second image it outputs has higher definition and presents rich details and textures. The target image determined from the two images retains the advantages of both, allowing the image pickup apparatus to operate in environments of lower illumination intensity.
Based on the foregoing embodiments, the process of determining the target image by the image processor 104 may include the following:
First, the image processor 104 adjusts the resolution of the first image to be the same as the resolution of the second image, resulting in a third image. Then, the image processor 104 determines the target image based on the third image and the second image. Optionally, the resolution of the target image is equal to the resolution of the second image. In the foregoing process, although the resolution of the third image is different from the resolution of the first image, the third image exhibits a color and brightness from the first image. Therefore, the process of adjusting the low-resolution first image to the high-resolution third image can not only keep the color and brightness of the first image, but also facilitate the determination of the target image based on the third image and the second image.
It should be understood that, in this embodiment, "the same resolution" may mean the two resolutions are strictly identical, or that they differ slightly but by too little to affect subsequent processing of the two images; this is not specifically limited here.
It should be appreciated that, in adjusting the resolution of the first image to match that of the second image, the image processor 104 may employ an upsampling algorithm or a super-resolution algorithm, which is not limited here. In addition, the process by which the image processor 104 determines the target image from the third image and the second image mainly refers to image fusion: an image processing technique that combines two or more images into a new image using a specific algorithm, such that the combined image retains the best characteristics of the original images, such as brightness, sharpness, and color.
Optionally, the image capturing apparatus 10 further includes an image signal processor (image signal processor, ISP) chip (not shown) located between the aforementioned image sensors and the image processor 104. Alternatively, the ISP chip may perform two-way processing, processing the first image output by the first image sensor 101 and the second image output by the second image sensor 102, and sending the processed first image and second image to the image processor 104 for subsequent processing. Alternatively, the ISP chip may perform a plurality of ISP processes on the first image and the second image, for example, 3D noise reduction, demosaicing, brightness correction, and color correction. The basic image processing may be adjusted according to practical application requirements, and its content is not specifically limited herein.
In the present embodiment and the subsequent embodiments, any image from the output of the first image sensor 101 up to the image before the resolution is adjusted is referred to as the first image. Similarly, any image from the output of the second image sensor 102 up to the image before image fusion or combination is performed is referred to as the second image. That is, in practical applications, the original image output by an image sensor may undergo various processing flows, and the embodiment of the present application does not limit the foregoing processing flows. In other words, if the RAW images output by the image sensors are directly fused in the subsequent step, the first image is in RAW format; if the RGB images output by the ISP chip are fused, the first image is in RGB format; if the YUV images output by the ISP chip are fused, the first image is in YUV format (for brevity, the following embodiments take the case where the first and second images are both in YUV format as an example). In addition, neither the first image nor the second image is compression-encoded, and the fused image may be encoded to generate an image format that is easily recognized by human eyes and occupies less storage space, such as the jpeg format (jpg format for short), bmp format, tga format, png format, or gif format.
Optionally, the ISP chip may also perform image correction on the third image and the second image before they are fused. Specifically, the ISP chip may transform the coordinate system of the third image into the coordinate system of the second image so that the scenes in the two images are aligned; this may also be understood as aligning the texture details presented by the two images. Optionally, the ISP chip may adjust the two images using preset correction parameters, or adaptively configure the correction parameters according to the current temperature change to complete the image correction.
Alternatively, the first image sensor 101 or the second image sensor 102 may be a CCD image sensor formed by a charge coupled device (charge coupled device, CCD), a CMOS image sensor formed by a complementary metal oxide semiconductor (complementary metal oxide semiconductor, CMOS), or another type of image sensor, which is not limited herein.
In the foregoing embodiment, the image capturing apparatus 10 employs an asymmetric image sensor architecture (i.e., the resolutions of the two image sensors in the image capturing apparatus 10 are different), trading the spatial resolution of the first image sensor, which senses visible light, for color sensitivity (i.e., sensitivity at low illumination). Therefore, the image capturing apparatus 10 can operate better in a low-illuminance environment. On this basis, the optical component 103 is specifically configured to process the incident optical signal such that the energy of the visible light in the first optical signal is greater than the energy of the visible light in the second optical signal, where the first optical signal includes visible light and the second optical signal includes visible light and infrared light. This may also be understood as the optical component 103 splitting the incident optical signal by frequency and by energy. Frequency splitting refers to dividing the incident optical signal according to different frequencies, for example, into visible light and infrared light. The energy is proportional to the square of the amplitude of the light wave, so the energy of the visible light may also be understood as the intensity of the visible light. Accordingly, energy splitting can be understood as using a physical structure such as a coated lens to divide the visible light in the incident optical signal between the first optical signal and the second optical signal, such that the intensity of the visible light in the first optical signal differs from that in the second optical signal, and the intensity ratio between the two is fixed.
It should be noted that the frequency band of the visible light in the first optical signal may be the same as the frequency band of the visible light in the second optical signal, or the two frequency bands may partially overlap, for example, both the first optical signal and the second optical signal contain green light.
In such an embodiment, since the energy of the visible light in the first optical signal differs from the energy of the visible light in the second optical signal, the two image sensors obtain two images with different brightness when the illumination intensity is higher than the preset value (i.e., during daytime). Determining the target image based on two images with different brightness is beneficial to improving the dynamic range. In addition, the ratio between the energy of the visible light in the first optical signal and that in the second optical signal can be flexibly controlled according to practical application requirements. For example, when the first optical signal and the second optical signal contain the same visible light band, the ratio of the energy of the visible light in the first optical signal to that in the second optical signal may be maintained at 9:1, 8:2, 7:3, 6:4, 6.5:3.5, or the like, and is not specifically limited herein.
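The energy split described above is simple arithmetic; the following sketch computes how a fixed ratio divides the incident visible-light energy between the two paths. The total energy figure is illustrative, and the ratio values are the examples from the text.

```python
def split_energy(total_visible: float, ratio=(9, 1)):
    """Divide the visible-light energy of the incident signal between the
    first and second optical signals according to a fixed ratio."""
    s = ratio[0] + ratio[1]
    return total_visible * ratio[0] / s, total_visible * ratio[1] / s

# Hypothetical incident visible-light energy of 100 units, split 6.5:3.5.
e1, e2 = split_energy(100.0, ratio=(6.5, 3.5))  # 65 units and 35 units
```

Whatever ratio the coating is designed for, the ratio e1:e2 stays fixed as the scene brightness varies.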
Further, the foregoing optical component 103 may be implemented as any one of the following:
Fig. 1B shows one implementation of the aforementioned optical assembly 103. The optical assembly 103 includes a beam splitting prism 1031 and a lens 1032. The beam splitting prism 1031, also called a beam splitter, is an optical device that splits an incident optical signal into two beams by refracting and reflecting light rays, by means of one or more thin films plated on the surface of an optical glass. In this embodiment, the beam splitting prism 1031 is used to split the incident optical signal into a first optical signal and a second optical signal, where the first optical signal includes visible light and the second optical signal includes visible light and infrared light. Specifically, a part of the visible light in the incident optical signal passes through the coating layer toward the first image sensor 101, and the remaining visible light and all of the infrared light in the incident optical signal are reflected at the coating layer toward the second image sensor 102. It should be understood that if different optical components select optical glasses with coating layers of different thicknesses and compositions, the ratio between the energy of the visible light in the first optical signal and that in the second optical signal may differ among those optical components. Optionally, the lens 1032 is an infrared confocal lens, which is used to keep visible light and infrared light confocal.
In addition, the optical component 103 is often used in combination with a filter. Specifically, a filter (for example, the filter 1051) may be provided between the beam splitting prism 1031 and the first image sensor 101, and a filter (for example, the filter 1052) may be provided between the beam splitting prism 1031 and the second image sensor 102. Optionally, when the aforementioned filter 1051 is an infrared cut-off filter, infrared light can be prevented from mixing into the first optical signal entering the first image sensor 101. Optionally, when the filter 1052 is an infrared cut-off filter, the infrared light in the second optical signal can be filtered out; when the filter 1052 is a visible light cut-off filter, the visible light in the second optical signal can be filtered out; when the filter 1052 is a white glass slide (the material of which is colorless transparent glass that does not filter light), both the visible light and the infrared light in the second optical signal can pass through, that is, the optical signal in the full frequency band is allowed to pass.
In this embodiment, the redesigned beam splitting prism combined with the optical filters achieves both frequency splitting (of the first and second optical signals, infrared light is kept out of the first optical signal by the infrared cut-off filter, separating the infrared band from the visible band) and energy splitting (the first and second optical signals contain visible light in the same band but with different energies). Visible light and infrared light can thus be directed to the two image sensors, while the proportion of the visible light in the incident optical signal entering each image sensor is controlled. Therefore, the finally output fused image can have an improved dynamic range when the illumination intensity is high (e.g., daytime), and, on the basis of the improved low-illumination performance when the illumination intensity is low (e.g., nighttime), the color of the target image can be made more natural.
Fig. 1C shows another implementation of the aforementioned optical assembly 103.
The optical assembly 103 includes a first lens 1033 for converging a portion of the incident optical signal so that the output optical signal is directed to the first image sensor 101, and a second lens 1034 for converging the remaining portion of the incident optical signal so that the output optical signal is directed to the second image sensor 102. The focal length of the first lens 1033 is the same as that of the second lens 1034. Optionally, the aperture of the first lens 1033 is larger than the aperture of the second lens 1034; therefore, the light flux of the first lens 1033 is larger than that of the second lens 1034, and accordingly the energy of the visible light output by the first lens 1033 is greater than the energy of the visible light output by the second lens 1034. It can also be understood that the optical signals actually passing through the first lens 1033 and the second lens 1034 are different. In addition, for the explanation of the energy of the visible light, reference may be made to the related description of the embodiment corresponding to fig. 1B, which is not repeated here.
Further, a filter (e.g., filter 1051) is disposed between the first lens 1033 and the first image sensor 101, and a filter (e.g., filter 1052) may be disposed between the second lens 1034 and the second image sensor 102. Specifically, the filter 1051 is an infrared cut filter for filtering infrared light in the optical signal from the first lens 1033, so that only visible light but not infrared light is in the first optical signal sensed by the first image sensor 101. Optionally, when the optical filter 1052 is an infrared cut-off filter, infrared light in the second optical signal may be filtered; when the optical filter 1052 is a visible light cut-off filter, visible light in the second optical signal may be filtered out; when the optical filter 1052 is a white glass (the material of the white glass is colorless transparent glass, and light is not filtered), the visible light and the infrared light in the second optical signal can pass through, that is, the optical signal in the full wave band is allowed to pass through the white glass. In addition, the second lens 1034 is an infrared confocal lens.
In this embodiment, two lenses with the same focal length but different aperture sizes form a binocular lens to achieve energy splitting (different light energies are directed to different sensors). Since the size of the aperture determines the intensity of light passing through the lens, apertures of different sizes make the energies of the visible light entering the two image sensors different. Combined with the asymmetric image sensor architecture, and with different optical filters adopted under different illumination intensities, the finally output fused image can have an improved dynamic range when the illumination intensity is high (e.g., daytime), and the color of the target image can be made more natural when the illumination intensity is low (e.g., nighttime).
In practical applications, the optical component 103 in fig. 1A and the optical components related to the foregoing may be implemented as shown in fig. 1B, or may be implemented as shown in fig. 1C, which may be specifically selected according to the application scenario, and is not limited herein.
It should be understood that the two image sensors in the foregoing image pickup apparatus may differ in photosensitivity in addition to resolution. The cases are described separately below:
Fig. 2A shows an embodiment of the foregoing imaging device. In this embodiment, the first image sensor is a color image sensor, and the second image sensor is a black-and-white image sensor. Since the resolution of the first image sensor is smaller than that of the second image sensor, the first image sensor is hereinafter referred to as the low-resolution color image sensor, and the second image sensor as the high-resolution black-and-white image sensor. Optionally, the low-resolution color image sensor may be a bayer format image sensor (Bayer image sensor) or a color image sensor of another format; the high-resolution black-and-white image sensor may be a MONO format image sensor (Mono image sensor) or a black-and-white image sensor of another format, which is not limited herein. An infrared cut-off filter is arranged between the low-resolution color image sensor and the optical component, and a dual optical filter is arranged between the high-resolution black-and-white image sensor and the optical component. The dual optical filter is also known as an IR-CUT auto-switching filter. The IR-CUT auto-switching filter is provided with, or connected to, a photosensitive device, and the photosensitive device transmits the sensed illumination intensity to the image pickup device. When a change in illumination intensity is detected, the image pickup device (specifically, the filter control chip in the image pickup device) controls the IR-CUT auto-switching filter to switch automatically. For example, when the illumination intensity is greater than a preset value (e.g., daytime), it switches to the infrared cut-off filter, and when the illumination intensity is less than the preset value (e.g., nighttime), it switches to the visible light cut-off filter. Alternatively, the dual optical filter may be switched to a white glass slide, allowing both visible light and infrared light to pass through.
Alternatively, the dual optical filter may be replaced with an infrared cut filter, and the image pickup device controls the enabling and disabling of the infrared cut filter.
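The filter-switching decision described above can be sketched as follows. The lux threshold, the state names, and the `allow_clear` flag are all hypothetical; the patent only speaks of a "preset value" and of the three possible filter positions.

```python
from enum import Enum

class FilterState(Enum):
    IR_CUT = "infrared cut-off filter"     # high illumination (daytime)
    VISIBLE_CUT = "visible light cut-off"  # low illumination (nighttime)
    CLEAR_GLASS = "white glass slide"      # pass the full frequency band

# Hypothetical preset illumination threshold, in lux.
PRESET_LUX = 10.0

def select_filter(measured_lux: float, allow_clear: bool = False) -> FilterState:
    """Sketch of the filter control chip's decision for the dual optical
    filter in front of the high-resolution black-and-white image sensor."""
    if measured_lux > PRESET_LUX:
        return FilterState.IR_CUT
    # Below the preset value, either cut visible light or pass everything,
    # depending on the variant of the embodiment.
    return FilterState.CLEAR_GLASS if allow_clear else FilterState.VISIBLE_CUT
```

In the fig. 2B variant the low-light position is the white glass slide, so both visible and infrared light reach the black-and-white sensor.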
The case in which the illumination intensity is lower than the preset value is described with reference to fig. 2B.
The low-resolution color image sensor senses visible light in the first optical signal and outputs a low-resolution color image. Wherein the low resolution color image includes first color information and first luminance information. The first color information is used for indicating the color of the low-resolution color image, and the first brightness information is used for indicating the brightness of the low-resolution color image. The present embodiment is not limited to the specific form of the first luminance information and the first color information.
The dual optical filter arranged between the high-resolution black-and-white image sensor and the optical component is switched to the white glass slide, and the high-resolution black-and-white image sensor senses the visible light and infrared light in the second optical signal and outputs a high-resolution gray-scale image. The high-resolution gray-scale image includes second luminance information for indicating the luminance presented by the high-resolution gray-scale image. Optionally, the second luminance information may be represented by a luminance component Y. The embodiment of the present application does not limit the specific form of the second luminance information. The image directly generated by an image sensor is in RAW format, and RAW formats are further classified into various types according to the design of the image sensor, for example, bayer RGGB, RYYB, RCCC, RCCB, RGBW, CMYW, or other formats. Using the ISP chip, RAW images of various formats can be converted into RGB format images; RAW format images may also be converted into YUV format images, HSV format images, Lab format images, CMY format images, or YCbCr format images. For example, the ISP chip converts a RAW format image into an RGB format image and then converts the RGB format image into a YUV format image. The ISP chip in the image pickup apparatus may perform basic image processing, such as 3D noise reduction, demosaicing, brightness correction, and color correction, on the low-resolution color image and the high-resolution gray-scale image.
In addition, the image processor adjusts the low resolution color image to a high resolution color image having the same resolution as the high resolution gray scale image using an up-sampling algorithm or a super resolution algorithm. Then, the image processor fuses the high-resolution color image and the high-resolution gray scale image to obtain a target image.
Taking fig. 2B as an example, the low-resolution RGB format image 201 is converted into a low-resolution YUV format image 202, and then the low-resolution YUV format image 202 is up-sampled to obtain a high-resolution YUV format image 203. Then, the high-resolution YUV format image 203 and the high-resolution gray scale image 204 having only the Y component are fused, resulting in a high-resolution YUV format image 205 (i.e., a target image).
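The fusion step of fig. 2B can be sketched on the image planes as follows. The blending weight and pixel values are illustrative; the actual fusion algorithm used by the image processor is not disclosed in the text and would be considerably more elaborate.

```python
import numpy as np

def fuse_yuv(color_yuv: np.ndarray, gray_y: np.ndarray, w: float = 0.5) -> np.ndarray:
    """Toy fusion of the upsampled colour image with the grey-scale image.

    color_yuv: H x W x 3 float array (the high-resolution YUV "third image").
    gray_y:    H x W float array (the Y-only high-resolution gray-scale image).
    The luminance planes are blended with weight w (an assumed, illustrative
    scheme); the colour image's U/V planes are kept unchanged.
    """
    fused = color_yuv.copy()
    fused[..., 0] = w * color_yuv[..., 0] + (1.0 - w) * gray_y
    return fused

# Illustrative 2x2 planes: colour image Y=100, U=5, V=7; gray image Y=200.
color = np.zeros((2, 2, 3))
color[..., 0], color[..., 1], color[..., 2] = 100.0, 5.0, 7.0
gray = np.full((2, 2), 200.0)
out = fuse_yuv(color, gray)  # fused Y = 150, U/V unchanged
```

The fused image thus takes its brightness partly from the better-exposed gray-scale image while retaining the color of the upsampled color image, matching the intent of image 205 in fig. 2B.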
It should be understood that during actual image processing, images in other formats may appear in addition to those listed in fig. 2B, and are not limited in this regard.
It should be appreciated that, as to the aforementioned Y component, in the YUV format Y represents luminance (luma) and U and V represent chrominance (chroma). YUV formats mainly include YUV420, YUV422, and YUV444. YUV444 means that each Y component has its own set of U/V components; YUV422 means that every two Y components share a set of U/V components; YUV420 means that every four Y components share a set of U/V components. Although fig. 2B takes YUV420 as an example, in practical applications the format of the image may be adjusted according to specific requirements, which is not limited herein. In addition, although the YUV format and the RGB format are different color coding modes, changing the coding format does not affect the color presented by the image.
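The sharing rules above translate directly into per-frame sample counts, sketched below for the three formats (frame dimensions are illustrative).

```python
def yuv_plane_sizes(width: int, height: int, fmt: str):
    """Return (number of Y samples, number of U samples) per frame;
    the V plane always has the same number of samples as the U plane."""
    y = width * height
    if fmt == "YUV444":
        return y, y        # one U/V pair per Y sample
    if fmt == "YUV422":
        return y, y // 2   # one U/V pair shared by two Y samples
    if fmt == "YUV420":
        return y, y // 4   # one U/V pair shared by four Y samples
    raise ValueError(f"unknown format: {fmt}")

# For a hypothetical 1920x1080 frame, YUV420 carries a quarter as many
# chroma samples as luma samples.
sizes_420 = yuv_plane_sizes(1920, 1080, "YUV420")
```

This quarter-size chroma plane is what makes the 1:4 resolution ratio in the next passage line up so neatly.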
In this embodiment, it is proposed that when the illumination intensity is low, the high-resolution black-and-white image sensor senses infrared light and a part of the visible light at the same time, instead of sensing only infrared light or only visible light. Thus, the brightness of the second image output when the second image sensor senses both infrared light and visible light is greater than the brightness of the second image output when it senses only infrared light. Therefore, the quality of the second image can be improved, and thus the quality of the target image can be improved.
The case in which the illumination intensity is higher than the preset value is described with reference to fig. 2C.
The low-resolution color image sensor senses visible light in the first optical signal and outputs a low-resolution color image. Wherein the low resolution color image includes first color information and first luminance information. The first color information is used for indicating the color of the low-resolution color image, and the first brightness information is used for indicating the brightness of the low-resolution color image. The embodiment of the application is not limited to the specific form of the first brightness information and the first color information. Optionally, when converting the low resolution color image into YUV format, the first color information is a U/V component and the first luminance information is a Y component.
The dual optical filter arranged between the high-resolution black-and-white image sensor and the optical component is switched to an infrared cut-off filter, and the infrared cut-off filter is used for filtering infrared light in the second optical signal. Therefore, the high-resolution black-and-white image sensor senses the visible light in the second optical signal and outputs a high-resolution gray-scale image. Wherein the high resolution gray scale image includes second luminance information for indicating a luminance of the high resolution gray scale image presentation. Alternatively, the second luminance information may be represented by a luminance component Y. Since the high resolution black and white image sensor cannot record color, the second image only exhibits brightness and cannot exhibit color. Thus, the second image has only the luminance component Y and no chrominance component U/V. It should be understood that when the formats of the image sensors are different, the formats of the output images will be different, and detailed descriptions thereof are omitted herein.
In this embodiment, when the brightness presented by the first brightness information and the brightness presented by the second brightness information are made to be the same by controlling the exposure time and the gain, the image processor may combine the first color information of the low resolution color image with the second brightness information of the high resolution gray scale image to obtain the target image. Wherein the color of the target image is determined by the first color information, and the brightness of the target image is determined by the second brightness information. Specifically, the image processor may combine the color component (i.e., the U/V component) of the aforementioned low-resolution color image with the luminance component (i.e., the Y component) of the high-resolution gray-scale image to obtain the target image. In such an embodiment, a higher quality target image can be obtained without employing a complex fusion algorithm, and the data throughput of the image processor can be reduced.
Optionally, the ratio of the resolution of the color image sensor to the resolution of the black-and-white image sensor is 1:4, and the ratio of the resolution of the low resolution color image (i.e., the first image) to the resolution of the high resolution grayscale image (i.e., the second image) is 1:4. Alternatively, the low resolution color image and the high resolution gray scale image are both represented in YUV format. Optionally, the low-resolution color image adopts a YUV444 format, and the high-resolution gray-scale image adopts a YUV420 format. At this time, the ratio of the number of Y components of the high-resolution gray-scale image to the number of U/V components of the low-resolution color image is 4:1. Thus, the image processor may output the target image in YUV420 format. Taking fig. 2C as an example, the low resolution RGB format image 211 is converted into a low resolution YUV444 format image 212. Then, the U/V component in the low resolution YUV444 format image 212 and the Y component in the high resolution gray scale image 213 are combined to obtain the high resolution YUV420 format image 214 (i.e., the target image).
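With the 1:4 resolution ratio described above, the YUV444 chroma planes of the low-resolution color image already have exactly the size that YUV420 expects for the high-resolution target, so the combination step reduces to plane reassembly. The sketch below assumes planar arrays; the device's internal representation is not specified.

```python
import numpy as np

def merge_to_yuv420(y_high: np.ndarray, u_low: np.ndarray, v_low: np.ndarray):
    """Combine the high-resolution Y plane (from the gray-scale image) with
    the low-resolution U/V planes (from the YUV444 colour image) into a
    YUV420 target image. No chroma resampling is needed at a 1:4 ratio."""
    h, w = y_high.shape
    if u_low.shape != (h // 2, w // 2) or v_low.shape != (h // 2, w // 2):
        raise ValueError("chroma planes must be half-size in each dimension")
    return {"Y": y_high, "U": u_low, "V": v_low}

# Illustrative planes: 4x4 luma from the gray-scale image,
# 2x2 chroma from the low-resolution colour image.
target = merge_to_yuv420(np.zeros((4, 4)), np.ones((2, 2)), np.ones((2, 2)))
```

This is the "combine rather than fuse" path of fig. 2C: brightness comes entirely from the second image, color entirely from the first, with no blending arithmetic required.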
It should be appreciated that the ratio of the resolution of the low-resolution color image sensor to the resolution of the high-resolution black-and-white image sensor may be another value, such as 1:2 or 1:16, which is not limited herein. When the ratio of the resolutions of the two image sensors differs, the image formats adopted in the image fusion process are adaptively adjusted, so that the calculation amount for outputting the target image is reduced while a target image of better quality is output.
Optionally, the ratio of the visible light in the first optical signal to the visible light in the second optical signal in the present embodiment is 3:2. It will also be appreciated that the foregoing optical assembly splits both spectrally and energetically, with 60% of the visible light in the incident light signal striking the low resolution color image sensor and 40% of the visible light and 100% of the infrared light in the incident light signal striking the high resolution black and white image sensor. Of course, due to the filter, the optical signals actually entering the two image sensors can be further adjusted.
In this embodiment, it is proposed that the first image sensor is a low-resolution color image sensor and the second image sensor is a high-resolution black-and-white image sensor. First, the asymmetric image sensor architecture can reduce the lower limit of the operating illuminance of the image capturing device. Second, compared with the color filter matrix of a color image sensor of the same specification, the filter matrix of the black-and-white image sensor has higher light transmittance and higher photoelectric conversion efficiency. Therefore, the brightness of the high-resolution gray-scale image output by the high-resolution black-and-white image sensor (namely the brightness indicated by the second luminance information) can be improved, and thus the quality of the target image can be improved. As a result, the image pickup apparatus can operate in an environment with even lower illumination intensity.
Fig. 3A shows another embodiment of the image capturing apparatus. In this embodiment, the first image sensor and the second image sensor are both color image sensors. Optionally, each color image sensor may be a bayer format image sensor (Bayer image sensor) or a color image sensor of another format. Since the resolution of the first image sensor is smaller than that of the second image sensor, the first image sensor is hereinafter referred to as the low-resolution color image sensor, and the second image sensor as the high-resolution color image sensor. An infrared cut-off filter is arranged between the low-resolution color image sensor and the optical component, and a dual optical filter is arranged between the high-resolution color image sensor and the optical component. The dual optical filter is described in the embodiment corresponding to fig. 2A, and is not described again here.
The case in which the illumination intensity is lower than the preset value is described with reference to fig. 3B.
The low-resolution color image sensor senses visible light in the first optical signal and outputs a low-resolution color image. Wherein the low resolution color image includes first color information and first luminance information. The first color information is used for indicating the color of the low-resolution color image, and the first brightness information is used for indicating the brightness of the low-resolution color image. The embodiment of the application is not limited to the specific form of the first brightness information and the first color information. When converting the low resolution color image into YUV format, the first color information is U/V component and the first brightness information is Y component.
The dual-light filter arranged between the high-resolution color image sensor and the optical component is switched to a visible light cut-off filter, and the high-resolution color image sensor senses infrared light in the second light signal and outputs a high-resolution gray scale image. Wherein the high resolution gray scale image includes second luminance information for indicating a luminance of the high resolution gray scale image presentation. The second luminance information may be represented by a luminance component Y. It should be appreciated that while the high resolution color image sensor may record color, the high resolution color image sensor senses only infrared light and no visible light, so the second image only exhibits brightness and no color. Thus, the second image has only the luminance component Y and no chrominance component U/V.
The ISP chip in the image pickup apparatus may perform ISP processing, such as 3D noise reduction, demosaicing, brightness correction, and color correction, on the low-resolution color image and the high-resolution gray-scale image, respectively. Optionally, the ISP chip may also adjust the formats of the foregoing images, for example, from bayer format to YUV format, which is not limited herein.
In addition, the image processor adjusts the low resolution color image to a high resolution color image having the same resolution as the high resolution gray scale image using an up-sampling algorithm or a super resolution algorithm. Then, the image processor fuses the high-resolution color image and the high-resolution gray scale image to obtain a target image.
Taking fig. 3B as an example, the low-resolution RGB format image 301 is converted into a low-resolution YUV format image 302, and then the low-resolution YUV format image 302 is up-sampled to obtain a high-resolution YUV format image 303. Then, the high-resolution YUV format image 303 and the gray-scale image 304 having only the Y component are fused, resulting in a high-resolution YUV format image 305 (i.e., a target image).
It should be understood that during actual image processing, images in other formats may appear in addition to those listed in fig. 3B, and are not limited in this regard.
In this embodiment, it is proposed that the high-resolution color image sensor only senses the infrared light in the second light signal when the illumination intensity is low, and generates a high-resolution gray-scale image with only brightness. The low-resolution color image and the high-resolution gray level image are fused, so that the advantages of the two images can be maintained, and the quality of the target image is improved.
The case in which the illumination intensity is higher than the preset value is described with reference to fig. 3C.
The low-resolution color image sensor senses visible light in the first optical signal and outputs a low-resolution color image, which includes first color information and first luminance information. Specifically, this is similar to the case where the illumination intensity is lower than the preset value, and the description is not repeated here.
The dual filter arranged between the high-resolution color image sensor and the optical component is switched to the infrared cut-off filter, which filters out the infrared light in the second optical signal. Accordingly, the high-resolution color image sensor senses the visible light in the second optical signal and outputs a high-resolution color image. In this case, the high-resolution color image (i.e., the aforementioned second image) includes not only the second luminance information but also the second color information: the second color information indicates the color of the high-resolution color image, and the second luminance information indicates its brightness. The embodiment of the application does not limit the specific forms of the second luminance information and the second color information. Optionally, when the high-resolution color image is converted to the YUV format, the second color information is the U/V components and the second luminance information is the Y component.
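The filter-switching behavior described above can be sketched as a simple control rule. This is a hypothetical helper: the function and constant names, the lux unit, and the threshold comparison are assumptions — the text only distinguishes illumination "higher" and "lower" than a preset value.

```python
IR_CUT = "ir_cut"   # passes visible light only
DUAL = "dual"       # passes visible and infrared light

def select_filter(illumination_lux, preset_lux):
    """Choose the filter state in front of the high-resolution sensor
    (hypothetical names; thresholds are illustrative only)."""
    if illumination_lux > preset_lux:
        # Bright scene: cut infrared so the sensor outputs a color image.
        return IR_CUT
    # Dim scene: pass infrared so the sensor contributes luminance
    # information (a visible-light cut-off filter may instead be used,
    # as in the variant of claim 9).
    return DUAL
```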
The ISP chip in the image pickup apparatus may perform ISP processing, such as 3D noise reduction, demosaicing, luminance correction, and color correction, on the low-resolution color image and the high-resolution color image. Optionally, the ISP chip may also adjust the formats of the foregoing images, for example, converting the Bayer format to the YUV format, which is not limited herein.
In addition, the image processor adjusts the low-resolution color image into a high-resolution color image using an up-sampling algorithm or a super-resolution algorithm, so that the two high-resolution color images have the same resolution. The image processor then fuses the two high-resolution color images to obtain a target image with an improved dynamic range.
Taking fig. 3C as an example, the low-resolution RGB-format image 311 is converted into a low-resolution YUV-format image 312, and then the low-resolution YUV-format image 312 is up-sampled to obtain a high-resolution YUV-format image 313. At the same time, the high-resolution RGB-format image 314 is converted into a high-resolution YUV-format image 315. Then, the high-resolution YUV-format image 313 and the high-resolution YUV-format image 315 are fused to obtain the high-resolution YUV-format image 316 (i.e., the target image). The target image combines the advantages of the two images, improving both image quality and dynamic range. Although fig. 3B and fig. 3C use the YUV420 format as an example, in practical applications the image format may be adjusted according to specific requirements, which is not limited herein. In addition, although the YUV format and the RGB format are different color coding modes, a change of coding format does not affect the colors the image presents.
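The fig. 3C fusion step can be sketched as follows. A plain weighted average is an assumption standing in for the unspecified fusion algorithm, and the function and parameter names are hypothetical; real dynamic-range fusion would typically use spatially varying weights.

```python
import numpy as np

def fuse_hdr(yuv_bright, yuv_dim, w=0.5):
    """Blend two same-resolution YUV images whose sensors received
    different shares of the visible-light energy (images 313 and 315).
    `w` weights the brighter exposure; both inputs must share a shape."""
    return w * yuv_bright + (1.0 - w) * yuv_dim
```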
Optionally, the ratio of the visible light in the first optical signal to the visible light in the second optical signal in this embodiment is 4:1. In other words, the foregoing optical assembly splits the incident light both spectrally and energetically: 80% of the visible light in the incident light signal strikes the low-resolution color image sensor, while 20% of the visible light and 100% of the infrared light strike the high-resolution color image sensor. The optical signals that actually reach the two image sensors are further adjusted by the optical filters.
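The 4:1 split can be checked with a small arithmetic sketch. The function name and energy units are hypothetical; the 80%/20% figures come from the text, and the effect of the downstream filters is deliberately left out.

```python
def split_energies(visible, infrared, visible_ratio=0.8):
    """Energy reaching each sensor under the 4:1 visible-light split:
    80% of the visible light goes to the low-resolution sensor, and
    20% of the visible light plus 100% of the infrared light goes to
    the high-resolution sensor (before any filtering)."""
    to_low_res = visible * visible_ratio
    to_high_res = visible * (1.0 - visible_ratio) + infrared
    return to_low_res, to_high_res
```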
In this embodiment, both image sensors are color image sensors, so when the illumination intensity is greater than the preset value the high-resolution color image sensor can output a color image, unlike the aforementioned high-resolution black-and-white image sensor. Because the two sensors receive different amounts of visible-light energy, the brightness of the output low-resolution color image and high-resolution color image also differs. Fusing the two images improves the dynamic range and makes the target image more realistic.
The present application also proposes an image processing method for performing the functions of the image processor in the foregoing embodiments, for example: adjusting the resolution of the first image to be the same as the resolution of the second image; and generating a target image based on the first image and the second image.
It should be understood that, in the embodiments of the present application, the same reference numerals in different drawings may be regarded as the same object; unless otherwise specified, the explanation of a reference numeral in an earlier figure also applies to the same numeral in later figures. It will be clear to those skilled in the art that, for convenience and brevity of description, for the specific working procedures of the above-described systems, apparatuses, and units, reference may be made to the corresponding procedures in the foregoing method embodiments, and details are not repeated herein.
The above embodiments are only intended to illustrate the technical solutions of the present application, not to limit them. Although the application has been described in detail with reference to the foregoing embodiments, those skilled in the art will understand that the technical solutions described in the foregoing embodiments may still be modified, or some of their technical features may be equivalently substituted, without departing from the spirit and scope of the technical solutions of the embodiments of the present application.

Claims (13)

1. An image pickup apparatus, comprising:
an optical component for receiving an incident optical signal and processing the incident optical signal into a first optical signal and a second optical signal;
a first image sensor for sensing the first optical signal to generate a first image, the image information of the first image comprising first color information and first brightness information;
a second image sensor for sensing the second optical signal to generate a second image, the image information of the second image comprising second brightness information, the resolution of the first image sensor being smaller than the resolution of the second image sensor, and the resolution of the second image being greater than the resolution of the first image; and
an image processor for generating a target image based on the first image and the second image, the color and brightness of the target image being determined by the image information of the first image and the image information of the second image.
2. The image capturing apparatus according to claim 1, wherein the image processor is specifically configured to:
adjust the resolution of the first image to be the same as the resolution of the second image to obtain a third image, wherein the third image carries the first color information and the first brightness information; and
fuse the third image with the second image to obtain the target image.
3. The image capturing apparatus according to claim 1 or 2, wherein the first optical signal includes visible light, the second optical signal includes visible light and infrared light, energy of the visible light in the first optical signal is greater than energy of the visible light in the second optical signal, and a frequency band of the visible light in the first optical signal is the same as a frequency band of the visible light in the second optical signal.
4. The image capturing device of claim 3, wherein the first image sensor is a color image sensor and the second image sensor is a black-and-white image sensor.
5. The image pickup apparatus according to claim 4, further comprising an infrared cut filter;
the image pickup apparatus is further configured to enable the infrared cut-off filter when the illumination intensity is higher than a preset value, wherein the infrared cut-off filter is located between the optical assembly and the second image sensor and is used for filtering out the infrared light in the second optical signal;
The second image sensor is specifically configured to sense visible light in the second optical signal, and generate the second image;
The image processor is specifically configured to combine first color information of the first image with second brightness information of the second image to obtain the target image, where a color of the target image is determined by the first color information, and a brightness of the target image is determined by the second brightness information.
6. The image capturing apparatus according to claim 4, wherein the second image sensor is configured to generate the second image by sensing visible light and infrared light in the second light signal when the illumination intensity is lower than a preset value;
the image processor is specifically configured to fuse the third image with the second image to obtain the target image.
7. The image capturing device of claim 3, wherein the first image sensor and the second image sensor are both color image sensors.
8. The image pickup apparatus according to claim 7, further comprising an infrared cut filter;
the image pickup apparatus is further configured to enable the infrared cut-off filter when the illumination intensity is higher than a preset value, wherein the infrared cut-off filter is located between the optical assembly and the second image sensor and is used for filtering out the infrared light in the second optical signal;
the second image sensor is specifically configured to sense visible light in the second optical signal, generate the second image, and the image information of the second image further includes second color information;
the image processor is specifically configured to fuse the third image with the second image to obtain the target image.
9. The image pickup apparatus according to claim 7, further comprising a visible light cut-off filter;
the image pickup apparatus is further configured to enable the visible light cut-off filter when the illumination intensity is lower than a preset value, wherein the visible light cut-off filter is located between the optical assembly and the second image sensor and is used for filtering out the visible light in the second optical signal;
the second image sensor is specifically configured to sense infrared light in the second optical signal, and generate the second image;
the image processor is specifically configured to fuse the third image with the second image to obtain the target image.
10. The image pickup apparatus according to any one of claims 1 to 9, wherein the optical assembly includes a lens and a beam-splitting prism, the beam-splitting prism being located between the lens and an image sensor;
The lens is used for receiving the incident light signal;
the beam splitter prism is used for splitting the incident light signal received by the lens into the first light signal and the second light signal.
11. The image pickup apparatus according to claim 10, wherein the lens is an infrared confocal lens.
12. The image pickup apparatus according to any one of claims 1 to 9, wherein the image pickup apparatus further comprises an infrared cut filter;
the optical assembly comprises a first lens and a second lens, the first lens and the second lens are used for jointly receiving the incident light signals, the focal length of the first lens is the same as that of the second lens, the aperture of the first lens is larger than that of the second lens, the infrared cut-off filter is arranged between the first lens and the first image sensor, and the second lens is an infrared confocal lens;
the first lens is used for receiving a part of the incident light signals and transmitting the received light signals to the infrared cut-off filter;
the infrared cut-off filter is used for filtering infrared light in the optical signals from the first lens to obtain the first optical signals and transmitting the first optical signals to the first image sensor;
The second lens is configured to receive a remaining portion of the incident optical signal, and transmit the received optical signal as a second optical signal to the second image sensor.
13. The image pickup apparatus according to claim 1 or 2, wherein,
The format of the first image is YUV format;
the format of the second image is YUV format.

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
CN202080000505.1A Division CN113728618A (en) 2020-03-20 2020-03-20 Camera device

Publications (1)

Publication Number Publication Date
CN118264914A 2024-06-28


