CN113728618A - Camera device - Google Patents


Info

Publication number
CN113728618A
CN113728618A (application CN202080000505.1A)
Authority
CN
China
Prior art keywords
image
image sensor
optical signal
resolution
lens
Prior art date
Legal status
Pending
Application number
CN202080000505.1A
Other languages
Chinese (zh)
Inventor
黄进新
汪鹏程
刘军
Current Assignee
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd filed Critical Huawei Technologies Co Ltd
Publication of CN113728618A publication Critical patent/CN113728618A/en
Pending legal-status Critical Current

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70Circuitry for compensating brightness variation in the scene

Abstract

An embodiment of the application discloses an image pickup device that uses an asymmetric pair of image sensors: a low-resolution image sensor generates a color image, and a high-resolution image sensor generates a color image or a grayscale image. Color information and brightness information are acquired from the images generated by the two sensors and fused to produce a target image, improving the imaging quality of the camera in low-illumination environments.

Description

Camera device Technical Field
The embodiment of the application relates to the field of security monitoring, in particular to a camera device.
Background
A low-illumination scene is a scene with insufficient light, such as outdoors at night or an indoor space without adequate lighting. In the field of security monitoring, cameras are often paired with a visible-light or infrared fill light in order to capture sharp, richly colored images in low-illumination scenes. However, a visible-light fill light tends to cause light pollution and defeats covert monitoring, while an infrared fill light yields sharp images but cannot record color. In recent years, the industry has widely adopted a dual-light fusion architecture, in which the camera uses two sensors to image infrared light and visible light separately and then fuses the infrared image with the visible-light image to improve the camera's imaging capability under low illumination.
Specifically, a beam-splitting prism inside the camera separates incident light into visible light and infrared light by spectrum. The camera then directs the visible light and the infrared light to two identical image sensors. The sensor receiving visible light outputs a color image, the sensor receiving infrared light outputs a grayscale image, and the two images have the same size and resolution. The camera fuses the color image and the grayscale image into a target image whose details and textures come mainly from the grayscale image and whose color information comes from the color image.
In the foregoing solution, when the ambient illumination falls below the lower limit of the visible-light sensor's imaging signal-to-noise ratio, the color information of the color image is drowned out by noise, so the color saturation of the fused output is very low and may even degenerate to a grayscale image. Under the current dual-light fusion architecture, it is therefore important to further lower the camera's working-illumination limit.
Disclosure of Invention
The embodiment of the application provides an image pickup device, which is used for outputting a high-quality image in a low-illumination environment and reducing the lower limit of the working illumination of the image pickup device.
In a first aspect, an embodiment of the present application provides an image pickup apparatus comprising an optical assembly, a first image sensor, a second image sensor, and an image processor. The resolution of the first image sensor is lower than that of the second image sensor. The optical assembly receives an incident light signal and processes it into a first light signal and a second light signal. The first image sensor senses the first light signal to generate a first image; meanwhile, the second image sensor senses the second light signal to generate a second image. The image information of the first image includes first color information and first brightness information; the image information of the second image includes second brightness information. Furthermore, the resolution of the first image is lower than that of the second image. The image processor generates a target image based on the first image and the second image; the color and brightness of the target image are determined by the image information of both images.
In this embodiment, the image pickup apparatus uses two image sensors with different resolutions. For a fixed photosensitive area, image resolution is inversely related to color sensitivity and positively related to image sharpness. The first image, output by the low-resolution first sensor, therefore has higher color sensitivity, ensuring faithful color; the high-resolution second sensor has more pixels, so the second image is sharper and captures rich detail and texture. A target image generated from these two images retains the advantages of both, allowing the apparatus to operate in environments with lower illumination intensity.
According to the first aspect, in a first implementation manner of the first aspect of the embodiments of the present application, the image processor is specifically configured to: adjusting the resolution of the first image to be the same as that of the second image to obtain a third image; and then, fusing the third image and the second image to obtain a target image. Wherein the third image carries the first color information and the first brightness information.
In this implementation, since the resolution of the first image is lower than that of the second image, adjusting the first image to match the second means raising its resolution to that of the second image, which yields the third image. Because the first image has high color sensitivity, faithful color, and high brightness, the third image inherits these properties. The target image determined from the third and second images therefore retains the advantages of both, allowing the apparatus to operate under lower illumination.
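The resolution-matching step above can be sketched as follows. The patent leaves the algorithm open (upsampling or super-resolution are mentioned later), so this minimal sketch uses nearest-neighbor upsampling with NumPy; the function name and array sizes are illustrative, not from the patent.

```python
import numpy as np

def upsample_nearest(image: np.ndarray, target_h: int, target_w: int) -> np.ndarray:
    """Nearest-neighbor upsampling: map each target pixel back to a source pixel.

    `image` is H x W x C. This is the simplest way to raise the first
    image's resolution to that of the second image; real pipelines may use
    bilinear upsampling or super-resolution instead.
    """
    h, w = image.shape[:2]
    rows = np.arange(target_h) * h // target_h  # source row for each target row
    cols = np.arange(target_w) * w // target_w  # source column for each target column
    return image[rows][:, cols]

# A 2x2 "first image" raised to 4x4 to match the second sensor's resolution,
# producing the "third image".
first_image = np.arange(12, dtype=np.uint8).reshape(2, 2, 3)
third_image = upsample_nearest(first_image, 4, 4)
```

Each pixel of the third image copies the color and brightness of its nearest source pixel, so the color information of the first image is preserved.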
According to the first aspect or the first implementation manner of the first aspect, in a second implementation manner of the first aspect of the present application, the first optical signal includes visible light, the second optical signal includes visible light and infrared light, the energy of the visible light in the first optical signal is greater than the energy of the visible light in the second optical signal, and the frequency band of the visible light in the first optical signal is the same as the frequency band of the visible light in the second optical signal.
The ratio between the energy of the visible light in the first optical signal and the energy of the visible light in the second optical signal can be flexibly controlled according to the actual application requirements.
This embodiment proposes that the optical assembly split the incident light signal not only by frequency but also by energy. Energy splitting divides the visible light in the incident signal so that the intensity of visible light in the first light signal differs from that in the second. Because these energies differ, the two image sensors produce images of different brightness when the illumination intensity is above a preset value (i.e., in daytime), and determining the target image from two images of different brightness helps improve the dynamic range.
According to the second implementation of the first aspect, in a third implementation of the first aspect of the present application, the first image sensor is a color image sensor and the second image sensor is a black-and-white image sensor.
In this embodiment, it is proposed that the second image sensor is a black-and-white image sensor. Because the light transmittance of the color filter matrix of the black-and-white image sensor is higher than that of the color filter matrix of the color image sensor with the same specification, the photoelectric conversion efficiency is higher. Therefore, the brightness of the second image output by the second image sensor can be improved, and the quality of the target image can be improved. Therefore, the image pickup apparatus can be operated in an environment with lower light intensity.
According to the third implementation of the first aspect, in a fourth implementation of the first aspect of the present application, the imaging apparatus further includes an infrared cut filter. The apparatus enables the infrared cut filter when the illumination intensity is above a preset value; the filter is located between the optical assembly and the second image sensor and removes the infrared light from the second optical signal. The second image sensor is then specifically configured to sense the visible light in the second optical signal and generate the second image. The image processor is specifically configured to combine first color information of the first image with second luminance information of the second image to obtain the target image, whose color is determined by the first color information and whose luminance is determined by the second luminance information.
This embodiment uses the infrared cut filter to remove infrared light from the second optical signal under high illumination, eliminating its influence on the second image sensor while it senses visible light. Because the second image sensor is a black-and-white sensor that here senses only the visible light in the second light signal, the second image carries brightness but no color. Although the first and second images differ in resolution, the first provides rich color (the first color information) and the second provides high-luminance texture detail (the second luminance information). In some cases, some or all of the first color information and some or all of the second luminance information can therefore be combined into the target image, for example when the brightness indicated by the first brightness information is comparable to that indicated by the second brightness information.
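Once both images are at the same resolution, combining the color information of one with the luminance of the other is straightforward in a YUV representation (the format the later embodiments use): take the Y plane from the high-resolution monochrome image and the U/V planes from the upsampled color image. This is a minimal sketch; the function name and pixel values are illustrative.

```python
import numpy as np

def combine_color_and_luma(color_yuv: np.ndarray, mono_y: np.ndarray) -> np.ndarray:
    """Take luminance (Y) from the high-resolution monochrome image and
    chrominance (U, V) from the resolution-matched color image.

    color_yuv: H x W x 3 YUV image (already upsampled to the target size).
    mono_y:    H x W luminance plane from the black-and-white sensor.
    """
    target = color_yuv.copy()
    target[..., 0] = mono_y  # replace the Y plane; keep U and V unchanged
    return target

# Illustrative planes: a flat-chroma color image and a brighter mono luma plane.
color = np.zeros((4, 4, 3), dtype=np.uint8)
color[..., 1] = 100  # U
color[..., 2] = 200  # V
luma = np.full((4, 4), 250, dtype=np.uint8)
fused = combine_color_and_luma(color, luma)
```

The result has the texture detail and brightness of the monochrome image and the color of the color image, matching the split of roles described above.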
According to a third implementation manner of the first aspect, in a fifth implementation manner of the first aspect of the embodiments of the present application, when the illumination intensity is lower than the preset value, the second image sensor is specifically configured to sense the visible light and the infrared light in the second light signal, and generate the second image. The image processor is specifically configured to fuse the third image and the second image to obtain the target image.
In this embodiment, it is proposed that the second image sensor simultaneously senses infrared light and a part of visible light when the light intensity is low. Since the second image sensor is a black and white image sensor, the second image output by the second image sensor has only brightness. However, since the second image sensor senses visible light in addition to infrared light, the luminance of the second image output by the second image sensor sensing infrared light and visible light is greater than the luminance of the second image output by the second image sensor sensing only infrared light. Therefore, the quality of the second image can be improved, and the quality of the target image can be improved.
According to the second implementation of the first aspect, in a sixth implementation of the first aspect of the present application, both the first image sensor and the second image sensor are color image sensors.
According to a sixth implementation form of the first aspect, in the seventh implementation form of the first aspect of the present application, the imaging device further comprises an infrared cut filter. The camera device is further used for starting the infrared cut-off filter when the illumination intensity is higher than a preset value, the infrared cut-off filter is located between the optical assembly and the second image sensor, and the infrared cut-off filter is used for filtering infrared light in the second optical signal. The second image sensor is specifically configured to sense visible light in the second optical signal and generate the second image, and the image information of the second image further includes second color information. The image processor is specifically configured to fuse the third image and the second image to obtain the target image.
In this embodiment, it is proposed that the infrared cut filter is used to filter the infrared light in the second optical signal under the condition of high illumination intensity, so as to cut off the influence of the infrared light on the second image sensor when sensing the visible light. Since the second image sensor is a color image sensor, the second image determined by the second image sensor sensing the visible light includes not only the second luminance information but also the second color information. Since the energies of the visible light sensed by the first image sensor and the second image sensor are different, the third image and the second image are two images having different brightness. The target image is determined based on the two images with different brightness, so that the quality of the target image can be improved, and the dynamic range can also be improved.
According to a sixth implementation form of the first aspect, in the eighth implementation form of the first aspect of the present application, the imaging device further comprises a visible light cut filter. The camera device is further used for starting the visible light cut-off filter when the illumination intensity is lower than a preset value, the visible light cut-off filter is located between the optical assembly and the second image sensor, and the visible light cut-off filter is used for filtering visible light in the second optical signal. The second image sensor is specifically configured to sense infrared light in the second optical signal and generate the second image. The image processor is specifically configured to fuse the third image and the second image to obtain the target image.
In this embodiment, the visible light cut filter is used to remove the visible light from the second optical signal when the illumination intensity is low, reducing its influence on the second image sensor while it senses infrared light. Although the second image sensor is a color image sensor, the filtered second light signal contains only infrared light, so the second image contains only the second luminance information and no color information.
According to the first aspect, the first implementation manner of the first aspect, and any one of the eighth implementation manner of the first aspect, in a ninth implementation manner of the first aspect of the present application, the optical assembly includes a lens and a beam splitting prism, and the beam splitting prism is located between the lens and the image sensor. The lens is used for receiving the incident light signal. The beam splitting prism is used for splitting an incident light signal received by the lens into the first light signal and the second light signal.
In a tenth implementation form of the first aspect of the present application, according to the ninth implementation form of the first aspect, the lens is an infrared confocal lens.
According to the first aspect or any one of the first to eighth implementations of the first aspect, in an eleventh implementation of the first aspect of the present application, the image pickup apparatus further includes an infrared cut filter. The optical assembly comprises a first lens and a second lens, the first lens and the second lens are used for receiving the incident light signal together, the focal length of the first lens is the same as that of the second lens, the aperture of the first lens is larger than that of the second lens, an infrared cut-off filter is arranged between the first lens and the first image sensor, and the second lens is an infrared confocal lens. The first lens is used for receiving a part of the incident light signal and transmitting the received light signal to the infrared cut-off filter. The infrared cut-off filter is used for filtering infrared light in the optical signal from the first lens to obtain the first optical signal and transmitting the first optical signal to the first image sensor. The second lens is used for receiving the rest part of the incident light signal and transmitting the received light signal to the second image sensor as a second light signal.
This embodiment proposes using two lenses with different apertures so that the optical signals reaching the two image sensors carry different energies. Since a larger aperture admits more light, the energy of the visible light in the first optical signal output by the first lens exceeds that in the second optical signal output by the second lens. In addition, an infrared cut filter between the first lens and the first image sensor ensures that the first optical signal contains only visible light and no infrared light.
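The "larger aperture, more luminous flux" relation can be made concrete: at equal focal length, the light gathered scales with aperture area, i.e. with the square of the aperture diameter, so the flux ratio between two lenses is the inverse square of their f-number ratio. The f-numbers below are illustrative examples, not values from the patent.

```python
def relative_flux(f_number_a: float, f_number_b: float) -> float:
    """Ratio of light gathered by lens A versus lens B at the same focal length.

    Flux scales with aperture area, hence with the square of the aperture
    diameter; for f-numbers N_a and N_b the ratio is (N_b / N_a) ** 2.
    """
    return (f_number_b / f_number_a) ** 2

# An f/1.4 first lens gathers 4x the light of an f/2.8 second lens,
# so the visible-light energy in the first signal is correspondingly larger.
ratio = relative_flux(1.4, 2.8)
```

This is why the wider-aperture first lens feeds the color sensor, whose color fidelity depends most on collected visible-light energy.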
In a twelfth implementation form of the first aspect of the embodiments of the present application, according to any of the preceding implementation forms, the resolution of the second image is equal to the resolution of the target image.
In a second aspect, embodiments of the present application provide an image processor connected to a memory in an image pickup apparatus. The memory is used for storing data or programs processed by the processor, such as the first image, the second image and the third image; the image processor is used for calling the program in the memory to perform image processing on the first image, the second image and the third image.
According to the technical scheme, the embodiment of the application has the following advantages:
in the embodiment of the application, the image pickup apparatus uses two image sensors with different resolutions. For a fixed photosensitive area, image resolution is inversely related to color sensitivity and positively related to image sharpness. The first image, output by the low-resolution first sensor, therefore has higher color sensitivity, ensuring faithful color; the high-resolution second sensor has more pixels, so the second image is sharper and captures rich detail and texture. A target image determined from these two images retains the advantages of both, allowing the apparatus to operate in environments with lower illumination intensity.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings used in the description of the embodiments will be briefly introduced below, and it is apparent that the drawings in the following description are only some embodiments of the present application.
Fig. 1A is a schematic diagram of an embodiment of an image pickup apparatus in an embodiment of the present application;
fig. 1B is a schematic diagram of another embodiment of an image pickup apparatus in an embodiment of the present application;
fig. 1C is a schematic diagram of another embodiment of an image pickup apparatus in an embodiment of the present application;
fig. 2A is a schematic diagram of another embodiment of an image pickup apparatus in an embodiment of the present application;
FIG. 2B is a diagram illustrating an embodiment of an image processing flow according to an embodiment of the present application;
FIG. 2C is a diagram illustrating another embodiment of an image processing flow according to an embodiment of the present application;
fig. 3A is a schematic diagram of another embodiment of an image pickup apparatus in an embodiment of the present application;
FIG. 3B is a diagram illustrating another embodiment of an image processing flow according to an embodiment of the present application;
fig. 3C is a schematic diagram of another embodiment of an image processing flow in the embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments.
The terms "first," "second," "third," "fourth," and the like in the description and in the claims of the present application and in the drawings described above, if any, are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It will be appreciated that the data so used may be interchanged under appropriate circumstances such that the embodiments described herein may be practiced otherwise than as specifically illustrated or described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
The embodiment of the application provides an image pickup device, which is used for outputting a high-quality image in a low-illumination environment and reducing the lower limit of the working illumination of the image pickup device.
For convenience of understanding, an application scenario of the imaging apparatus proposed in the embodiment of the present application is described below:
the imaging device provided by the embodiment of the application can be applied to imaging in a low illumination (low illumination/low light) environment. Specifically, the low-illuminance environment refers to an environment in which the illumination intensity is lower than a certain value, and is generally measured by the energy of visible light received per unit area of an image sensor in an image pickup apparatus, and the unit is Lux (Lux, also simply referred to as Lx). Generally, an illumination environment of more than 0Lux and less than 1Lux may be referred to as a low illumination environment. In particular, the low light environment may be an outdoor dim street, for example, a street at night, or a street in a rainy day; the lighting device may be a room with only weak light, for example, a shop or a warehouse with only weak light, and is not limited herein.
In a low-light scene, when the photosensitive area of the image sensor is fixed, the color sensitivity of the output image is inversely correlated with its resolution. Since the grayscale image generated by the infrared-sensing sensor can provide high spatial resolution, and the human visual system is more sensitive to luminance than to color spatial resolution, part of the color image's spatial resolution can be sacrificed to improve color sensitivity, lowering the working-illumination limit of the image pickup apparatus. For example, if the original camera's working-illumination limit is 1 Lux, below which a color image acceptable to the human eye is hard to obtain, the scheme provided in the embodiments of the present application can lower that limit to 0.1 Lux or even 0.01 Lux; the specific value is not limited here.
An image pickup apparatus according to an embodiment of the present application will be described below, and as shown in fig. 1A, the image pickup apparatus 10 (e.g., a video camera) includes: a first image sensor 101, a second image sensor 102, an optical assembly 103 and an image processor 104. The optical module 103 is configured to receive an incident light signal emitted from a subject photographed by the image pickup device 10, and process the incident light signal into a first light signal and a second light signal. The optical assembly 103 is further configured to control the first optical signal to be directed to the first image sensor 101 and the second optical signal to be directed to the second image sensor 102. The first image sensor 101 is configured to sense the first light signal to generate a first image, where the first image is a color image, and image information of the first image includes first color information and first brightness information. The second image sensor 102 is configured to sense the second light signal to generate a second image, the second image is a color image or a grayscale image, and the image information of the second image includes second luminance information (when the second image is a color image, the second image further includes second color information). In addition, the resolution of the first image sensor 101 is smaller than the resolution of the second image sensor 102. The image processor 104 is configured to generate a target image based on the first image and the second image, and the color and brightness of the target image are determined by the image information of the first image and the image information of the second image. For example, the image processor 104 may be a system on chip (SoC).
It should be appreciated that since the resolution of the first image sensor 101 is less than the resolution of the second image sensor 102, the resolution of the first image is less than the resolution of the second image. It should also be understood that the first image sensor 101 is a color image sensor, since the first image contains first color information, i.e. the first image is capable of representing colors. The second image sensor 102 may be a color image sensor or a black-and-white image sensor, which please refer to the related description of the corresponding embodiments of fig. 2A and fig. 3A, and will not be described herein again.
In this embodiment, the imaging device 10 uses two image sensors with different resolutions. For a fixed photosensitive area, image resolution is inversely related to color sensitivity and positively related to image sharpness. The first image, output by the low-resolution first image sensor 101, therefore has higher color sensitivity, ensuring faithful color. The high-resolution second image sensor 102 has more pixels, so the second image is sharper and captures rich detail and texture. A target image determined from these two images retains the advantages of both, allowing the camera device to operate in environments with lower illumination intensity.
Based on the foregoing embodiment, the process of the image processor 104 determining the target image may include the following:
first, the image processor 104 adjusts the resolution of the first image to be the same as the resolution of the second image, and obtains a third image. The image processor 104 then determines the target image based on the third image and the second image. Optionally, the resolution of the target image is equal to the resolution of the second image. In the foregoing process, although the resolution of the third image is different from that of the first image, the third image exhibits colors and brightness from the first image. Therefore, the process of adjusting the low-resolution first image to the high-resolution third image can not only preserve the color and brightness presented by the first image, but also facilitate the determination of the target image based on the third image and the second image.
It should be understood that, in this embodiment, saying the resolutions of two images are "the same" may mean they are identical, or that a small difference remains between them that does not affect subsequent processing of the two images; this is not limited here.
It should be appreciated that the image processor 104 may use an upsampling algorithm or a super-resolution algorithm to adjust the resolution of the first image to match that of the second image; this is not limited here. Determining the target image based on the third image and the second image refers mainly to image fusion. Image fusion is an image-processing technique that synthesizes two or more images into a new image using a specific algorithm, such that the synthesized image retains the desirable characteristics of the original images (i.e., the two or more images before synthesis), such as brightness, sharpness, and color.
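As a minimal sketch of what "fusion" means here, the following blends the luminance planes of the resolution-matched third image and the second image while keeping the chrominance of the color third image. The patent does not prescribe a fusion algorithm, and real pipelines are far more elaborate (multi-scale, edge-aware); the function name, `alpha` weight, and pixel values are all illustrative assumptions.

```python
import numpy as np

def fuse_images(third_yuv: np.ndarray, second_yuv: np.ndarray, alpha: float = 0.5) -> np.ndarray:
    """Blend the two Y planes; keep U and V from the color third image.

    `alpha` weights the high-resolution second image's luminance. Both
    inputs are H x W x 3 YUV arrays of the same resolution.
    """
    out = third_yuv.astype(np.float32)  # float working copy to avoid overflow
    out[..., 0] = alpha * second_yuv[..., 0] + (1.0 - alpha) * third_yuv[..., 0]
    return np.clip(out, 0.0, 255.0).astype(np.uint8)

# Illustrative inputs: flat planes so the blend is easy to follow.
third = np.zeros((4, 4, 3), dtype=np.uint8)
third[..., 0] = 100   # Y from the upsampled color image
third[..., 1] = 90    # U
third[..., 2] = 160   # V
second = np.zeros((4, 4, 3), dtype=np.uint8)
second[..., 0] = 200  # Y from the high-resolution image
target = fuse_images(third, second, alpha=0.5)
```

With equal weights, the target luminance lands between the two sources while the color of the third image is carried through unchanged, which is the "retain the advantages of both" behavior described above.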
Optionally, the image capturing apparatus 10 further includes an image signal processor (ISP) chip (not shown) located between the image sensors and the image processor 104. Optionally, the ISP chip may run two processing pipelines that respectively process the first image output by the first image sensor 101 and the second image output by the second image sensor 102, and then send the processed first and second images to the image processor 104 for subsequent processing. Optionally, the ISP chip may perform a plurality of ISP processes on the first image and the second image, for example, 3D noise reduction, demosaicing, brightness correction, and color correction. This basic image processing may be adjusted according to actual application requirements, and its content is not limited herein.
In this embodiment and the subsequent embodiments, the image output by the first image sensor 101, up to the point before its resolution is adjusted, is referred to as the first image. Similarly, the image output by the second image sensor 102, up to the point before image fusion or combination is performed, is referred to as the second image. In practical applications, the raw image output by an image sensor may pass through various processing flows, which are not limited in the embodiments of this application. In other words, if the RAW images output by the image sensors are fused directly, the first image is in RAW format; if the RGB images output by the ISP chip are fused, the first image is in RGB format; if the YUV images output by the ISP chip are fused, the first image is in YUV format (to reduce redundancy, the following embodiments describe only the case where the first and second images are in YUV format). In addition, the first image and the second image are not compression-coded; the fused image may be coded into a format that is easily viewed and occupies less storage space, such as the jpeg format (abbreviated jpg format), bmp format, tga format, png format, or gif format.
Optionally, before the third image and the second image are fused, the ISP chip may further perform image correction on them. Specifically, the ISP chip may map the coordinate system of the third image into the coordinate system of the second image, so that the scenes in the two images are aligned; this can also be understood as aligning the texture details presented by the two images. Optionally, the ISP chip may adjust the two images using preset correction parameters, or may adaptively configure the correction parameters according to changes in the current temperature to complete the image correction.
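The coordinate-system correction described above can be sketched as an affine re-mapping. The 2x3 matrix here is a hypothetical placeholder for the preset (or temperature-adapted) correction parameters; a real ISP would typically use sub-pixel interpolation rather than the nearest-neighbour sampling shown:

```python
import numpy as np

def warp_affine(img, matrix):
    """Warp `img` into a new coordinate system using a 2x3 affine matrix
    (nearest-neighbour sampling). `matrix` maps output coordinates to
    input coordinates. Pixels that fall outside the source stay zero."""
    h, w = img.shape[:2]
    out = np.zeros_like(img)
    ys, xs = np.mgrid[0:h, 0:w]
    src_x = matrix[0, 0] * xs + matrix[0, 1] * ys + matrix[0, 2]
    src_y = matrix[1, 0] * xs + matrix[1, 1] * ys + matrix[1, 2]
    sx = np.round(src_x).astype(int)
    sy = np.round(src_y).astype(int)
    valid = (sx >= 0) & (sx < w) & (sy >= 0) & (sy < h)
    out[ys[valid], xs[valid]] = img[sy[valid], sx[valid]]
    return out
```

With the identity matrix the image is unchanged; a translation component shifts the scene, which is the kind of residual misalignment between the two optical paths that the correction parameters compensate for.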
Alternatively, the first image sensor 101 or the second image sensor 102 may be a CCD image sensor formed by a Charge Coupled Device (CCD), a CMOS image sensor formed by a Complementary Metal Oxide Semiconductor (CMOS), or another type of image sensor, and is not limited herein.
In the foregoing embodiment, the image capture device 10 employs an asymmetric image sensor architecture (i.e., the two image sensors in the image capture device 10 have different resolutions), trading the spatial resolution of the first image sensor, which senses visible light, for color sensitivity (i.e., low-illumination sensitivity). Therefore, the imaging apparatus 10 can operate well in a low-illuminance environment. On this basis, the optical component 103 is specifically configured to process the incident optical signal so that the energy of the visible light in the first optical signal is greater than the energy of the visible light in the second optical signal, where the first optical signal includes visible light and the second optical signal includes visible light and infrared light. This can also be understood as the optical component 103 splitting the incident optical signal both by frequency and by energy. Splitting by frequency refers to dividing the incident light signal according to frequency, for example into visible light and infrared light. The energy of a light wave is proportional to the square of its amplitude, so the energy of the visible light can be understood as the intensity of the visible light. Splitting by energy can therefore be understood as separating the visible light in the incident light signal into a first light signal and a second light signal by a physical structure such as a coating film on a lens, such that the intensity of the visible light in the first light signal differs from that in the second light signal, with a fixed intensity ratio between them.
It should be noted that the frequency band of the visible light in the first optical signal may be the same as the frequency band of the visible light in the second optical signal; alternatively, the two frequency bands may merely overlap in part, for example, both the first optical signal and the second optical signal contain green light.
In such an embodiment, because the energy of the visible light in the first light signal differs from that in the second light signal, the two image sensors obtain images of different brightness when the illumination intensity is higher than the preset value (i.e., in the daytime). Determining the target image based on two images of different brightness helps improve the dynamic range. In addition, the ratio between the energy of the visible light in the first optical signal and that in the second optical signal can be flexibly controlled according to actual application requirements. For example, when the first optical signal and the second optical signal contain the same visible-light frequency band, the energy ratio may be maintained at 9:1, 8:2, 7:3, 6:4, 6.5:3.5, or the like, and is not limited herein.
Further, the optical component 103 may be implemented by any one of the following methods:
As shown in fig. 1B, one implementation of the optical assembly 103 is as follows. The optical assembly 103 includes a light-splitting prism 1031 (which may also be referred to as a beam splitter) and a lens 1032. The light-splitting prism 1031 is an optical device that, by means of one or more thin films coated on an optical glass surface, splits an incident optical signal into two beams through refraction and reflection. In this embodiment, the splitting prism 1031 is used to split the incident optical signal into a first optical signal and a second optical signal, where the first optical signal includes visible light and the second optical signal includes visible light and infrared light. Specifically, a part of the visible light in the incident light signal passes through the coating layer toward the first image sensor 101, while the remaining visible light and all the infrared light in the incident light signal are reflected at the coating layer toward the second image sensor 102. It should be appreciated that optical components whose coatings differ in thickness or composition correspond to different ratios between the energy of the visible light in the first optical signal and that in the second optical signal. Optionally, the lens 1032 is an infrared confocal lens, which keeps the visible-light and infrared-light bands focused on the same plane.
In addition, the optical assembly 103 is often used together with optical filters. Specifically, a filter (for example, the filter 1051) may be provided between the beam splitter prism 1031 and the first image sensor 101, and a filter (for example, the filter 1052) may be provided between the beam splitter prism 1031 and the second image sensor 102. Optionally, when the filter 1051 is an infrared cut filter, it prevents infrared light from mixing into the first optical signal entering the first image sensor 101. Optionally, when the filter 1052 is an infrared cut filter, it filters out the infrared light in the second optical signal; when the filter 1052 is a visible-light cut filter, it filters out the visible light in the second optical signal; and when the filter 1052 is a white glass slide (colorless transparent glass that does not filter light), both the visible light and the infrared light in the second optical signal pass through, that is, the full-band optical signal is allowed through.
In this embodiment, the splitting prism is designed and combined with the optical filters to realize both frequency splitting and energy splitting. Frequency splitting: the infrared cut filter keeps infrared light out of the first optical signal, separating the infrared band from the visible band between the first and second optical signals. Energy splitting: the first and second optical signals both include visible light of the same frequency band, but with different energies. In this way, the visible light and the infrared light can be distributed to the two image sensors while the proportions of the visible light in the incident optical signal entering each sensor are controlled. Therefore, the finally output fused image can have an increased dynamic range (dynamic range) when the illumination intensity is high (e.g., daytime), and, based on the aforementioned improvement in low-illumination performance, a more natural color when the illumination intensity is low (e.g., nighttime).
Fig. 1C shows another implementation of the optical assembly 103.
The optical assembly 103 includes a first lens 1033 and a second lens 1034. The first lens 1033 focuses one portion of the incident optical signal so that its output is directed to the first image sensor 101, and the second lens 1034 focuses the remaining portion so that its output is directed to the second image sensor 102. The focal length of the first lens 1033 is the same as that of the second lens 1034. Optionally, the aperture of the first lens 1033 is larger than that of the second lens 1034, so the luminous flux of the first lens 1033 is larger, and the energy of the visible light output by the first lens 1033 is greater than that output by the second lens 1034. This can also be understood as the first lens 1033 and the second lens 1034 actually passing different optical signals. For the explanation of the energy of visible light, reference may be made to the description of the embodiment corresponding to fig. 1B; details are not repeated here.
In addition, a filter (e.g., filter 1051) is disposed between the first lens 1033 and the first image sensor 101, and a filter (e.g., filter 1052) may be disposed between the second lens 1034 and the second image sensor 102. Specifically, the filter 1051 is an infrared cut filter for filtering infrared light in the optical signal from the first lens 1033, so that the first optical signal sensed by the first image sensor 101 has only visible light but no infrared light. Optionally, when the filter 1052 is an infrared cut filter, infrared light in the second optical signal may be filtered; when the filter 1052 is a visible light cut filter, visible light in the second optical signal can be filtered; when the filter 1052 is a white glass slide (the material of the white glass slide is colorless transparent glass, and does not filter light), visible light and infrared light in the second optical signal can pass through the filter, that is, the optical signal of the full wavelength band is allowed to pass through the white glass slide. In addition, the second lens 1034 is an infrared confocal lens.
In this embodiment, two lenses with the same focal length but different aperture sizes form a binocular lens, realizing energy splitting (the two image sensors receive different amounts of light energy). Because the aperture size determines how much light can pass through a lens, using apertures of different sizes controls the difference between the visible-light energies entering the two image sensors. Combined with the asymmetric image sensor architecture and with different filters used under different illumination intensities, the finally output fused image can have an improved dynamic range when the illumination intensity is high (for example, in the daytime) and more natural colors when the illumination intensity is low (for example, at night).
In practical applications, the optical component 103 in fig. 1A and the optical components referred to later may adopt the implementation shown in fig. 1B, and may also adopt the implementation shown in fig. 1C, which may be specifically selected according to application scenarios, and is not limited herein.
It should be understood that, in addition to having different resolutions, the two image sensors in the aforementioned imaging device may also have different light-sensing characteristics. These are introduced separately below:
as shown in fig. 2A, an embodiment of the imaging device is described above. In this embodiment, the first image sensor is a color image sensor, and the second image sensor is a black-and-white image sensor. Since the resolution of the first image sensor is smaller than that of the second image sensor, the first image sensor will be referred to as a low-resolution color image sensor and the second image sensor will be referred to as a high-resolution monochrome image sensor hereinafter. Alternatively, the low-resolution color image sensor may be a Bayer image sensor (Bayer image sensor) or other format color image sensor; the high resolution black-and-white image sensor may be a MONO image sensor (MONO image sensor) or other format black-and-white image sensors, and is not limited herein. An infrared cut-off filter is arranged between the low-resolution color image sensor and the optical component, and a double-optical filter is arranged between the high-resolution black-and-white image sensor and the optical component. The dual optical filter is also known as an IR-CUT auto-switching filter. The IR-CUT automatic switching filter is provided with a photosensitive device, or the IR-CUT automatic switching filter is connected with the photosensitive device, and the photosensitive device transmits the sensed illumination intensity to the camera device. When the change of the illumination intensity is detected, the camera device (specifically, a filter control chip in the camera device) can control the IR-CUT to automatically switch the filter. For example, when the illumination intensity is greater than a preset value (for example, daytime), the infrared cut filter is switched to, and when the illumination intensity is less than the preset value (for example, nighttime), the visible cut external filter is switched to. Alternatively, the bifocal filter can be switched to a white glass slide, allowing both visible and infrared light to pass. 
Alternatively, the dual optical filter may be replaced by an infrared cut filter, and the imaging device controls enabling and disabling of the infrared cut filter.
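The switching behavior described for the IR-CUT auto-switching filter can be sketched as follows. The function name, the string filter states, and the threshold value are illustrative assumptions; the actual control logic resides in the filter control chip:

```python
def select_filter(illumination, threshold, night_mode="visible_cut"):
    """Choose the state of the dual (IR-CUT auto-switching) filter.

    Above the threshold (e.g. daytime) the infrared cut filter is engaged,
    so the sensor receives visible light only. Below it, the filter switches
    to a visible-light cut filter, or optionally to plain white glass that
    passes both visible and infrared light. The threshold is application
    specific.
    """
    if illumination > threshold:
        return "ir_cut"
    return night_mode  # "visible_cut" or "white_glass"
```

A real implementation would also add hysteresis around the threshold so the filter does not oscillate at dawn and dusk, but that refinement is outside the scope of the text above.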
When the illumination intensity is lower than the preset value, the description is provided with reference to fig. 2B.
The low-resolution color image sensor senses visible light in the first optical signal and outputs a low-resolution color image. Wherein the low resolution color image includes first color information and first luminance information. The first color information is used for indicating the color of the low-resolution color image, and the first brightness information is used for indicating the brightness presented by the low-resolution color image. The present embodiment does not limit the specific form of the first luminance information and the first color information.
The dual optical filter arranged between the high-resolution black-and-white image sensor and the optical component is switched to the white glass slide, so the high-resolution black-and-white image sensor senses both the visible light and the infrared light in the second optical signal and outputs a high-resolution grayscale image. The high-resolution grayscale image includes second luminance information indicating the brightness the image presents. Optionally, the second luminance information may be represented by a luminance component Y; its specific form is not limited in the embodiments of this application. The image directly generated by an image sensor is in RAW format, which comes in many variants depending on the sensor design, for example Bayer RGGB, RYYB, RCCC, RCCB, RGBW, and CMYW. Using the ISP chip, RAW images of these formats can be converted into RGB images; the ISP chip can also convert a RAW image into a YUV, HSV, Lab, CMY, or YCbCr image. For example, the ISP chip may convert the RAW image into an RGB image and then convert the RGB image into a YUV image. The ISP chip in the imaging apparatus may also perform basic image processing, such as 3D noise reduction, demosaicing, brightness correction, and color correction, on the low-resolution color image and the high-resolution grayscale image.
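The RGB-to-YUV conversion step performed by the ISP chip can be sketched as below, assuming full-range BT.601-style coefficients (the actual coefficients used by a given ISP are implementation specific, and the preceding RAW demosaicing step is omitted here):

```python
import numpy as np

def rgb_to_yuv(rgb):
    """Convert an RGB image (H, W, 3, uint8) to YUV.

    Uses full-range BT.601-style coefficients with chroma centered at 128,
    as an assumption for illustration; ISPs may use other matrices.
    """
    rgb = rgb.astype(np.float32)
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    y = 0.299 * r + 0.587 * g + 0.114 * b            # luma
    u = -0.169 * r - 0.331 * g + 0.5 * b + 128.0     # blue-difference chroma
    v = 0.5 * r - 0.419 * g - 0.081 * b + 128.0      # red-difference chroma
    return np.clip(np.stack([y, u, v], axis=-1), 0, 255).astype(np.uint8)
```

A neutral gray input maps to Y equal to the gray level with U and V near 128, which is why a grayscale image can be treated as a Y plane with no chroma.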
In addition, the image processor adjusts the low-resolution color image into a high-resolution color image having the same resolution as the high-resolution gray image using an up-sampling algorithm or a super-resolution algorithm. Then, the image processor fuses the high-resolution color image and the high-resolution gray-scale image to obtain a target image.
Taking fig. 2B as an example, the RGB format image 201 with the low resolution is converted into the YUV format image 202 with the low resolution, and then the YUV format image 202 with the low resolution is up-sampled to obtain the YUV format image 203 with the high resolution. Then, the high-resolution YUV format image 203 and the high-resolution gray image 204 having only the Y component are fused to obtain a high-resolution YUV format image 205 (i.e., a target image).
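A minimal sketch of this up-sample-then-fuse pipeline, using nearest-neighbour up-sampling as a stand-in for the up-sampling or super-resolution algorithm, and an assumed 50/50 luma weighting (the embodiment does not fix either choice):

```python
import numpy as np

def upsample_nn(img, factor):
    """Nearest-neighbour up-sampling: replicate each pixel factor x factor
    times. A super-resolution algorithm could be substituted here."""
    return img.repeat(factor, axis=0).repeat(factor, axis=1)

def fuse_luma(up_yuv, high_res_y, weight=0.5):
    """Fuse the Y plane of the up-scaled YUV image with the high-resolution
    grayscale image; the chroma (U/V) planes are kept from the color image.
    The 50/50 luma weighting is an assumption, not mandated by the text."""
    out = up_yuv.astype(np.float32).copy()
    out[..., 0] = weight * out[..., 0] + (1 - weight) * high_res_y.astype(np.float32)
    return np.clip(out, 0, 255).astype(np.uint8)
```

This mirrors the fig. 2B flow: image 202 is up-sampled to image 203, then the Y plane of image 203 is fused with the Y-only image 204 while the color stays with the low-resolution sensor's output.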
It should be understood that, in the actual image processing process, images in other formats may appear in addition to the images listed in fig. 2B, and are not limited herein.
It should be understood that the aforementioned Y component is the Y in the YUV format, which represents luminance (luma), while U and V represent chrominance (chroma). The YUV format mainly includes YUV420, YUV422, and YUV444. YUV444 means each Y component has its own set of UV components; YUV422 means every two Y components share a set of UV components; YUV420 means every four Y components share a set of UV components. Although the foregoing fig. 2B uses YUV420 as an example, in practical applications the image format may be adjusted according to specific requirements, and is not limited herein. In addition, although the YUV format and the RGB format are different color-coding methods, changing the coding format does not affect the color of the image.
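The three subsampling schemes differ only in how many Y components share one U/V pair, which directly determines the frame size. A small sketch of the byte counts for an 8-bit frame:

```python
def yuv_bytes(width, height, fmt):
    """Bytes needed for one 8-bit frame in each YUV subsampling scheme.

    YUV444: one U and one V per Y component;
    YUV422: one U/V pair shared by every two Y components;
    YUV420: one U/V pair shared by every four Y components.
    """
    luma = width * height
    chroma_pairs = {"yuv444": luma, "yuv422": luma // 2, "yuv420": luma // 4}[fmt]
    return luma + 2 * chroma_pairs  # luma plane plus U and V planes
```

So a YUV420 frame is half the size of the same frame in YUV444, which is one reason the fused target image in fig. 2B/2C can be output in YUV420 without wasting the limited chroma resolution of the low-resolution color sensor.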
In this embodiment, the high-resolution black-and-white image sensor senses infrared light and part of the visible light simultaneously when the illumination intensity is low, rather than sensing only infrared light or only visible light. The brightness of a second image produced by sensing both infrared and visible light is greater than that of a second image produced by sensing infrared light alone. This improves the quality of the second image and, in turn, the quality of the target image.
When the illumination intensity is higher than the preset value, the description is made with reference to fig. 2C.
The low-resolution color image sensor senses visible light in the first optical signal and outputs a low-resolution color image. Wherein the low resolution color image includes first color information and first luminance information. The first color information is used for indicating the color of the low-resolution color image, and the first brightness information is used for indicating the brightness presented by the low-resolution color image. The embodiment of the present application does not limit the specific forms of the first luminance information and the first color information. Optionally, when the low-resolution color image is converted into a YUV format, the first color information is a U/V component, and the first luminance information is a Y component.
The dual optical filter arranged between the high-resolution black-and-white image sensor and the optical component is switched to the infrared cut filter, which filters out the infrared light in the second optical signal. Therefore, the high-resolution black-and-white image sensor senses the visible light in the second optical signal and outputs a high-resolution grayscale image, which includes second luminance information indicating the brightness the image presents. Optionally, the second luminance information may be represented by the luminance component Y. Since the high-resolution black-and-white image sensor cannot record color, the second image presents only brightness and no color; thus it has only the luminance component Y and no chrominance components U/V. It should be understood that image sensors of different formats output images of different formats, as described above; the details are not repeated here.
In this embodiment, when the exposure time and the gain are controlled so that the brightness represented by the first brightness information is equivalent to the brightness represented by the second brightness information, the image processor may combine the first color information of the low-resolution color image with the second brightness information of the high-resolution grayscale image to obtain the target image. The color of the target image is determined by the first color information, and the brightness of the target image is determined by the second brightness information. Specifically, the image processor may combine the color component (i.e., U/V component) of the aforementioned low-resolution color image with the luminance component (i.e., Y component) of the high-resolution grayscale image to obtain the target image. In such an embodiment, a higher quality target image can be obtained without using a complex fusion algorithm, and the data processing amount of the image processor can be reduced.
Optionally, the ratio of the resolution of the color image sensor to that of the black-and-white image sensor is 1:4, so the ratio of the resolution of the low-resolution color image (i.e., the first image) to that of the high-resolution grayscale image (i.e., the second image) is also 1:4. Optionally, both images are expressed in YUV format: the low-resolution color image in YUV444 format and the high-resolution grayscale image in YUV420 format. In this case, the ratio of the number of Y components in the high-resolution grayscale image to the number of U/V component sets in the low-resolution color image is 4:1, so the image processor can output a target image in YUV420 format. Taking fig. 2C as an example, the low-resolution RGB image 211 is converted into the low-resolution YUV444 image 212. Then the U/V components of the low-resolution YUV444 image 212 and the Y components of the high-resolution grayscale image 213 are combined to obtain the high-resolution YUV420 image 214 (i.e., the target image).
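For this 1:4 resolution ratio the combination needs no interpolation: each low-resolution pixel supplies the U/V pair for exactly one 2x2 block of high-resolution Y samples. A sketch of the plane assembly (the helper name and planar output layout are illustrative assumptions):

```python
import numpy as np

def merge_to_yuv420(low_yuv444, high_y):
    """Assemble a high-resolution YUV420 frame (as planar Y, U, V arrays)
    from the U/V planes of a low-resolution YUV444 image and the Y plane
    of a high-resolution grayscale image, for the 1:4 resolution ratio.

    Each low-resolution pixel provides the shared U/V pair for one 2x2
    block of high-resolution Y samples, so the chroma planes are reused
    unchanged.
    """
    h, w = high_y.shape
    assert low_yuv444.shape[:2] == (h // 2, w // 2), "expect a 1:4 area ratio"
    y_plane = high_y                 # full-resolution luma from the mono sensor
    u_plane = low_yuv444[..., 1]     # quarter-resolution chroma, used as-is
    v_plane = low_yuv444[..., 2]
    return y_plane, u_plane, v_plane
```

This is why the text calls the day-mode path a "combination" rather than a fusion: the planes are simply reassembled, with no resampling or blending required.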
It should be understood that the ratio of the resolution of the low-resolution color image sensor to that of the high-resolution black-and-white image sensor may take other values, for example 1:2 or 1:16, and is not limited herein. When the resolution ratio differs, the image formats used in the fusion process are adapted accordingly, so that a target image of good quality is output while the amount of computation is reduced.
Optionally, in this embodiment, the ratio of the energy of the visible light in the first optical signal to that in the second optical signal is 3:2. This can also be understood as the optical component splitting the light by frequency and by energy simultaneously: 60% of the visible light in the incident light signal is directed to the low-resolution color image sensor, while 40% of the visible light and 100% of the infrared light are directed to the high-resolution black-and-white image sensor. Of course, owing to the filters, the optical signals actually entering the two image sensors can be further adjusted.
In this embodiment, the first image sensor is a low-resolution color image sensor and the second image sensor is a high-resolution black-and-white image sensor. First, the asymmetric image sensor architecture lowers the minimum operating illumination of the imaging device. Second, the filter layer of a black-and-white image sensor transmits more light than the color filter matrix of a color image sensor of the same specification, giving higher photoelectric conversion efficiency. This increases the brightness of the high-resolution grayscale image output by the high-resolution black-and-white image sensor (i.e., the brightness indicated by the second luminance information), which helps improve the quality of the target image. The image pickup apparatus can therefore operate in environments with even lower light intensity.
As shown in fig. 3A, another embodiment of the imaging device is described above. In this embodiment, the first image sensor and the second image sensor are both color image sensors. Alternatively, the color image sensor may be a Bayer pattern image sensor (Bayer image sensor) or other pattern color image sensor. Since the resolution of the first image sensor is smaller than that of the second image sensor, the first image sensor will be referred to as a low-resolution color image sensor and the second image sensor will be referred to as a high-resolution color image sensor hereinafter. An infrared cut-off filter is arranged between the low-resolution color image sensor and the optical component, and a dual-optical filter is arranged between the high-resolution color image sensor and the optical component. The dual optical filter is described in the embodiment corresponding to fig. 2A, and is not described herein again.
When the illumination intensity is lower than the preset value, the description is provided with reference to fig. 3B.
The low-resolution color image sensor senses visible light in the first optical signal and outputs a low-resolution color image. Wherein the low resolution color image includes first color information and first luminance information. The first color information is used for indicating the color of the low-resolution color image, and the first brightness information is used for indicating the brightness presented by the low-resolution color image. The embodiment of the present application does not limit the specific forms of the first luminance information and the first color information. When the low-resolution color image is converted into a YUV format, the first color information is a U/V component and the first luminance information is a Y component.
The dual optical filter arranged between the high-resolution color image sensor and the optical component is switched to the visible-light cut filter, so the high-resolution color image sensor senses the infrared light in the second optical signal and outputs a high-resolution grayscale image. The high-resolution grayscale image includes second luminance information indicating the brightness the image presents; the second luminance information may be represented by the luminance component Y. It should be appreciated that although the high-resolution color image sensor can record color, it senses only infrared light here and no visible light, so the second image presents only brightness and no color; thus the second image has only the luminance component Y and no chrominance components U/V.
The ISP chip in the imaging apparatus may perform the ISP processing on the low-resolution color image and the high-resolution color image, respectively. For example, 3D noise reduction, demosaicing, brightness correction, and color correction. Optionally, the ISP chip may further adjust formats of the low-resolution color image and the high-resolution color image, for example, adjust a bayer format to a YUV format, and the like, which is not limited herein.
Further, the image processor adjusts the low-resolution color image into a high-resolution color image having the same resolution as the high-resolution gray image using an up-sampling algorithm or a super-resolution algorithm. Then, the image processor fuses the high-resolution color image and the high-resolution gray-scale image to obtain a target image.
Taking fig. 3B as an example, the RGB format image 301 with the low resolution is converted into the YUV format image 302 with the low resolution, and then the YUV format image 302 with the low resolution is up-sampled to obtain the YUV format image 303 with the high resolution. Then, the high-resolution YUV format image 303 and the Y-component-only grayscale image 304 are fused, resulting in a high-resolution YUV format image 305 (i.e., a target image).
It should be understood that, in the actual image processing process, images in other formats may appear in addition to the images listed in fig. 3B, and are not limited herein.
In this embodiment, it is proposed that the high-resolution color image sensor only senses the infrared light in the second light signal when the illumination intensity is low, and generates a high-resolution gray scale image with only brightness. The low-resolution color image and the high-resolution gray image are fused, so that the advantages of the two images can be kept, and the quality of the target image is improved.
When the illumination intensity is higher than the preset value, the description is made with reference to fig. 3C.
The low-resolution color image sensor senses the visible light in the first optical signal and outputs a low-resolution color image, which includes first color information and first luminance information. The details are similar to the case where the illumination intensity is lower than the preset value, and are not repeated here.
The dual filter disposed between the high-resolution color image sensor and the optical component is switched to the infrared cut-off filter, which filters out the infrared light in the second optical signal. The high-resolution color image sensor therefore senses the visible light in the second optical signal and outputs a high-resolution color image. In this case, the high-resolution color image (i.e., the aforementioned second image) includes not only the second luminance information but also the second color information, where the second color information indicates the color of the high-resolution color image and the second luminance information indicates its brightness. The embodiment of the present application does not limit the specific forms of the second luminance information and the second color information. Optionally, when the high-resolution color image is converted into the YUV format, the second color information is the U/V components and the second luminance information is the Y component.
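The separation of a color pixel into luminance information (Y) and color information (U/V) mentioned above can be illustrated with the BT.601 conversion, one common RGB-to-YUV matrix (the patent does not fix a particular one):

```python
def rgb_to_yuv(r: float, g: float, b: float):
    """BT.601 full-range RGB -> YUV conversion (assumed for illustration)."""
    y = 0.299 * r + 0.587 * g + 0.114 * b    # luminance   -> second luminance information
    u = -0.147 * r - 0.289 * g + 0.436 * b   # chrominance -> second color information
    v = 0.615 * r - 0.515 * g - 0.100 * b
    return y, u, v

# A pure white pixel carries full luminance and no chrominance:
y, u, v = rgb_to_yuv(1.0, 1.0, 1.0)  # Y ~ 1, U ~ 0, V ~ 0
```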
The ISP chip in the image pickup apparatus may perform ISP processing on the low-resolution color image and the high-resolution color image, for example, 3D noise reduction, demosaicing, brightness correction, and color correction. Optionally, the ISP chip may further adjust the formats of the two images, for example, converting a Bayer format to a YUV format, which is not limited herein.
In addition, the image processor uses an up-sampling algorithm or a super-resolution algorithm to adjust the low-resolution color image into a high-resolution color image, so that the two high-resolution color images have the same resolution. The image processor then fuses the two high-resolution color images to obtain the target image, which helps improve the dynamic range of the target image.
Taking fig. 3C as an example, the low-resolution RGB format image 311 is converted into the low-resolution YUV format image 312, which is then up-sampled to obtain the high-resolution YUV format image 313. At the same time, the high-resolution RGB format image 314 is converted into the high-resolution YUV format image 315. The high-resolution YUV format images 313 and 315 are then fused to obtain the high-resolution YUV format image 316 (i.e., the target image). The target image combines the advantages of the two source images, improving both image quality and dynamic range. Although fig. 3B and fig. 3C use the YUV420 format as an example, in practical applications the image format may be adjusted according to specific requirements, which is not limited herein. In addition, although the YUV format and the RGB format are different color coding methods, changing the coding format does not affect the color of the image.
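The patent does not specify the algorithm for fusing the two same-resolution color images; as one common approach, the sketch below weights each pixel toward the better-exposed source (a simple exposure-fusion heuristic), with all names illustrative:

```python
import numpy as np

def fuse_exposures(img_a: np.ndarray, img_b: np.ndarray) -> np.ndarray:
    """Weighted fusion of two same-resolution YUV images (values in [0, 1])."""
    y_a, y_b = img_a[..., 0], img_b[..., 0]
    # Weight each pixel by how close its luminance is to mid-grey, so
    # over- and under-exposed regions contribute less to the result.
    w_a = 1.0 - np.abs(y_a - 0.5)
    w_b = 1.0 - np.abs(y_b - 0.5)
    total = w_a + w_b + 1e-6  # small epsilon avoids division by zero
    return (img_a * w_a[..., None] + img_b * w_b[..., None]) / total[..., None]

bright = np.full((2, 2, 3), 0.9)  # sensor receiving more visible-light energy
dark = np.full((2, 2, 3), 0.3)    # sensor receiving less visible-light energy
fused = fuse_exposures(bright, dark)
print(fused.shape)  # (2, 2, 3)
```

Because each fused pixel lies between the bright and dark source values, detail from both exposures survives, which is the dynamic-range benefit the text describes.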
Optionally, in this embodiment, the ratio of the visible light in the first optical signal to the visible light in the second optical signal is 4:1. In other words, the aforementioned optical component splits the incident light by both spectrum and energy: 80% of the visible light in the incident optical signal is directed to the low-resolution color image sensor, while 20% of the visible light and 100% of the infrared light are directed to the high-resolution color image sensor. The optical signals actually entering the two image sensors are further adjusted in combination with the optical filters.
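The 4:1 split described above can be checked with a few lines of arithmetic; the function name and interface are illustrative, not from the patent:

```python
def split_incident_light(visible: float, infrared: float):
    """Spectrum-and-energy split assumed above: 80% of visible light to the
    low-resolution sensor; 20% of visible plus all infrared to the other."""
    first_signal = 0.80 * visible                     # to low-resolution sensor
    second_signal = 0.20 * visible + 1.00 * infrared  # to high-resolution sensor
    return first_signal, second_signal

first, second = split_incident_light(visible=100.0, infrared=40.0)
print(first, second)  # 80.0 60.0
ratio = (0.80 * 100.0) / (0.20 * 100.0)
print(ratio)          # 4.0 -> the 4:1 visible-light ratio
```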
In this embodiment, since both image sensors are color image sensors, the high-resolution color image sensor can output a color image when the illumination intensity is greater than the preset value, which a high-resolution black-and-white image sensor cannot. Because the two sensors receive different amounts of visible-light energy, the luminance of the output low-resolution color image differs from that of the output high-resolution color image. Fusing the two images therefore improves the dynamic range and makes the target image more realistic.
The present application also provides an image processing method for performing the functions of the image processor in the foregoing embodiments, for example: adjusting the resolution of the first image to be the same as the resolution of the second image; and generating a target image based on the first image and the second image.
It should be understood that, in the embodiments of the present application, the same reference numerals in different drawings may be regarded as the same objects, and the explanations given for the preceding figures apply unless otherwise specified. It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described systems, apparatuses, and units may refer to the corresponding processes in the foregoing method embodiments, and are not described here again.
The above embodiments are only intended to illustrate the technical solutions of the present application, not to limit them. Although the present application has been described in detail with reference to the foregoing embodiments, modifications may still be made to the technical solutions described therein, or some technical features may be replaced with equivalents; such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions in the embodiments of the present application.

Claims (13)

  1. An image pickup apparatus, comprising:
    the optical component is used for receiving an incident optical signal and processing the incident optical signal into a first optical signal and a second optical signal;
    the first image sensor is used for sensing the first optical signal to generate a first image, and image information of the first image comprises first color information and first brightness information;
    the second image sensor is used for sensing the second optical signal to generate a second image, the image information of the second image comprises second brightness information, the resolution of the first image sensor is smaller than that of the second image sensor, and the resolution of the second image is larger than that of the first image;
    an image processor for generating a target image based on the first image and the second image, the color and brightness of the target image being determined by the image information of the first image and the image information of the second image.
  2. The imaging apparatus according to claim 1, wherein the image processor is specifically configured to:
    adjusting the resolution of the first image to be the same as the resolution of the second image to obtain a third image, wherein the third image carries the first color information and the first brightness information;
    and fusing the third image and the second image to obtain the target image.
  3. The imaging apparatus according to claim 1 or 2, wherein the first optical signal includes visible light, the second optical signal includes visible light and infrared light, energy of the visible light in the first optical signal is greater than energy of the visible light in the second optical signal, and a frequency band of the visible light in the first optical signal is the same as a frequency band of the visible light in the second optical signal.
  4. The image pickup apparatus according to claim 3, wherein said first image sensor is a color image sensor, and said second image sensor is a black-and-white image sensor.
  5. The image pickup apparatus according to claim 4, further comprising an infrared cut filter;
    the camera device is further configured to enable the infrared cut-off filter when the illumination intensity is higher than a preset value, the infrared cut-off filter is located between the optical assembly and the second image sensor, and the infrared cut-off filter is used for filtering out infrared light in the second optical signal;
    the second image sensor is specifically configured to sense visible light in the second optical signal and generate the second image;
    the image processor is specifically configured to combine first color information of the first image with second luminance information of the second image to obtain the target image, where the color of the target image is determined by the first color information, and the luminance of the target image is determined by the second luminance information.
  6. The imaging apparatus according to claim 4, wherein when the illumination intensity is lower than a preset value, the second image sensor is specifically configured to sense visible light and infrared light in the second optical signal to generate the second image;
    the image processor is specifically configured to fuse the third image and the second image to obtain the target image.
  7. The image pickup apparatus according to claim 3, wherein each of said first image sensor and said second image sensor is a color image sensor.
  8. The image pickup apparatus according to claim 7, further comprising an infrared cut filter;
    the camera device is further configured to enable the infrared cut-off filter when the illumination intensity is higher than a preset value, the infrared cut-off filter is located between the optical assembly and the second image sensor, and the infrared cut-off filter is used for filtering out infrared light in the second optical signal;
    the second image sensor is specifically configured to sense visible light in the second optical signal and generate the second image, where image information of the second image further includes second color information;
    the image processor is specifically configured to fuse the third image and the second image to obtain the target image.
  9. The image pickup apparatus according to claim 7, further comprising a visible light cut filter;
    the camera device is further configured to enable the visible light cut-off filter when the illumination intensity is lower than a preset value, the visible light cut-off filter is located between the optical assembly and the second image sensor, and the visible light cut-off filter is configured to filter out visible light in the second optical signal;
    the second image sensor is specifically configured to sense infrared light in the second optical signal and generate the second image;
    the image processor is specifically configured to fuse the third image and the second image to obtain the target image.
  10. The image pickup apparatus according to any one of claims 1 to 9, wherein the optical component includes a lens and a beam splitter prism, the beam splitter prism being located between the lens and the image sensor;
    the lens is used for receiving the incident light signal;
    the beam splitting prism is used for splitting an incident light signal received by the lens into the first light signal and the second light signal.
  11. The image pickup apparatus according to claim 10, wherein said lens is an infrared confocal lens.
  12. The image pickup apparatus according to any one of claims 1 to 9, further comprising an infrared cut filter;
    the optical assembly comprises a first lens and a second lens, the first lens and the second lens are used for receiving the incident light signal together, the focal length of the first lens is the same as that of the second lens, the aperture of the first lens is larger than that of the second lens, the infrared cut-off filter is arranged between the first lens and the first image sensor, and the second lens is an infrared confocal lens;
    the first lens is used for receiving a part of the incident light signal and transmitting the received light signal to the infrared cut-off filter;
    the infrared cut-off filter is used for filtering infrared light in the optical signal from the first lens to obtain the first optical signal and transmitting the first optical signal to the first image sensor;
    the second lens is used for receiving the rest part of the incident light signal and transmitting the received light signal to the second image sensor as a second light signal.
  13. The image pickup apparatus according to claim 1 or 2,
    the format of the first image is a YUV format;
    the format of the second image is a YUV format.
CN202080000505.1A 2020-03-20 2020-03-20 Camera device Pending CN113728618A (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2020/080408 WO2021184353A1 (en) 2020-03-20 2020-03-20 Camera device

Publications (1)

Publication Number Publication Date
CN113728618A true CN113728618A (en) 2021-11-30

Family

ID=77769970

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202080000505.1A Pending CN113728618A (en) 2020-03-20 2020-03-20 Camera device

Country Status (2)

Country Link
CN (1) CN113728618A (en)
WO (1) WO2021184353A1 (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105554483A (en) * 2015-07-16 2016-05-04 宇龙计算机通信科技(深圳)有限公司 Image processing method and terminal
CN106454149A (en) * 2016-11-29 2017-02-22 广东欧珀移动通信有限公司 Image photographing method and device and terminal device
CN107563971A (en) * 2017-08-12 2018-01-09 四川精视科技有限公司 A kind of very color high-definition night-viewing imaging method
JP2018007016A (en) * 2016-07-01 2018-01-11 オリンパス株式会社 Imaging device, imaging method, and program
CN110891138A (en) * 2018-09-10 2020-03-17 杭州萤石软件有限公司 Black light full-color realization method and black light full-color camera

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7964835B2 (en) * 2005-08-25 2011-06-21 Protarius Filo Ag, L.L.C. Digital cameras with direct luminance and chrominance detection
JP2011239259A (en) * 2010-05-12 2011-11-24 Sony Corp Image processing device, image processing method, and program
CN107197168A (en) * 2017-06-01 2017-09-22 松下电器(中国)有限公司苏州系统网络研究开发分公司 The image capturing system of image-pickup method and application this method
CN109040534A (en) * 2017-06-12 2018-12-18 杭州海康威视数字技术股份有限公司 A kind of image processing method and image capture device
CN107820066A (en) * 2017-08-12 2018-03-20 四川聚强创新科技有限公司 A kind of low-luminance color video camera
CN208890917U (en) * 2018-11-14 2019-05-21 杭州海康威视数字技术股份有限公司 A kind of lens assembly and video camera


Also Published As

Publication number Publication date
WO2021184353A1 (en) 2021-09-23


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination