WO2017101641A1 - Imaging method of image sensor, imaging apparatus and electronic device - Google Patents

Imaging method of image sensor, imaging apparatus and electronic device

Info

Publication number
WO2017101641A1
WO2017101641A1 (PCT/CN2016/106800)
Authority
WO
WIPO (PCT)
Prior art keywords
pixel
image
scene
component
output values
Prior art date
Application number
PCT/CN2016/106800
Other languages
English (en)
French (fr)
Inventor
毛水江
郭先清
Original Assignee
比亚迪股份有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 比亚迪股份有限公司 filed Critical 比亚迪股份有限公司
Priority to US15/777,796 priority Critical patent/US20180350860A1/en
Priority to EP16874698.0A priority patent/EP3393124A4/en
Publication of WO2017101641A1 publication Critical patent/WO2017101641A1/zh

Links

Images

Classifications

    • HELECTRICITY
    • H01ELECTRIC ELEMENTS
    • H01LSEMICONDUCTOR DEVICES NOT COVERED BY CLASS H10
    • H01L27/00Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate
    • H01L27/14Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate including semiconductor components sensitive to infrared radiation, light, electromagnetic radiation of shorter wavelength or corpuscular radiation and specially adapted either for the conversion of the energy of such radiation into electrical energy or for the control of electrical energy by such radiation
    • H01L27/144Devices controlled by radiation
    • H01L27/146Imager structures
    • H01L27/14601Structural or functional details thereof
    • H01L27/14609Pixel-elements with integrated switching, control, storage or amplification elements
    • HELECTRICITY
    • H01ELECTRIC ELEMENTS
    • H01LSEMICONDUCTOR DEVICES NOT COVERED BY CLASS H10
    • H01L27/00Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate
    • H01L27/14Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate including semiconductor components sensitive to infrared radiation, light, electromagnetic radiation of shorter wavelength or corpuscular radiation and specially adapted either for the conversion of the energy of such radiation into electrical energy or for the control of electrical energy by such radiation
    • H01L27/144Devices controlled by radiation
    • H01L27/146Imager structures
    • H01L27/148Charge coupled imagers
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/10Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from different wavelengths
    • H04N23/11Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from different wavelengths for generating image signals from visible and infrared light wavelengths
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/80Camera processing pipelines; Components thereof
    • H04N23/84Camera processing pipelines; Components thereof for processing colour signals
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/80Camera processing pipelines; Components thereof
    • H04N23/84Camera processing pipelines; Components thereof for processing colour signals
    • H04N23/843Demosaicing, e.g. interpolating colour pixel values
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00Circuitry of solid-state image sensors [SSIS]; Control thereof
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/10Circuitry of solid-state image sensors [SSIS]; Control thereof for transforming different wavelengths into image signals
    • H04N25/11Arrangement of colour filter arrays [CFA]; Filter mosaics
    • H04N25/13Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements
    • H04N25/131Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements including elements passing infrared wavelengths
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/50Control of the SSIS exposure

Definitions

  • the present application relates to the field of imaging technologies, and in particular, to an imaging method, an imaging device, and an electronic device of an image sensor.
  • To reduce cost and sensor area, the pixel size of the image sensor has become smaller and smaller, which greatly affects the imaging quality of the sensor; one of the most affected indicators is the low-light performance of the image sensor.
  • The smaller the pixel, the lower the sensitivity of the image sensor and the dimmer the image in low light.
  • To improve low-light brightness, the solutions adopted in the related art are: 1. increasing the analog or digital gain; 2. adding a brightness offset to the image in the image processing stage; 3. using an all-pass lens, i.e., a lens that passes both visible light and infrared light.
  • Here, visible light is light that the human eye can see, and infrared light refers to light at a wavelength of around 850 nm that is invisible to the human eye.
  • Compared with an ordinary lens, which passes only visible light, an all-pass lens also passes infrared light, so the image obtained by the image sensor is brighter, but the image is prone to color cast in the daytime.
  • the present application aims to solve at least one of the technical problems in the related art to some extent.
  • One object of the present application is to propose an imaging method of an image sensor, which greatly improves the brightness of the image when shooting in a dark scene and does not cause image color cast when shooting in a non-dark scene, thereby improving the user experience.
  • a second object of the present application is to propose an image forming apparatus.
  • a third object of the present application is to propose an electronic device.
  • According to a first aspect of the present application, an imaging method of an image sensor is provided. The image sensor includes a pixel array and a microlens array disposed on the pixel array, wherein each group of four adjacent pixels in the pixel array includes one red pixel, one green pixel, one blue pixel, and one infrared pixel, and the microlens array includes a plurality of microlenses, each of which covers one pixel.
  • The imaging method comprises the following steps: reading the output of each pixel in the pixel array; performing interpolation processing on the output of each pixel to obtain a red component, a green component, a blue component, and an infrared component corresponding to each pixel; acquiring the type of the current shooting scene; and determining the three primary color output values of each pixel according to the type of the current shooting scene, so as to generate an image according to the three primary color output values of each pixel.
  • With the imaging method of the image sensor of the embodiments of the present application, the brightness of the image when shooting in a dark scene is greatly improved, and no image color cast is caused when shooting in a non-dark scene, thereby improving the user experience.
  • According to a second aspect of the present application, an imaging apparatus includes: an image sensor, the image sensor including a pixel array in which each group of four adjacent pixels includes one red pixel, one green pixel, one blue pixel, and one infrared pixel, and a microlens array disposed on the pixel array, the microlens array including a plurality of microlenses, each of which covers one pixel; and an image processing module connected to the image sensor, the image processing module being configured to read the output of each pixel in the pixel array, perform interpolation processing on the output of each pixel to obtain a red component, a green component, a blue component, and an infrared component corresponding to each pixel, acquire the type of the current shooting scene, and determine the three primary color output values of each pixel according to the type of the current shooting scene, so as to generate an image according to the three primary color output values of each pixel.
  • the brightness of the image at the time of dark scene shooting is greatly improved, and the image color cast at the time of shooting in a non-dark scene is not caused, thereby improving the user experience.
  • an electronic device includes the imaging device of the second aspect of the present application.
  • Since the electronic device is provided with the imaging apparatus, the brightness of the image when shooting in a dark scene is greatly improved, and no image color cast is caused when shooting in a non-dark scene, thereby improving the user experience.
  • FIG. 1 is a schematic diagram of a workflow of a CMOS image sensor
  • FIG. 2 is a flow chart of an image forming method of an image sensor according to an embodiment of the present application
  • FIG. 3 is a schematic diagram of R, G, B, and IR channel response curves
  • FIG. 5 is a schematic diagram of a pixel array in an image sensor according to an embodiment of the present application.
  • FIG. 6 is a block schematic diagram of an image forming apparatus according to an embodiment of the present application.
  • FIG. 7 is a schematic diagram of a microlens and a pixel covered thereby according to an embodiment of the present application.
  • FIG. 8 is a block schematic diagram of an electronic device in accordance with an embodiment of the present application.
  • As shown in FIG. 1, the workflow of a CMOS image sensor is: (1) the pixel array portion of the image sensor converts the optical signal into an electrical signal by photoelectric induction; (2) the electrical signal is processed by the analog circuit processing portion; (3) the analog electrical signal is converted into a digital signal by the analog-to-digital conversion portion; (4) the digital signal is processed by the digital processing portion; and finally (5) the image data output control portion outputs the result to a display.
  • the image sensor comprises a pixel array and a microlens array disposed on the pixel array.
  • Each adjacent four pixels in the pixel array includes one red pixel, one green pixel, one blue pixel, and one infrared pixel.
  • the microlens array includes a plurality of microlenses. Each microlens corresponds to one pixel.
  • each pixel in the array of pixels includes a filter and a photosensitive device covered by the filter.
  • A red filter and the photosensitive device it covers form a red pixel, a green filter and the photosensitive device it covers form a green pixel, a blue filter and the photosensitive device it covers form a blue pixel, and an infrared filter and the photosensitive device it covers form an infrared pixel.
  • The microlenses corresponding to the red, green, and blue pixels allow only visible light to pass, and the microlens corresponding to the infrared pixel allows only near-infrared light to pass.
  • For example, the microlenses on the red pixel R, the green pixel G, and the blue pixel B pass only visible light with a wavelength below 650 nm, while the microlens on the infrared pixel ir passes only near-infrared light with a wavelength above 650 nm, around 850 nm, as shown in FIG. 3.
  • The image sensor pixel array commonly used in the related art is the Bayer array, as shown in FIG. 4, where B represents the blue component of the three primary colors, G represents the green component, and R represents the red component.
  • In the embodiments of the present application, the pixel array employed by the image sensor is as shown in FIG. 5; that is, one of the green components G of the Bayer array is replaced with a component ir that senses only infrared light.
  • R shown in FIG. 5 is a red component that passes only visible light (R is configured to pass the red component of the visible light band that contains no infrared component), G is a green component that passes only visible light (G is configured to pass the green component of the visible light band that contains no infrared component), and B is a blue component that passes only visible light (B is configured to pass the blue component of the visible light band that contains no infrared component).
  • the image sensor is a CMOS image sensor.
  • an image sensor imaging method includes:
  • After the CMOS image sensor is exposed, it outputs a raw image signal in which each pixel contains only one color component. Producing the raw image signal is a photoelectric conversion process: the CMOS image sensor converts the external light signal into an electrical signal through photodiodes, processes it through analog circuitry, and then converts the analog signal into a digital signal through an analog-to-digital converter for subsequent digital signal processing.
  • Reading the output of each pixel in the pixel array means reading the digital image signal of each pixel in the pixel array.
  • The output of each pixel contains only one color component; for example, the output of a red pixel contains only a red component.
  • Since each pixel's output contains only one color component, interpolation must be performed on the output of each pixel to obtain the four components R, G, B, and ir of each pixel.
  • For example, the output of a red pixel contains only the red component R, and the other color components G, B, and ir of that pixel can be obtained by interpolation.
  • After interpolation, each pixel has all four color components R, G, B, and ir.
  • The interpolation processing of the output of each pixel uses any one of the following interpolation methods: nearest-neighbor interpolation, bilinear interpolation, and edge-adaptive interpolation.
  • acquiring the type of the current shooting scene specifically includes: acquiring an exposure time of the pixel array; determining whether the exposure time is greater than a preset exposure time threshold; and if the exposure time is greater than the preset exposure time threshold, determining the current The shooting scene is a dark scene; if the exposure time is less than or equal to the preset exposure time threshold, it is determined that the current shooting scene is a non-dark scene.
  • It takes a certain amount of time for the image sensor to collect light; this time is called the exposure time T.
  • The longer the exposure time T, the brighter the image sensed by the image sensor.
  • For normal daytime scenes, because the ambient light is strong, the image sensor needs only a short exposure time to reach the desired target brightness; but for dark scenes such as at night, because the ambient light is weak, the image sensor needs a longer exposure time.
  • A long exposure time means that it takes a long time to capture one image.
  • To meet the frame-rate requirement (i.e., the number of images captured per second), the exposure time has an upper limit T_th (i.e., the preset exposure time threshold), so whether the scene is dark or non-dark can be determined by comparing the exposure time T with T_th.
  • If the current shooting scene is a non-dark scene, the three primary color output values of each pixel are determined according to the red component, the green component, and the blue component corresponding to each pixel.
  • The image sensed by the image sensor is displayed on the display in the form of the three primary colors.
  • For a non-dark scene, the output values of the three primary colors of each pixel are: R' = R, G' = G, B' = B, where R', G', and B' respectively represent the three primary color output values of a pixel, R represents the red component corresponding to the pixel, G represents the green component corresponding to the pixel, and B represents the blue component corresponding to the pixel.
  • In this way, only the R, G, and B components of visible light are used in a non-dark scene, ensuring that the image of the non-dark scene has no color cast.
  • If the current shooting scene is a dark scene, the three primary color output values of each pixel are determined according to the red component, the green component, the blue component, and the infrared component corresponding to each pixel; that is, the output values of the three primary colors of each pixel are: R' = R + ir, G' = G + ir, B' = B + ir, where R', G', and B' respectively represent the three primary color output values of a pixel, R represents the red component corresponding to the pixel, G represents the green component corresponding to the pixel, B represents the blue component corresponding to the pixel, and ir represents the infrared component corresponding to the pixel.
  • the brightness of the image can be improved by superimposing the infrared component in the dark scene, because many dark scenes of the monitoring product are not required for color, and only the brightness and sharpness of the image are emphasized, so the dark scene is finally output in the form of a black and white image.
  • the embodiment of the present application is beneficial in that when in a dark scene, the brightness of the image is increased from the data source, and therefore, no amplification of noise is caused;
  • the embodiment of the application increases the amount of light that the image sensor senses, instead of adding a brightness to the image as a whole, so that the image is not blurred;
  • In a non-dark scene, the embodiments of the present application use the three primary colors R, G, and B that pass only visible light, which does not affect the color of the image, while in a dark scene the infrared component ir is added to raise the brightness of the image; image quality is thus greatly improved.
  • The imaging method of the image sensor of the embodiments of the present application greatly improves the brightness of the image when shooting in a dark scene and does not cause image color cast when shooting in a non-dark scene, thereby improving the user experience.
  • the present application also proposes an image forming apparatus.
  • FIG. 6 is a block schematic diagram of an image forming apparatus according to an embodiment of the present application.
  • the imaging apparatus 100 of the embodiment of the present application includes an image sensor 10 and an image processing module 20.
  • the image sensor 10 includes a pixel array 11 and a microlens array 12 disposed on the pixel array 11.
  • Each group of four adjacent pixels 111 in the pixel array 11 includes one red pixel R, one green pixel G, one blue pixel B, and one infrared pixel ir; that is, one of the green components G of the Bayer array is replaced with a component ir that senses only infrared light.
  • the microlens array 12 disposed above the pixel array 11 includes a plurality of microlenses 121, each of which covers one pixel 111, as shown in FIG.
  • each of the pixels 111 in the pixel array 11 includes a filter 1111 and a photosensitive device 1112 covered by the filter 1111, wherein the red filter and the photosensitive device covered thereon form a red pixel.
  • the microlens corresponding to the red pixel, the green pixel, and the blue pixel only allow visible light to pass through, and the microlens corresponding to the infrared pixel allows only near-infrared light to pass through.
  • the microlens 121 on the red pixel R, the green pixel G, and the blue pixel B passes only visible light having a wavelength of 650 nm or less
  • the microlens 121 on the infrared pixel ir passes only near-infrared light having a wavelength of 650 nm or more and 850 nm or so.
  • image sensor 10 is a CMOS image sensor.
  • The image processing module 20, connected to the image sensor 10, is configured to read the output of each pixel in the pixel array (i.e., the digital image signal of each pixel in the pixel array), perform interpolation processing on the output of each pixel to obtain the red, green, blue, and infrared components corresponding to each pixel, acquire the type of the current shooting scene, and determine the three primary color output values of each pixel according to the type of the current shooting scene, so as to generate an image according to the three primary color output values of each pixel.
  • After the image sensor 10 is exposed, it produces a raw image signal in which each pixel contains only one color component.
  • Producing the raw image signal is a photoelectric conversion process: the image sensor 10 converts the external light signal into an electrical signal through photodiodes, processes it through analog circuitry, and then converts the analog signal into a digital signal through an analog-to-digital converter for processing by the image processing module 20.
  • image processing module 20 reads the output of each pixel in the pixel array, the output of each pixel containing only one color component, for example, the output of the red pixel contains only one red component. Since the output of each pixel contains only one color component, the output of each pixel needs to be interpolated to obtain the four components R, G, B, and ir of each pixel.
  • the output of the red pixel includes only the red component R, and the image processing module 20 interpolates the red pixel to obtain other color components G, B, and ir of the red pixel.
  • each pixel has four color components R, G, B, and ir.
  • the interpolation processing of the output of each pixel employs any of the following interpolation methods: adjacent point interpolation, bilinear interpolation, and edge adaptive interpolation.
  • the image processing module 20 acquires the type of the current shooting scene, and determines the three primary color output values of each pixel according to the type of the current shooting scene to generate an image according to the three primary color output values of each pixel. The details will be described below.
  • the image processing module 20 is specifically configured to acquire an exposure time of the pixel array, and determine whether the exposure time is greater than or equal to a preset exposure time threshold, and if the exposure time is greater than or equal to the preset exposure time threshold, It is determined that the current shooting scene is a dark scene, and if the exposure time is less than the preset exposure time threshold, it is determined that the current shooting scene is a non-dark scene.
  • It takes a certain amount of time for the image sensor 10 to collect light; this time is called the exposure time T.
  • The longer the exposure time T, the brighter the image sensed by the image sensor 10.
  • For normal daytime scenes, because the ambient light is strong, the image sensor 10 needs only a short exposure time to reach the desired target brightness; but for dark scenes such as at night, because the ambient light is weak, the image sensor 10 needs a longer exposure time.
  • A long exposure time means that it takes a long time to capture one image; to meet the frame-rate requirement (that is, the number of images captured in one second), the exposure time has an upper limit T_th (i.e., the preset exposure time threshold).
  • The image processing module 20 can therefore determine whether the scene is dark or non-dark by comparing the exposure time T with the upper limit T_th: when T is less than T_th the scene is non-dark, and otherwise it is dark.
  • the image processing module 20 is specifically configured to determine three primary colors of each pixel according to a red component, a green component, and a blue component corresponding to each pixel when the current shooting scene is a non-dark scene. output value.
  • The image sensed by the image sensor 10 is displayed on the display in the format of the three primary colors. For a non-dark scene, the output values of the three primary colors of each pixel are: R' = R, G' = G, B' = B, where R', G', and B' respectively represent the three primary color output values of a pixel, R represents the red component corresponding to the pixel, G represents the green component corresponding to the pixel, and B represents the blue component corresponding to the pixel.
  • the R, G, and B components passing only visible light are used in the non-dark scene to ensure that the image of the non-dark scene is not color cast.
  • The image processing module 20 is specifically configured to, when the current shooting scene is a dark scene, determine the three primary color output values of each pixel according to the red component, the green component, the blue component, and the infrared component corresponding to each pixel; that is, the output values of the three primary colors of each pixel are: R' = R + ir, G' = G + ir, B' = B + ir, where R', G', and B' respectively represent the three primary color output values of a pixel, R represents the red component corresponding to the pixel, G represents the green component corresponding to the pixel, B represents the blue component corresponding to the pixel, and ir represents the infrared component corresponding to the pixel.
  • the brightness of the image can be improved by superimposing the infrared component in the dark scene, because many dark scenes of the monitoring product are not required for color, and only the brightness and sharpness of the image are emphasized, so the dark scene is finally output in the form of a black and white image.
  • the imaging device of the embodiment of the present application greatly improves the brightness of an image when shooting in a dark scene, and does not cause image color cast when shooting in a non-dark scene, thereby improving the user experience.
  • the present application also proposes an electronic device 200.
  • the electronic device 200 includes the imaging device 100 of the embodiment of the present application.
  • the electronic device 200 is a monitoring device.
  • the electronic device of the embodiment of the present application has the imaging device, the brightness of the image during dark scene shooting is greatly improved, and the image color cast when shooting in a non-dark scene is not caused, thereby improving the user experience.
  • first and second are used for descriptive purposes only and are not to be construed as indicating or implying a relative importance or implicitly indicating the number of technical features indicated.
  • features defining “first” or “second” may include at least one of the features, either explicitly or implicitly.
  • the meaning of "a plurality” is at least two, such as two, three, etc., unless specifically defined otherwise.
  • The terms "mounted", "connected to", "connected", "fixed", and the like shall be understood broadly unless otherwise explicitly stated and defined; for example, a connection may be a fixed connection, a detachable connection, or an integral connection; it may be a mechanical connection or an electrical connection; it may be a direct connection or an indirect connection through an intermediate medium; and it may be an internal communication between two elements or an interaction between two elements, unless otherwise explicitly limited.
  • The specific meanings of the above terms in the present application can be understood on a case-by-case basis.
  • Unless otherwise explicitly stated and defined, a first feature being "on" or "below" a second feature may mean that the first and second features are in direct contact, or that the first and second features are in indirect contact through an intermediate medium.
  • A first feature being "over", "above", or "on top of" a second feature may mean that the first feature is directly above or obliquely above the second feature, or merely that the first feature is at a higher level than the second feature.
  • A first feature being "under", "below", or "beneath" a second feature may mean that the first feature is directly below or obliquely below the second feature, or merely that the first feature is at a lower level than the second feature.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Power Engineering (AREA)
  • Electromagnetism (AREA)
  • Condensed Matter Physics & Semiconductors (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • Microelectronics & Electronic Packaging (AREA)
  • Spectroscopy & Molecular Physics (AREA)
  • Color Television Image Signal Generators (AREA)

Abstract

The present application discloses an imaging method of an image sensor, an imaging apparatus, and an electronic device. The image sensor includes a pixel array and a microlens array disposed on the pixel array; each group of four adjacent pixels in the pixel array includes one red pixel, one green pixel, one blue pixel, and one infrared pixel; the microlens array includes a plurality of microlenses, each covering one pixel. The imaging method includes: reading the output of each pixel in the pixel array; performing interpolation processing on the output of each pixel to obtain a red component, a green component, a blue component, and an infrared component corresponding to each pixel; acquiring the type of the current shooting scene; and determining the three primary color output values of each pixel according to the type of the current shooting scene, so as to generate an image according to the three primary color output values of each pixel.

Description

Imaging method of image sensor, imaging apparatus and electronic device
CROSS-REFERENCE TO RELATED APPLICATIONS
This application is based on and claims priority to Chinese Patent Application No. 201510925379.1, filed on December 14, 2015, the entire contents of which are incorporated herein by reference.
TECHNICAL FIELD
The present application relates to the field of imaging technologies, and in particular to an imaging method of an image sensor, an imaging apparatus, and an electronic device.
BACKGROUND
In recent years image sensors have developed rapidly, sales keep climbing, and market competition is fierce; while prices keep falling, the requirements on image quality keep rising. To reduce cost and sensor area, the pixel size of image sensors has become smaller and smaller, which greatly affects the imaging quality of the sensor; one of the most affected indicators is the low-light performance of the image sensor. The smaller the pixel, the lower the sensitivity of the image sensor and the dimmer the image in low light. To improve low-light brightness, the solutions adopted in the related art are: 1. increasing the analog or digital gain; 2. adding a brightness offset to the image in the image processing stage; 3. using an all-pass lens, i.e., a lens that passes both visible light and infrared light, where visible light is light that the human eye can see and infrared light refers to light at a wavelength of around 850 nm that the human eye cannot see.
However, the above solutions have the following disadvantages:
(1) Increasing the analog or digital gain means multiplying the image signal by a factor greater than 1. This amplifies the image signal and thus raises the image brightness, but the image noise is amplified by the same factor, so the amplified image is too noisy.
(2) Adding a brightness offset to the whole image in the image processing stage can raise the low-light brightness of the image, but it also reduces the contrast between image details and non-details, making the image look very blurry.
(3) Compared with an ordinary lens, which lets only visible light pass, an all-pass lens also passes infrared light, so the image obtained by the image sensor is brighter, but the image is prone to color cast in the daytime.
SUMMARY
The present application aims to solve at least one of the technical problems in the related art to some extent. To this end, one object of the present application is to propose an imaging method of an image sensor, which greatly improves the brightness of the image when shooting in a dark scene without causing image color cast when shooting in a non-dark scene, thereby improving the user experience.
A second object of the present application is to propose an imaging apparatus.
A third object of the present application is to propose an electronic device.
To achieve the above objects, an embodiment of the first aspect of the present application provides an imaging method of an image sensor. The image sensor includes a pixel array and a microlens array disposed on the pixel array, wherein each group of four adjacent pixels in the pixel array includes one red pixel, one green pixel, one blue pixel, and one infrared pixel, the microlens array includes a plurality of microlenses, and each microlens covers one pixel. The imaging method includes the following steps: reading the output of each pixel in the pixel array; performing interpolation processing on the output of each pixel to obtain a red component, a green component, a blue component, and an infrared component corresponding to each pixel; acquiring the type of the current shooting scene; and determining the three primary color output values of each pixel according to the type of the current shooting scene, so as to generate an image according to the three primary color output values of each pixel.
With the imaging method of the image sensor according to the embodiments of the present application, the brightness of the image when shooting in a dark scene is greatly improved, and no image color cast is caused when shooting in a non-dark scene, thereby improving the user experience.
To achieve the above objects, an embodiment of the second aspect of the present application provides an imaging apparatus, including: an image sensor, the image sensor including a pixel array in which each group of four adjacent pixels includes one red pixel, one green pixel, one blue pixel, and one infrared pixel, and a microlens array disposed on the pixel array, the microlens array including a plurality of microlenses, each of which covers one pixel; and an image processing module connected to the image sensor, the image processing module being configured to read the output of each pixel in the pixel array, perform interpolation processing on the output of each pixel to obtain a red component, a green component, a blue component, and an infrared component corresponding to each pixel, acquire the type of the current shooting scene, and determine the three primary color output values of each pixel according to the type of the current shooting scene, so as to generate an image according to the three primary color output values of each pixel.
With the imaging apparatus according to the embodiments of the present application, the brightness of the image when shooting in a dark scene is greatly improved, and no image color cast is caused when shooting in a non-dark scene, thereby improving the user experience.
To achieve the above objects, an embodiment of the third aspect of the present application provides an electronic device, including the imaging apparatus of the embodiment of the second aspect of the present application.
Since the electronic device according to the embodiments of the present application is provided with the imaging apparatus, the brightness of the image when shooting in a dark scene is greatly improved, and no image color cast is caused when shooting in a non-dark scene, thereby improving the user experience.
BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 is a schematic diagram of the workflow of a CMOS image sensor;
FIG. 2 is a flowchart of an imaging method of an image sensor according to an embodiment of the present application;
FIG. 3 is a schematic diagram of R, G, B, and IR channel response curves;
FIG. 4 is a schematic diagram of a classic Bayer array in the related art;
FIG. 5 is a schematic diagram of a pixel array in an image sensor according to an embodiment of the present application;
FIG. 6 is a block diagram of an imaging apparatus according to an embodiment of the present application;
FIG. 7 is a schematic diagram of a microlens and the pixel it covers according to an embodiment of the present application;
FIG. 8 is a block diagram of an electronic device according to an embodiment of the present application.
DETAILED DESCRIPTION
Embodiments of the present application are described in detail below, and examples of the embodiments are shown in the accompanying drawings, in which the same or similar reference numerals throughout denote the same or similar elements or elements having the same or similar functions. The embodiments described below with reference to the drawings are exemplary and intended to explain the present application, and should not be construed as limiting the present application.
First, the workflow of a CMOS image sensor in the related art is introduced. As shown in FIG. 1: (1) the pixel array portion of the image sensor converts the optical signal into an electrical signal by photoelectric induction; (2) the electrical signal is processed by the analog circuit processing portion; (3) the analog electrical signal is converted into a digital signal by the analog-to-digital conversion portion; (4) the digital signal is processed by the digital processing portion; and finally, (5) the image data output control portion outputs the result to a display.
The imaging method of an image sensor, the imaging apparatus, and the electronic device according to embodiments of the present application are described below with reference to the accompanying drawings.
FIG. 2 is a flowchart of an imaging method of an image sensor according to an embodiment of the present application. The image sensor includes a pixel array and a microlens array disposed on the pixel array. Each group of four adjacent pixels in the pixel array includes one red pixel, one green pixel, one blue pixel, and one infrared pixel. The microlens array includes a plurality of microlenses. Each microlens covers one pixel.
In one embodiment of the present application, each pixel in the pixel array includes a filter and a photosensitive device covered by the filter. A red filter and the photosensitive device it covers form a red pixel, a green filter and the photosensitive device it covers form a green pixel, a blue filter and the photosensitive device it covers form a blue pixel, and an infrared filter and the photosensitive device it covers form an infrared pixel.
In one embodiment of the present application, the microlenses corresponding to the red, green, and blue pixels allow only visible light to pass, and the microlens corresponding to the infrared pixel allows only near-infrared light to pass.
Specifically, during the design and manufacture of the image sensor, the microlens of each pixel requires special treatment. For example, the microlenses on the red pixel R, the green pixel G, and the blue pixel B pass only visible light with a wavelength below 650 nm, while the microlens on the infrared pixel ir passes only near-infrared light with a wavelength above 650 nm, around 850 nm, as shown in FIG. 3.
The image sensor pixel array commonly used in the related art is the Bayer array, as shown in FIG. 4, where B represents the blue component of the three primary colors, G represents the green component, and R represents the red component.
In the embodiments of the present application, the pixel array employed by the image sensor is as shown in FIG. 5; that is, one of the green components G of the Bayer array is replaced with a component ir that senses only infrared light.
Specifically, R shown in FIG. 5 is a red component that passes only visible light (R is configured to pass the red component of the visible light band that contains no infrared component), G is a green component that passes only visible light (G is configured to pass the green component of the visible light band that contains no infrared component), and B is a blue component that passes only visible light (B is configured to pass the blue component of the visible light band that contains no infrared component).
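For illustration only, the following minimal sketch (the channel codes, the function name, and the exact placement of the ir site within each 2x2 tile are assumptions, not taken from the patent) builds the kind of mosaic described above by replacing one green site of a Bayer-style 2x2 tile with an infrared site.

```python
import numpy as np

# Channel codes used throughout the sketches below (hypothetical convention).
R, G, B, IR = 0, 1, 2, 3

def rgbir_mosaic_pattern(height, width):
    """Return an (height, width) array of channel codes for the RGB-IR array.

    Each 2x2 tile is [[R, G], [IR, B]]: a Bayer RGGB tile with one of the two
    green sites replaced by an infrared site, in the spirit of FIG. 5. The
    exact position of ir inside the tile is an assumption made here.
    """
    tile = np.array([[R, G],
                     [IR, B]])
    reps = ((height + 1) // 2, (width + 1) // 2)
    return np.tile(tile, reps)[:height, :width]

if __name__ == "__main__":
    print(rgbir_mosaic_pattern(4, 4))
```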
In one embodiment of the present application, the image sensor is a CMOS image sensor.
As shown in FIG. 2, the imaging method of the image sensor includes:
S1: reading the output of each pixel in the pixel array, i.e., the digital image signal of each pixel in the pixel array.
After the CMOS image sensor is exposed, it outputs a raw image signal in which each pixel contains only one color component. Producing the raw image signal is a photoelectric conversion process: the CMOS image sensor converts the external light signal into an electrical signal through photodiodes, processes it through analog circuitry, and then converts the analog signal into a digital signal through an analog-to-digital converter for subsequent digital signal processing.
Specifically, reading the output of each pixel in the pixel array means reading the digital image signal of each pixel in the pixel array; the output of each pixel contains only one color component. For example, the output of a red pixel contains only a red component.
S2: performing interpolation processing on the output of each pixel to obtain the red, green, blue, and infrared components corresponding to each pixel.
Specifically, since the output of each pixel contains only one color component, interpolation must be performed on the output of each pixel to obtain the four components R, G, B, and ir of each pixel.
For example, the output of a red pixel contains only the red component R; interpolating that pixel yields its other color components G, B, and ir. In this way, after interpolation, every pixel has four color components R, G, B, and ir.
In one embodiment of the present application, the interpolation processing of the output of each pixel uses any one of the following interpolation methods: nearest-neighbor interpolation, bilinear interpolation, and edge-adaptive interpolation.
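As a rough illustration of the interpolation step, the sketch below fills in all four components R, G, B, and ir at every pixel by averaging same-channel samples in a 3x3 neighborhood. This is a simple stand-in for the nearest-neighbor, bilinear, or edge-adaptive methods named above, not the patent's specific algorithm; the window size and the reflective edge handling are assumptions.

```python
import numpy as np

def demosaic_rgbir(raw, pattern):
    """Estimate R, G, B and ir at every pixel of a single-channel mosaic.

    raw     : (H, W) float array, one measured component per pixel.
    pattern : (H, W) int array of channel codes (0=R, 1=G, 2=B, 3=IR),
              e.g. from rgbir_mosaic_pattern() in the earlier sketch.
    Returns an (H, W, 4) array where missing components are filled by
    averaging same-channel samples inside a 3x3 window around each pixel.
    """
    h, w = raw.shape
    out = np.zeros((h, w, 4), dtype=float)
    padded_raw = np.pad(raw.astype(float), 1, mode="reflect")
    padded_pat = np.pad(pattern, 1, mode="reflect")
    for y in range(h):
        for x in range(w):
            win_raw = padded_raw[y:y + 3, x:x + 3]
            win_pat = padded_pat[y:y + 3, x:x + 3]
            for c in range(4):
                samples = win_raw[win_pat == c]
                out[y, x, c] = samples.mean() if samples.size else 0.0
    return out
```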
S3,获取当前拍摄场景的类型。
在本申请的一个实施例中,获取当前拍摄场景的类型,具体包括:获取像素阵列的曝光时间;判断曝光时间是否大于预设曝光时间阈值;如果曝光时间大于预设曝光时间阈值,则判断当前拍摄场景为暗场景;如果曝光时间小于或等于预设曝光时间阈值,则判断当前拍摄场景为非暗场景。
具体地,图像传感器感光需要一定的时间,这个时间称为曝光时间T,曝光时间T越长,图像传感器感应出来的图像亮度就越亮。对于白天正常场景,因为环境光线强,图像 传感器只需要很短的曝光时间就能达到想要的目标亮度,但对于晚上等暗场景,因为环境光线很低,图像传感器就需要更长的曝光时间。曝光时间长就意味着很长时间才能感应出来一张图像,为了满足帧率(即1秒钟感应的图像数量)的要求,曝光时间会有一个上限值T_th(即预设曝光时间阈值),因此,可以通过比较曝光时间T和上限值T_th来判断是暗场景还是非暗场景,当T小于T_th时为非暗场景,反之则为暗场景。
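A minimal sketch of this scene-type decision follows; the enum names and the example threshold value are assumptions, since the text only requires comparing the exposure time with a preset threshold.

```python
from enum import Enum

class SceneType(Enum):
    DARK = "dark"
    NON_DARK = "non_dark"

def classify_scene(exposure_time_s, exposure_threshold_s):
    """Classify the current shooting scene from the pixel array's exposure time.

    An exposure at or above the frame-rate-limited upper bound T_th indicates
    weak ambient light (dark scene); a shorter exposure indicates a non-dark scene.
    """
    if exposure_time_s >= exposure_threshold_s:
        return SceneType.DARK
    return SceneType.NON_DARK

# Example: with an assumed upper bound of 1/30 s, a 1/100 s exposure is non-dark.
assert classify_scene(1 / 100, 1 / 30) is SceneType.NON_DARK
```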
S4: determining the three primary color output values of each pixel according to the type of the current shooting scene, so as to generate an image according to the three primary color output values of each pixel.
In one embodiment of the present application, if the current shooting scene is a non-dark scene, the three primary color output values of each pixel are determined according to the red, green, and blue components corresponding to that pixel. The image sensed by the image sensor is displayed on the display in the format of the three primary colors; for a non-dark scene, the three primary color output values of each pixel are:
R' = R, G' = G, B' = B,
where R', G', and B' respectively represent the three primary color output values of a pixel, R represents the red component corresponding to the pixel, G represents the green component corresponding to the pixel, and B represents the blue component corresponding to the pixel.
In this way, only the visible-light R, G, and B components are used in a non-dark scene, ensuring that the image of the non-dark scene has no color cast.
In one embodiment of the present application, if the current shooting scene is a dark scene, the three primary color output values of each pixel are determined according to the red, green, blue, and infrared components corresponding to that pixel; that is, the three primary color output values of each pixel are:
R' = R + ir, G' = G + ir, B' = B + ir,
where R', G', and B' respectively represent the three primary color output values of a pixel, R represents the red component corresponding to the pixel, G represents the green component corresponding to the pixel, B represents the blue component corresponding to the pixel, and ir represents the infrared component corresponding to the pixel.
In this way, superimposing the infrared component in a dark scene raises the brightness of the image. Because many surveillance products in dark scenes have low color requirements and care only about image brightness and sharpness, the dark scene is finally output as a black-and-white image.
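Building on the hypothetical helpers sketched above, the per-pixel output rule for the two scene types could be illustrated as follows; clipping to an 8-bit display range is an assumption added here.

```python
import numpy as np

def primary_color_outputs(components, scene):
    """Compute per-pixel R', G', B' from the interpolated components.

    components : (H, W, 4) array of R, G, B, ir per pixel (see demosaic_rgbir).
    scene      : SceneType value from classify_scene().
    Non-dark scenes use only the visible-light components (R'=R, G'=G, B'=B);
    dark scenes superimpose the infrared component (R'=R+ir, G'=G+ir, B'=B+ir).
    """
    rgb = components[..., :3].astype(float)
    if scene is SceneType.DARK:
        rgb = rgb + components[..., 3:4]  # add ir to each of R, G, B
    # Clipping to an 8-bit range is an assumption for display purposes.
    return np.clip(rgb, 0, 255).astype(np.uint8)
```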
Compared with the related-art solutions for improving low-light brightness, the embodiments of the present application are beneficial in the following respects: in a dark scene, the image brightness is raised at the data source, so no amplification of noise is introduced; the embodiments increase the amount of light sensed by the image sensor rather than adding a brightness offset to the whole image, so the image does not become blurry; and in a non-dark scene the embodiments use the three primary colors R, G, and B that pass only visible light, which does not affect the color of the image, while in a dark scene the infrared component ir is added, which raises the image brightness in dark scenes. Image quality is thus greatly improved.
With the imaging method of the image sensor according to the embodiments of the present application, the brightness of the image when shooting in a dark scene is greatly improved, and no image color cast is caused when shooting in a non-dark scene, thereby improving the user experience.
To implement the above embodiments, the present application further proposes an imaging apparatus.
FIG. 6 is a block diagram of an imaging apparatus according to an embodiment of the present application. As shown in FIG. 6, the imaging apparatus 100 of the embodiment of the present application includes an image sensor 10 and an image processing module 20.
The image sensor 10 includes a pixel array 11 and a microlens array 12 disposed on the pixel array 11.
As shown in FIG. 5, each group of four adjacent pixels 111 in the pixel array 11 includes one red pixel R, one green pixel G, one blue pixel B, and one infrared pixel ir; that is, one of the green components G of the Bayer array is replaced with a component ir that senses only infrared light.
The microlens array 12 disposed on the pixel array 11 includes a plurality of microlenses 121, and each microlens 121 covers one pixel 111, as shown in FIG. 7.
In one embodiment of the present application, each pixel 111 in the pixel array 11 includes a filter 1111 and a photosensitive device 1112 covered by the filter 1111, where a red filter and the photosensitive device it covers form a red pixel, a green filter and the photosensitive device it covers form a green pixel, a blue filter and the photosensitive device it covers form a blue pixel, and an infrared filter and the photosensitive device it covers form an infrared pixel.
In one embodiment of the present application, the microlenses corresponding to the red, green, and blue pixels allow only visible light to pass, and the microlens corresponding to the infrared pixel allows only near-infrared light to pass.
Specifically, during the design and manufacture of the image sensor 10, the microlens 121 of each pixel requires special treatment. For example, the microlenses 121 on the red pixel R, the green pixel G, and the blue pixel B pass only visible light with a wavelength below 650 nm, while the microlens 121 on the infrared pixel ir passes only near-infrared light with a wavelength above 650 nm, around 850 nm.
In one embodiment of the present application, the image sensor 10 is a CMOS image sensor.
The image processing module 20, connected to the image sensor 10, is configured to read the output of each pixel in the pixel array (i.e., the digital image signal of each pixel in the pixel array), perform interpolation processing on the output of each pixel to obtain the red, green, blue, and infrared components corresponding to each pixel, acquire the type of the current shooting scene, and determine the three primary color output values of each pixel according to the type of the current shooting scene, so as to generate an image according to the three primary color output values of each pixel.
After the image sensor 10 is exposed, it produces a raw image signal in which each pixel contains only one color component. Producing the raw image signal is a photoelectric conversion process: the image sensor 10 converts the external light signal into an electrical signal through photodiodes, processes it through analog circuitry, and then converts the analog signal into a digital signal through an analog-to-digital converter for processing by the image processing module 20.
Specifically, the image processing module 20 reads the output of each pixel in the pixel array; the output of each pixel contains only one color component. For example, the output of a red pixel contains only a red component. Since the output of each pixel contains only one color component, interpolation must be performed on the output of each pixel to obtain the four components R, G, B, and ir of each pixel.
For example, the output of a red pixel contains only the red component R; the image processing module 20 interpolates that pixel to obtain its other color components G, B, and ir. In this way, after interpolation, every pixel has four color components R, G, B, and ir.
In one embodiment of the present application, the interpolation processing of the output of each pixel uses any one of the following interpolation methods: nearest-neighbor interpolation, bilinear interpolation, and edge-adaptive interpolation.
Further, the image processing module 20 acquires the type of the current shooting scene and determines the three primary color output values of each pixel according to the type of the current shooting scene, so as to generate an image according to the three primary color output values of each pixel. This is described in detail below.
In one embodiment of the present application, the image processing module 20 is specifically configured to acquire the exposure time of the pixel array and determine whether the exposure time is greater than or equal to a preset exposure time threshold; if the exposure time is greater than or equal to the preset exposure time threshold, it determines that the current shooting scene is a dark scene, and if the exposure time is less than the preset exposure time threshold, it determines that the current shooting scene is a non-dark scene.
Specifically, it takes a certain amount of time for the image sensor 10 to collect light; this time is called the exposure time T. The longer the exposure time T, the brighter the image sensed by the image sensor 10. For normal daytime scenes, because the ambient light is strong, the image sensor 10 needs only a short exposure time to reach the desired target brightness; but for dark scenes such as at night, because the ambient light is weak, the image sensor 10 needs a longer exposure time. A long exposure time means that it takes a long time to capture one image, so to meet the frame-rate requirement (i.e., the number of images captured per second), the exposure time has an upper limit T_th (i.e., the preset exposure time threshold). Therefore, the image processing module 20 can determine whether the scene is dark or non-dark by comparing the exposure time T with the upper limit T_th: when T is less than T_th the scene is non-dark, and otherwise it is dark.
Further, in one embodiment of the present application, the image processing module 20 is specifically configured to, when the current shooting scene is a non-dark scene, determine the three primary color output values of each pixel according to the red, green, and blue components corresponding to that pixel. The image sensed by the image sensor 10 is displayed on the display in the format of the three primary colors; for a non-dark scene, the three primary color output values of each pixel are:
R' = R, G' = G, B' = B,
where R', G', and B' respectively represent the three primary color output values of a pixel, R represents the red component corresponding to the pixel, G represents the green component corresponding to the pixel, and B represents the blue component corresponding to the pixel.
In this way, only the visible-light R, G, and B components are used in a non-dark scene, ensuring that the image of the non-dark scene has no color cast.
In one embodiment of the present application, the image processing module 20 is specifically configured to, when the current shooting scene is a dark scene, determine the three primary color output values of each pixel according to the red, green, blue, and infrared components corresponding to that pixel; that is, the three primary color output values of each pixel are:
R' = R + ir, G' = G + ir, B' = B + ir,
where R', G', and B' respectively represent the three primary color output values of a pixel, R represents the red component corresponding to the pixel, G represents the green component corresponding to the pixel, B represents the blue component corresponding to the pixel, and ir represents the infrared component corresponding to the pixel.
In this way, superimposing the infrared component in a dark scene raises the brightness of the image. Because many surveillance products in dark scenes have low color requirements and care only about image brightness and sharpness, the dark scene is finally output as a black-and-white image.
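Putting the hypothetical helpers from the earlier sketches together, an end-to-end illustration of the processing attributed to the image processing module 20 might look like the following; the class name, the grayscale conversion used for the black-and-white output, and the parameter names are assumptions rather than the patent's implementation.

```python
import numpy as np

class ImageProcessingModuleSketch:
    """Illustrative pipeline: read mosaic -> interpolate -> classify scene -> output."""

    def __init__(self, exposure_threshold_s):
        self.exposure_threshold_s = exposure_threshold_s  # preset upper limit T_th

    def process(self, raw, pattern, exposure_time_s):
        components = demosaic_rgbir(raw, pattern)            # R, G, B, ir per pixel
        scene = classify_scene(exposure_time_s, self.exposure_threshold_s)
        rgb = primary_color_outputs(components, scene)
        if scene is SceneType.DARK:
            # Dark scenes are ultimately shown as black-and-white: collapse to gray.
            gray = rgb.mean(axis=-1, keepdims=True).astype(rgb.dtype)
            return np.repeat(gray, 3, axis=-1)
        return rgb
```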
With the imaging apparatus according to the embodiments of the present application, the brightness of the image when shooting in a dark scene is greatly improved, and no image color cast is caused when shooting in a non-dark scene, thereby improving the user experience.
To implement the above embodiments, the present application further proposes an electronic device 200. As shown in FIG. 8, the electronic device 200 includes the imaging apparatus 100 of the embodiments of the present application.
In one embodiment of the present application, the electronic device 200 is a monitoring device.
Since the electronic device according to the embodiments of the present application is provided with the imaging apparatus, the brightness of the image when shooting in a dark scene is greatly improved, and no image color cast is caused when shooting in a non-dark scene, thereby improving the user experience.
In the description of the present application, it should be understood that terms indicating orientation or positional relationships, such as "center", "longitudinal", "lateral", "length", "width", "thickness", "upper", "lower", "front", "rear", "left", "right", "vertical", "horizontal", "top", "bottom", "inner", "outer", "clockwise", "counterclockwise", "axial", "radial", and "circumferential", are based on the orientations or positional relationships shown in the drawings, are used only for convenience of describing the present application and simplifying the description, and do not indicate or imply that the device or element referred to must have a specific orientation or be constructed and operated in a specific orientation; they therefore cannot be construed as limiting the present application.
In addition, the terms "first" and "second" are used for descriptive purposes only and cannot be understood as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined with "first" or "second" may explicitly or implicitly include at least one such feature. In the description of the present application, "a plurality of" means at least two, such as two or three, unless specifically defined otherwise.
In the present application, unless otherwise explicitly specified and defined, the terms "mounted", "connected to", "connected", "fixed", and the like should be understood broadly; for example, a connection may be a fixed connection, a detachable connection, or an integral connection; it may be a mechanical connection or an electrical connection; it may be a direct connection or an indirect connection through an intermediate medium; and it may be an internal communication between two elements or an interaction between two elements, unless otherwise explicitly defined. Those of ordinary skill in the art can understand the specific meanings of the above terms in the present application according to specific situations.
In the present application, unless otherwise explicitly specified and defined, a first feature being "on" or "under" a second feature may mean that the first and second features are in direct contact, or that the first and second features are in indirect contact through an intermediate medium. Moreover, a first feature being "over", "above", or "on top of" a second feature may mean that the first feature is directly above or obliquely above the second feature, or merely that the first feature is at a higher level than the second feature. A first feature being "under", "below", or "beneath" a second feature may mean that the first feature is directly below or obliquely below the second feature, or merely that the first feature is at a lower level than the second feature.
In the description of this specification, descriptions referring to the terms "one embodiment", "some embodiments", "example", "specific example", "some examples", and the like mean that a specific feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the present application. In this specification, schematic expressions of the above terms do not necessarily refer to the same embodiment or example. Moreover, the specific features, structures, materials, or characteristics described may be combined in a suitable manner in any one or more embodiments or examples. In addition, those skilled in the art may combine the different embodiments or examples described in this specification and the features of the different embodiments or examples, provided that they do not contradict each other.
Although the embodiments of the present application have been shown and described above, it can be understood that the above embodiments are exemplary and cannot be construed as limiting the present application; those of ordinary skill in the art can make changes, modifications, substitutions, and variations to the above embodiments within the scope of the present application.

Claims (20)

  1. An imaging method of an image sensor, wherein the image sensor comprises a pixel array and a microlens array disposed on the pixel array, each group of four adjacent pixels in the pixel array comprises one red pixel, one green pixel, one blue pixel, and one infrared pixel, the microlens array comprises a plurality of microlenses, and each microlens covers one pixel, the imaging method comprising:
    reading an output of each pixel in the pixel array;
    performing interpolation processing on the output of each pixel to obtain a red component, a green component, a blue component, and an infrared component corresponding to each pixel;
    acquiring a type of a current shooting scene; and
    determining three primary color output values of each pixel according to the type of the current shooting scene, so as to generate an image according to the three primary color output values of each pixel.
  2. The imaging method of the image sensor according to claim 1, wherein acquiring the type of the current shooting scene comprises:
    acquiring an exposure time of the pixel array;
    determining whether the exposure time is greater than or equal to a preset exposure time threshold;
    if the exposure time is greater than or equal to the preset exposure time threshold, determining that the current shooting scene is a dark scene; and
    if the exposure time is less than the preset exposure time threshold, determining that the current shooting scene is a non-dark scene.
  3. The imaging method of the image sensor according to claim 2, wherein determining the three primary color output values of each pixel according to the type of the current shooting scene comprises:
    if the current shooting scene is the dark scene, determining the three primary color output values of each pixel according to the red component, the green component, the blue component, and the infrared component corresponding to each pixel; and
    if the current shooting scene is the non-dark scene, determining the three primary color output values of each pixel according to the red component, the green component, and the blue component corresponding to each pixel.
  4. The imaging method of the image sensor according to claim 3, wherein if the current shooting scene is the non-dark scene, the three primary color output values of each pixel are calculated according to the following formulas:
    R' = R, G' = G, B' = B,
    where R', G', and B' respectively represent the three primary color output values of a pixel, R represents the red component corresponding to the pixel, G represents the green component corresponding to the pixel, and B represents the blue component corresponding to the pixel.
  5. The imaging method of the image sensor according to claim 3, wherein if the current shooting scene is the dark scene, the three primary color output values of each pixel are calculated according to the following formulas:
    R' = R + ir, G' = G + ir, B' = B + ir,
    where R', G', and B' respectively represent the three primary color output values of a pixel, R represents the red component corresponding to the pixel, G represents the green component corresponding to the pixel, B represents the blue component corresponding to the pixel, and ir represents the infrared component corresponding to the pixel.
  6. The imaging method of the image sensor according to claim 2, wherein generating the image according to the three primary color output values of each pixel comprises:
    when the current shooting scene is the non-dark scene, generating a color image according to the three primary color output values of each pixel; and
    when the current shooting scene is the dark scene, generating a black-and-white image according to the three primary color output values of each pixel.
  7. The imaging method of the image sensor according to any one of claims 1-6, wherein each pixel in the pixel array comprises a filter and a photosensitive device covered by the filter, wherein
    a red filter and the photosensitive device it covers constitute the red pixel, a green filter and the photosensitive device it covers constitute the green pixel, a blue filter and the photosensitive device it covers constitute the blue pixel, and an infrared filter and the photosensitive device it covers constitute the infrared pixel.
  8. The imaging method of the image sensor according to any one of claims 1-7, wherein the microlenses corresponding to the red pixel, the green pixel, and the blue pixel allow only visible light to pass, and the microlens corresponding to the infrared pixel allows only near-infrared light to pass.
  9. The imaging method of the image sensor according to any one of claims 1-8, wherein the interpolation processing on the output of each pixel uses any one of the following interpolation methods:
    nearest-neighbor interpolation, bilinear interpolation, and edge-adaptive interpolation.
  10. An imaging apparatus, comprising:
    an image sensor, the image sensor comprising:
    a pixel array, wherein each group of four adjacent pixels in the pixel array comprises one red pixel, one green pixel, one blue pixel, and one infrared pixel;
    a microlens array disposed on the pixel array, the microlens array comprising a plurality of microlenses, each microlens covering one pixel; and
    an image processing module connected to the image sensor, the image processing module being configured to read an output of each pixel in the pixel array, perform interpolation processing on the output of each pixel to obtain a red component, a green component, a blue component, and an infrared component corresponding to each pixel, acquire a type of a current shooting scene, and determine three primary color output values of each pixel according to the type of the current shooting scene, so as to generate an image according to the three primary color output values of each pixel.
  11. The imaging apparatus according to claim 10, wherein the image processing module is configured to:
    acquire an exposure time of the pixel array, determine whether the exposure time is greater than or equal to a preset exposure time threshold, determine that the current shooting scene is a dark scene if the exposure time is greater than or equal to the preset exposure time threshold, and determine that the current shooting scene is a non-dark scene if the exposure time is less than the preset exposure time threshold.
  12. The imaging apparatus according to claim 11, wherein the image processing module is configured to:
    when the current shooting scene is the dark scene, determine the three primary color output values of each pixel according to the red component, the green component, the blue component, and the infrared component corresponding to each pixel; and
    when the current shooting scene is the non-dark scene, determine the three primary color output values of each pixel according to the red component, the green component, and the blue component corresponding to each pixel.
  13. The imaging apparatus according to claim 12, wherein when the current shooting scene is the non-dark scene, the image processing module calculates the three primary color output values of each pixel according to the following formulas:
    R' = R, G' = G, B' = B,
    where R', G', and B' respectively represent the three primary color output values of a pixel, R represents the red component corresponding to the pixel, G represents the green component corresponding to the pixel, and B represents the blue component corresponding to the pixel.
  14. The imaging apparatus according to claim 12, wherein when the current shooting scene is the dark scene, the image processing module calculates the three primary color output values of each pixel according to the following formulas:
    R' = R + ir, G' = G + ir, B' = B + ir,
    where R', G', and B' respectively represent the three primary color output values of a pixel, R represents the red component corresponding to the pixel, G represents the green component corresponding to the pixel, B represents the blue component corresponding to the pixel, and ir represents the infrared component corresponding to the pixel.
  15. The imaging apparatus according to claim 11, wherein the image processing module is configured to:
    when the current shooting scene is a non-dark scene, generate a color image according to the three primary color output values of each pixel; and
    when the current shooting scene is a dark scene, generate a black-and-white image according to the three primary color output values of each pixel.
  16. The imaging apparatus according to claim 10, wherein each pixel in the pixel array comprises a filter and a photosensitive device covered by the filter, wherein
    a red filter and the photosensitive device it covers constitute the red pixel, a green filter and the photosensitive device it covers constitute the green pixel, a blue filter and the photosensitive device it covers constitute the blue pixel, and an infrared filter and the photosensitive device it covers constitute the infrared pixel.
  17. The imaging apparatus according to claim 10, wherein the microlenses corresponding to the red pixel, the green pixel, and the blue pixel allow only visible light to pass, and the microlens corresponding to the infrared pixel allows only near-infrared light to pass.
  18. The imaging apparatus according to claim 10, wherein the image processing module performs the interpolation processing on the output of each pixel by nearest-neighbor interpolation, bilinear interpolation, or edge-adaptive interpolation.
  19. An electronic device, comprising the imaging apparatus according to any one of claims 10-18.
  20. The electronic device according to claim 19, wherein the electronic device is a monitoring device.
PCT/CN2016/106800 2015-12-14 2016-11-22 Imaging method of image sensor, imaging apparatus and electronic device WO2017101641A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US15/777,796 US20180350860A1 (en) 2015-12-14 2016-11-22 Image method of image sensor, imaging apparatus and electronic device
EP16874698.0A EP3393124A4 (en) 2015-12-14 2016-11-22 Imaging method of image sensor, imaging apparatus and electronic device

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201510925379.1 2015-12-14
CN201510925379.1A CN106878690A (zh) Imaging method of image sensor, imaging apparatus and electronic device

Publications (1)

Publication Number Publication Date
WO2017101641A1 true WO2017101641A1 (zh) 2017-06-22

Family

ID=59055768

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2016/106800 WO2017101641A1 (zh) Imaging method of image sensor, imaging apparatus and electronic device

Country Status (4)

Country Link
US (1) US20180350860A1 (zh)
EP (1) EP3393124A4 (zh)
CN (1) CN106878690A (zh)
WO (1) WO2017101641A1 (zh)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108965704A (zh) * 2018-07-19 2018-12-07 维沃移动通信有限公司 一种图像传感器、移动终端及图像拍摄方法
CN112532898A (zh) * 2020-12-03 2021-03-19 北京灵汐科技有限公司 双模态红外仿生视觉传感器
CN114697584A (zh) * 2020-12-31 2022-07-01 杭州海康威视数字技术股份有限公司 一种图像处理系统及图像处理方法

Families Citing this family (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107205139A (zh) * 2017-06-28 2017-09-26 重庆中科云丛科技有限公司 多通道采集的图像传感器及采集方法
CN108271012A (zh) * 2017-12-29 2018-07-10 维沃移动通信有限公司 一种深度信息的获取方法、装置以及移动终端
JP2019175912A (ja) 2018-03-27 2019-10-10 ソニーセミコンダクタソリューションズ株式会社 撮像装置、及び、画像処理システム
CN108426637A (zh) * 2018-05-11 2018-08-21 Oppo广东移动通信有限公司 一种光分量计算方法、图像传感器及摄像头模组
CN108965703A (zh) * 2018-07-19 2018-12-07 维沃移动通信有限公司 一种图像传感器、移动终端及图像拍摄方法
CN108600712B (zh) 2018-07-19 2020-03-31 维沃移动通信有限公司 一种图像传感器、移动终端及图像拍摄方法
CN109040720B (zh) * 2018-07-24 2019-11-19 浙江大华技术股份有限公司 一种生成rgb图像的方法及装置
EP3910917B1 (en) * 2019-02-01 2023-10-11 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Image processing method, storage medium and electronic device
CN110574367A (zh) * 2019-07-31 2019-12-13 华为技术有限公司 一种图像传感器和图像感光的方法
CN118102135A (zh) * 2020-12-31 2024-05-28 杭州海康威视数字技术股份有限公司 一种图像传感器、图像处理系统及图像处理方法
CN113706637B (zh) * 2021-08-03 2023-10-13 哈尔滨工程大学 一种彩色图像传感器线性区内色彩混叠分离方法
CN115022562A (zh) * 2022-05-25 2022-09-06 Oppo广东移动通信有限公司 图像传感器、摄像头和电子装置
CN115914857A (zh) * 2022-12-22 2023-04-04 创视微电子(成都)有限公司 一种图像传感器中实时自动白平衡补偿方法及装置

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1825143A (zh) * 2005-02-22 2006-08-30 三洋电机株式会社 滤色器阵列以及固体摄像元件
CN101179742A (zh) * 2006-11-10 2008-05-14 三洋电机株式会社 摄像装置和图像信号处理装置
US20080315104A1 (en) * 2007-06-19 2008-12-25 Maru Lsi Co., Ltd. Color image sensing apparatus and method of processing infrared-ray signal
CN103139572A (zh) * 2011-11-24 2013-06-05 比亚迪股份有限公司 感光装置及用于其的白平衡方法和装置
CN103686111A (zh) * 2013-12-31 2014-03-26 上海富瀚微电子有限公司 一种基于rgbir图像传感器的颜色校正方法以及装置
WO2014170359A1 (fr) * 2013-04-17 2014-10-23 Photonis France Dispositif d'acquisition d'images bimode
CN104871527A (zh) * 2012-12-04 2015-08-26 (株)赛丽康 包括具有提高的光谱特性的红外像素的cmos图像传感器及其制造方法

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9143704B2 (en) * 2012-01-20 2015-09-22 Htc Corporation Image capturing device and method thereof
KR101695252B1 (ko) * 2012-06-07 2017-01-13 한화테크윈 주식회사 멀티 대역 필터 어레이 기반 카메라 시스템 및 그의 영상 처리 방법
CN103945201B (zh) * 2013-01-21 2016-04-13 浙江大华技术股份有限公司 一种IR-Cut滤光片切换方法、装置和摄像机
CN103617432B (zh) * 2013-11-12 2017-10-03 华为技术有限公司 一种场景识别方法及装置
CN104661008B (zh) * 2013-11-18 2017-10-31 深圳中兴力维技术有限公司 低照度条件下彩色图像质量提升的处理方法和装置

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1825143A (zh) * 2005-02-22 2006-08-30 三洋电机株式会社 滤色器阵列以及固体摄像元件
CN101179742A (zh) * 2006-11-10 2008-05-14 三洋电机株式会社 摄像装置和图像信号处理装置
US20080315104A1 (en) * 2007-06-19 2008-12-25 Maru Lsi Co., Ltd. Color image sensing apparatus and method of processing infrared-ray signal
CN103139572A (zh) * 2011-11-24 2013-06-05 比亚迪股份有限公司 感光装置及用于其的白平衡方法和装置
CN104871527A (zh) * 2012-12-04 2015-08-26 (株)赛丽康 包括具有提高的光谱特性的红外像素的cmos图像传感器及其制造方法
WO2014170359A1 (fr) * 2013-04-17 2014-10-23 Photonis France Dispositif d'acquisition d'images bimode
CN103686111A (zh) * 2013-12-31 2014-03-26 上海富瀚微电子有限公司 一种基于rgbir图像传感器的颜色校正方法以及装置

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP3393124A4 *

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108965704A (zh) * 2018-07-19 2018-12-07 维沃移动通信有限公司 一种图像传感器、移动终端及图像拍摄方法
CN108965704B (zh) * 2018-07-19 2020-01-31 维沃移动通信有限公司 一种图像传感器、移动终端及图像拍摄方法
CN112532898A (zh) * 2020-12-03 2021-03-19 北京灵汐科技有限公司 双模态红外仿生视觉传感器
CN114697584A (zh) * 2020-12-31 2022-07-01 杭州海康威视数字技术股份有限公司 一种图像处理系统及图像处理方法
CN114697584B (zh) * 2020-12-31 2023-12-26 杭州海康威视数字技术股份有限公司 一种图像处理系统及图像处理方法

Also Published As

Publication number Publication date
EP3393124A1 (en) 2018-10-24
CN106878690A (zh) 2017-06-20
US20180350860A1 (en) 2018-12-06
EP3393124A4 (en) 2018-11-14

Similar Documents

Publication Publication Date Title
WO2017101641A1 (zh) 图像传感器的成像方法、成像装置和电子设备
JP5206796B2 (ja) 画像入力装置
US10630920B2 (en) Image processing apparatus
US20110115964A1 (en) Dichroic aperture for electronic imaging device
CN107018395B (zh) 图像处理装置、图像处理方法和摄像装置
US10277803B2 (en) Control method and electronic apparatus
JP5663564B2 (ja) 撮像装置並びに撮像画像処理方法と撮像画像処理プログラム
JPWO2015199163A1 (ja) 撮像センサおよび撮像装置
WO2010116923A1 (ja) 画像入力装置
WO2016047240A1 (ja) 画像処理装置、撮像素子、撮像装置および画像処理方法
JP2009194604A (ja) 撮像装置及び撮像装置の駆動方法
JP2011239252A (ja) 撮像装置
JP2010109875A (ja) 撮像装置
JP5990008B2 (ja) 撮像装置およびその制御方法
JP2016100879A (ja) 撮像装置、画像処理方法
JP2007318630A (ja) 画像入力装置、撮像モジュール、及び固体撮像装置
JP6415094B2 (ja) 画像処理装置、撮像装置、画像処理方法およびプログラム
JP6136948B2 (ja) 撮像装置、映像信号処理方法及び映像信号処理プログラム
WO2015008383A1 (ja) 撮像装置
JP2011109411A (ja) ホワイトバランス補正係数決定方法、ホワイトバランス補正係数決定装置、ホワイトバランス補正方法、ホワイトバランス補正装置、及び撮像装置
JP4993275B2 (ja) 画像処理装置
JP2010252077A (ja) 撮像装置
WO2018193544A1 (ja) 撮像装置および内視鏡装置
JP4245373B2 (ja) 画素信号処理回路
WO2011162155A1 (ja) 撮像装置

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16874698

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

WWE Wipo information: entry into national phase

Ref document number: 2016874698

Country of ref document: EP