WO2021227250A1 - Image sensor and electronic device - Google Patents

Image sensor and electronic device

Info

Publication number
WO2021227250A1
WO2021227250A1 (PCT/CN2020/103268, CN2020103268W)
Authority
WO
WIPO (PCT)
Prior art keywords
filter unit
array
filter
image sensor
light
Prior art date
Application number
PCT/CN2020/103268
Other languages
English (en)
French (fr)
Inventor
程祥
张玮
Original Assignee
深圳市汇顶科技股份有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳市汇顶科技股份有限公司
Publication of WO2021227250A1 publication Critical patent/WO2021227250A1/zh

Classifications

    • H — ELECTRICITY
    • H04 — ELECTRIC COMMUNICATION TECHNIQUE
    • H04N — PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 — Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/50 — Constructional details
    • H04N 23/54 — Mounting of pick-up tubes, electronic image sensors, deviation or focusing coils
    • H04N 23/55 — Optical parts specially adapted for electronic image sensors; Mounting thereof
    • H04N 23/80 — Camera processing pipelines; Components thereof

Definitions

  • The embodiments of the present application relate to the field of imaging, and more specifically, to image sensors and electronic devices.
  • Imaging systems in electronic devices usually rely on image sensors to establish an electronic display of visual images.
  • Examples of such image sensors include charge-coupled device (CCD) image sensors and active pixel sensor (APS) devices; because APS devices can be fabricated in a Complementary Metal Oxide Semiconductor (CMOS) process, they are often also called CMOS sensors.
  • These image sensors include multiple photosensitive pixels, often arranged in a regular pattern of rows and columns.
  • To capture color images, light signals of specific wavelengths must be accumulated at different pixels, so color filters are installed in the image sensor; a filter with a Bayer array configured to include each of the colors red, green and blue (RGB) is generally used.
  • Because the color filters must be set to different colors so that only light of the corresponding color passes, the amount of light reaching each photosensitive pixel is reduced, which reduces the light sensitivity of each photosensitive pixel.
  • In addition, when used in mobile devices, the size of the image sensor is usually limited and the photosensitive area of the corresponding pixel array is also limited, so photographing performance in a low-light environment is constrained.
  • The present application provides an image sensor and an electronic device, which can improve photographing performance in a low-light environment.
  • In a first aspect, an image sensor is provided, including: a microlens array for converging light signals returned from a photographed object onto a filter unit array; the filter unit array, located below the microlens array, where each microlens in the microlens array corresponds to at least one filter unit, the filter unit array includes a plurality of filter unit groups with the same color distribution, and each of the plurality of filter unit groups includes at least one mixed-color filter unit and at least one monochromatic filter unit; and a photosensitive unit array, located below the filter unit array, where the photosensitive units in the photosensitive unit array correspond one-to-one to the filter units in the filter unit array.
  • The photosensitive unit corresponding to a monochromatic filter unit in the photosensitive unit array is used to receive the monochromatic light signal passing through that monochromatic filter unit.
  • The photosensitive unit corresponding to a mixed-color filter unit is used to receive the mixed-color light signal passing through that mixed-color filter unit, and the monochromatic light signals and the mixed-color light signals are used to generate the target image of the photographed object.
  • A mixed-color filter unit may therefore be provided in the filter unit array; the mixed-color filter unit allows polychromatic light formed by mixing two or more monochromatic lights to pass through.
  • For example, the mixed-color filter unit may be a white filter unit. Compared with a filter unit array that has only monochromatic filter units, a filter unit array including such mixed-color filter units admits far more light; accordingly, the amount of light entering the entire image sensor is increased, so that the performance of the image sensor is not affected even in a low-light environment.
  • Each filter unit group includes at least one white filter unit, at least one red filter unit, at least one green filter unit, and at least one blue filter unit.
  • the proportion of the white filter units in each filter unit group is greater than or equal to 25%.
  • the proportion of the white filter units in each filter unit group is less than or equal to 75%.
  • the proportion of white filter units in each filter unit group is 25%, 37.5%, 50%, 62.5%, or 75%.
  • The ratio of red filter units, green filter units, and blue filter units in each filter unit group is 1:2:1 or 1:1:1.
  • Each filter unit group includes 4*4, 4*8, 6*6, or 8*8 filter units.
  • the microlenses in the microlens array correspond to the filter units in the filter unit array in a one-to-one correspondence.
  • The microlens array includes at least one first microlens and at least one second microlens; each first microlens corresponds to one filter unit in the filter unit array, and each second microlens corresponds to a plurality of filter units in the filter unit array.
  • the plurality of filter units are 2*2 filter units.
  • the multiple filter units have the same color.
  • the plurality of filter units are all white filter units.
  • the multiple filter units belong to the same filter unit group.
  • The monochromatic light signals are used to generate first image data of the photographed object, the mixed-color light signals are used to generate second image data of the photographed object, and the first image data and the second image data are used to synthesize the target image of the photographed object.
  • the resolutions of the first image data and the second image data are the same.
  • the second image data is generated by an interpolation algorithm.
  • In a second aspect, an electronic device is provided, which includes the image sensor in the first aspect or any possible implementation of the first aspect.
  • The electronic device further includes: a processing unit configured to generate the target image of the photographed object according to the monochromatic light signals and the mixed-color light signals.
  • The processing unit is configured to: generate first image data of the photographed object according to the monochromatic light signals; generate second image data of the photographed object according to the mixed-color light signals; and synthesize the first image data and the second image data into the target image of the photographed object.
  • the resolutions of the first image data and the second image data are the same.
  • The processing unit is configured to generate the second image data of the photographed object through an interpolation algorithm according to the mixed-color light signals.
  • Mixed-color filter units are therefore added to the filter unit array, for example white filter units, which can effectively increase the overall light input of the image sensor. For example, adding 25% white filter units increases the light input of the entire sensor by about 30%, and adding 50% white filter units increases it by about 60%.
  • In addition, by reasonably setting the distribution of the filter units of each color, mixed-color filter units such as white have a higher spatial sampling rate, which is beneficial to subsequently obtaining better high-resolution grayscale images; it also ensures that the R pixels, G pixels, and B pixels have a relatively even spatial sampling rate, which is conducive to subsequently obtaining color images.
  • Fig. 1 is a schematic block diagram of an image processing device according to an embodiment of the present application.
  • Fig. 2 is a schematic diagram of an image sensor according to an embodiment of the present application.
  • Fig. 3 is a schematic diagram of an image sensor according to another embodiment of the present application.
  • Fig. 4 is a schematic diagram of an image sensor according to still another embodiment of the present application.
  • 5-14 are schematic diagrams of the color distribution of different filter unit groups according to embodiments of the present application.
  • Fig. 15 is a schematic diagram of determining a target image of a photographing object according to an embodiment of the present application.
  • FIG. 1 shows a schematic block diagram of an image processing apparatus 100.
  • The image processing apparatus 100 may refer to any electronic device; for example, the image processing apparatus 100 may be a mobile phone, or it may be a part of an electronic device, for example a camera module in an electronic device, and the embodiment of the present application is not limited thereto.
  • The image processing apparatus 100 generally includes a pixel array 101 (which may also be referred to as a photoelectric conversion unit 101 or an image sensor 101), a signal reading circuit 102, a signal processor 103, a controller 104, an interface circuit 105, and a power supply 106.
  • The electrical signal output terminal of the pixel array 101 is connected to the input terminal of the signal reading circuit 102, the control terminal of the pixel array 101 is connected to the output terminal of the controller 104, and the output terminal of the signal reading circuit 102 is connected to the input terminal of the signal processor 103; the power supply 106 provides power for the signal reading circuit 102, the signal processor 103, the controller 104, and the interface circuit 105.
  • the pixel array 101 is used to collect the light signal returned via the imaging object, and convert the light signal into an electrical signal, and reflect the light image of the imaging object through the strength of the electrical signal.
  • the signal reading circuit 102 is used to read the electrical signal output by each pixel.
  • the signal processor 103 is configured to perform analog-to-digital conversion on the electrical signal output by the pixel array, and output image data of the imaging object.
  • the interface circuit 105 is used to transmit image data to the outside.
  • the controller 104 is used to output a control signal, and the control signal is used to control each pixel in the pixel array to work together.
  • The core component of the image processing apparatus 100 is the pixel array 101, in which each pixel has a similar structure.
  • Typically, each pixel structure may include a lens (or microlens), a color filter, and a photosensitive element, where the lens is located above the filter and the filter is located above the photosensitive element.
  • Light returned from the imaging object is focused by the lens, exits from the lens exit area, is filtered by the filter, and then enters a photosensitive element such as a photodiode (PD), which converts the optical signal into an electrical signal.
  • Depending on the type of light that different filters can pass, the pixels may include red pixels (hereinafter referred to as R pixels), green pixels (hereinafter referred to as G pixels), and blue pixels (hereinafter referred to as B pixels).
  • An R pixel means that only red light enters the photosensitive element after being filtered by the filter; the principles of the G pixel and the B pixel are the same as those of the R pixel and will not be repeated here.
  • The principle by which the image sensor generates color image data is as follows: each pixel in the pixel array can convert only one type of light signal into an electrical signal, and the image color of the area sampled by the current pixel is restored by an interpolation operation that combines the light signals collected by the surrounding pixels of other types. This process is also called demosaicing and is usually completed in the processor.
  • For example, if the current pixel is an R pixel, it can convert only the red light signal into an electrical signal; the electrical signals collected by the surrounding B pixels and G pixels can then be combined to restore the blue and green light intensity at the current pixel and determine the color of the image at that pixel.
  • Therefore, in order to capture color images, a color filter with a specific color arrangement, also called a color filter array (CFA), needs to be arranged above the photosensitive element array included in the pixel array.
  • At present, for most pixel arrays, such as CCD and CMOS image sensors, the CFA adopts the Bayer format based on the three primary colors RGB.
  • The characteristic of the Bayer format is that its basic unit is a 2×2 four-pixel array, including one red pixel R, one blue pixel B and two green pixels G, where the two green pixels G are arranged diagonally adjacent with a common vertex. Since any pixel can actually obtain only the signal of one of the RGB colors, restoring the complete color information must be realized through a specific image processing algorithm.
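  • To make the interpolation step concrete, the following is a minimal sketch of bilinear demosaicing for an RGGB Bayer mosaic; it only illustrates the general idea described above and is not the specific algorithm used by the present application. The kernel weights and the `numpy`/`scipy` helpers are ordinary textbook choices, not something specified in this document.

```python
# Minimal sketch of bilinear demosaicing for an RGGB Bayer mosaic.
# Illustrative only: this is a generic textbook interpolation, not the
# patent's own reconstruction algorithm.
import numpy as np
from scipy.signal import convolve2d

def demosaic_bilinear(mosaic: np.ndarray) -> np.ndarray:
    """mosaic: HxW raw samples captured through an RGGB Bayer CFA."""
    h, w = mosaic.shape
    r_mask = np.zeros((h, w)); r_mask[0::2, 0::2] = 1     # R at even rows/cols
    b_mask = np.zeros((h, w)); b_mask[1::2, 1::2] = 1     # B at odd rows/cols
    g_mask = 1 - r_mask - b_mask                          # G elsewhere (two per 2x2)

    k_rb = np.array([[1, 2, 1], [2, 4, 2], [1, 2, 1]]) / 4.0  # bilinear kernel for R/B
    k_g = np.array([[0, 1, 0], [1, 4, 1], [0, 1, 0]]) / 4.0   # bilinear kernel for G

    def interp(samples, kernel):
        # Averages the available neighbors; keeps original values where sampled.
        return convolve2d(samples, kernel, mode="same", boundary="symm")

    r = interp(mosaic * r_mask, k_rb)
    g = interp(mosaic * g_mask, k_g)
    b = interp(mosaic * b_mask, k_rb)
    return np.stack([r, g, b], axis=-1)                   # HxWx3 color image
```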
  • FIG. 2 shows a schematic diagram of an image sensor 200 according to an embodiment of the present application.
  • the image sensor 200 may correspond to the pixel array 101 in FIG. 1, and the image sensor 200 is suitable for a CCD or CMOS structure.
  • the image sensor 200 may include a microlens array 210, a filter unit array 220 and a photosensitive unit array 230.
  • The microlens array 210 is used to converge the light signal returned by the photographed object onto the filter unit array 220; that is, light irradiates the photographed object to produce returned light, and the microlenses in the microlens array 210 converge the returned light signal onto the filter unit array 220 below.
  • the filter unit array 220 is located below the micro lens array 210, that is, between the photosensitive unit array 230 and the micro lens array 210.
  • the filter unit array 220 may be located on the lower surface of the micro lens array 210.
  • Each microlens in the microlens array 210 corresponds to at least one filter unit in the filter unit array 220.
  • The filter unit array 220 may include filter units of multiple colors; for example, the filter units included in the filter unit array 220 may be divided into multiple filter unit groups according to their color distribution, that is, the filter unit array 220 includes a plurality of filter unit groups, where the plurality of filter unit groups have the same color distribution.
  • each filter unit group in the plurality of filter unit groups includes at least one mixed color filter unit and at least one monochromatic filter unit.
  • the monochromatic filter unit is used to filter the light passing through the microlens and output corresponding monochromatic light;
  • the mixed color filter unit is used to filter the light passing through the microlens and output the corresponding mixed color light.
  • Monochromatic light in the embodiments of the present application refers to the spectral colored light separated from white light or sunlight by the refraction of a triangular prism, for example red, orange, yellow, green, blue, indigo and violet light.
  • The separated spectral colored light is not further decomposed into other colored light when it passes through a prism again; such colored light that can no longer be decomposed is monochromatic light, whereas mixed light refers to polychromatic light formed by mixing any two or more monochromatic lights.
  • the monochromatic light in the embodiment of the present application may be red light (Red), green light (Green) and blue light (Blue), and the mixed light may be white light.
  • Correspondingly, the red filter unit allows only red light to pass through and therefore outputs red light; similarly, the green filter unit outputs green light, the blue filter unit outputs blue light, and the white filter unit outputs white light, but the embodiments of the present application are not limited to this.
  • the photosensitive unit array 230 is located below the filter unit array 220, and the photosensitive units in the photosensitive unit array 230 correspond to the filter units in the filter unit array 220 in a one-to-one correspondence.
  • the photosensitive unit corresponding to the monochromatic filter unit in the photosensitive unit array is used to receive the monochromatic light signal passing through the monochromatic filter unit, and correspondingly output the monochromatic pixels;
  • the photosensitive unit array 230 corresponds to the mixed-color filter unit
  • the photosensitive unit is used to receive the mixed color light signal passing through the mixed color filter unit, and correspondingly output the mixed color pixel; the monochromatic light signal and the mixed light signal are jointly used to generate the target image of the shooting object.
  • the sensing unit corresponding to the red filter unit receives the red light signal, and the corresponding output pixel may be called the red pixel; the sensing unit corresponding to the white filter unit receives the white light signal, and the corresponding output pixel may be called the white pixel.
  • FIGS. 3 and 4 show other schematic diagrams of the image sensor 200.
  • For example, a dielectric layer 240 may further be included between the filter unit array 220 and the photosensitive unit array 230.
  • As shown in Figs. 3 and 4, the filter unit array 220 may also include a surrounding dielectric 225 and a reflective grid 226; the photosensitive unit array 230 may include a semiconductor substrate 231 and photosensitive elements 232, where the photosensitive elements 232 are located in the semiconductor substrate 231 and each photosensitive element 232 may be a PD.
  • Optionally, the photosensitive unit array 230 may also include an isolation region 233 between two photosensitive elements 232, but the embodiment of the present application is not limited thereto.
  • The filter unit array 220 in the image sensor 200 of the embodiments of the present application may include mixed-color filter units, which allow polychromatic light formed by a mixture of two or more monochromatic lights to pass; for example, the mixed-color filter unit may be a white filter unit. Compared with a filter unit array with only monochromatic filter units, a filter unit array 220 including such mixed-color filter units admits far more light; accordingly, the amount of light entering the entire image sensor 200 is increased, so that the performance of the image sensor 200 is not affected even in a low-light environment.
  • The filter unit array 220 in the embodiment of the present application includes multiple filter unit groups with the same color distribution, that is, the filter unit array 220 includes multiple repeated filter unit groups, and these repeated filter unit groups are tiled so as to cover the entire surface of the photosensitive unit array in the image sensor 200.
  • For ease of description, the filter unit group described below takes the smallest repeating unit group in the filter unit array 220 as an example; the smallest repeating unit group means that no repeating unit group of another size has fewer filter units. A plurality of such smallest repeating unit groups are tiled so as to cover the entire surface of the photosensitive unit array in the image sensor 200, but the embodiment of the present application is not limited thereto.
  • Figs. 5 to 14 show schematic diagrams of filter unit groups according to embodiments of the present application, in which filter units of the same color are marked in the same way: squares filled with diagonal lines indicate red, squares filled with cross lines indicate green, and squares filled with dots indicate green. In Figs. 5-14, each small square represents a filter unit and each large square represents a filter unit group.
  • As shown in Figs. 3 and 4, a dielectric 225 and a reflective grid 226 may be provided between adjacent filter units.
  • In Figs. 5-14, however, the distance between adjacent filter units is ignored; that is, Figs. 5-14 mainly show the color distribution of the filter unit groups included in the filter unit array 220.
  • The size of each filter unit group in the embodiments of the present application, that is, the number of filter units it contains, can be set according to the actual application and can be set to any size.
  • For example, as shown in Figs. 5 to 10, the size of each filter unit group may be 4*4; or, as shown in Fig. 11, the size of each filter unit group may be 4*8; or, as shown in Fig. 12, the size of each filter unit group may be 6*6; or, as shown in Figs. 13 and 14, the size of each filter unit group may be 8*8. That is, each filter unit group may include 4*4, 4*8, 6*6, or 8*8 filter units, but the embodiment of the present application is not limited thereto.
  • Each filter unit group in the embodiments of the present application may include at least one mixed-color filter unit and at least one monochromatic filter unit. Optionally, the following description takes the mixed-color filter unit being a white filter unit and the monochromatic filter units including a red filter unit, a green filter unit, and a blue filter unit as an example; that is, each filter unit group is described as including at least one white filter unit, at least one red filter unit, at least one green filter unit, and at least one blue filter unit, but the embodiment of the present application is not limited to this.
  • For example, Fig. 5 shows two filter unit groups, left and right; each of these filter unit groups includes a red filter unit 221, a blue filter unit 222, a green filter unit 223, and a white filter unit 224.
  • Likewise, as shown in Figs. 6-14, each filter unit group is configured to include filter units of the four colors, namely the red filter unit 221, the blue filter unit 222, the green filter unit 223, and the white filter unit 224, and different distributions of the four color filter units are listed.
  • The proportions of filter units of different colors in each filter unit group in the embodiments of the present application can be set according to the actual application and can be set to any value.
  • For example, taking the white filter unit as an example, the proportion of white filter units in each filter unit group can be set to be greater than or equal to 25%, and can also be set to be less than or equal to 75%.
  • Specifically, still taking Fig. 5 as an example, the proportion of white filter units in each filter unit group can be set to 50%; or, as shown in Fig. 6, the proportion can also be set to 25%; as shown in Fig. 7, it can be set to 37.5%; as shown in Fig. 8 or Figs. 10-14, it can be set to 50%; as shown in Fig. 9, it can be set to 62.5%; and as shown in Fig. 10, it can also be set to 75%. The embodiment of the present application is not limited to this.
  • The ratio between the monochromatic filter units of different colors in each filter unit group in the embodiments of the present application can also be set according to the actual application and can be set to any value.
  • For example, the ratio of red filter units, green filter units and blue filter units in each filter unit group can usually be 1:2:1 (for example, Fig. 5 or Fig. 10) or 1:1:1 (as shown in some of the filter unit groups in Figs. 6-9 and 11-14), and the embodiment of the present application is not limited to this.
  • Figs. 5-14 in the embodiments of the present application illustrate the distribution of filter units of different colors in filter unit groups of different sizes by way of example, and other possible color distributions are not excluded.
  • For example, taking a 4*4 filter unit group with 25% white filter units as an example, the color distribution of the filter unit group can be as shown in any one of the four filter unit groups in Fig. 6, or other distributions can also be used.
  • For another example, when the proportion of white filter units is 50%, the color distribution of the filter unit group can be as shown in any filter unit group in Fig. 5 or in Fig. 8, or other distributions can also be used.
  • The cases of filter unit groups of other sizes and other proportions of white filter units can be deduced by analogy and are not repeated here, as shown in the code sketch below.
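  • Purely as an illustration of how such a repeating filter unit group can be represented and tiled, the sketch below defines a hypothetical 4*4 RGBW group with 50% white filter units and an R:G:B ratio of 1:2:1. The exact layouts of Figs. 5-14 are not reproduced here, so this particular arrangement is an assumption for illustration, not one of the patented figures.

```python
# Hypothetical 4*4 RGBW filter unit group (50% W, R:G:B = 1:2:1), used only to
# illustrate tiling and proportion checks; not the layout of any figure.
import numpy as np

FILTER_UNIT_GROUP = np.array([
    ["W", "G", "W", "R"],
    ["G", "W", "B", "W"],
    ["W", "R", "W", "G"],
    ["B", "W", "G", "W"],
])

def tile_cfa(group: np.ndarray, rows: int, cols: int) -> np.ndarray:
    """Tile the repeating filter unit group so it covers a rows x cols sensor."""
    gh, gw = group.shape
    reps = ((rows + gh - 1) // gh, (cols + gw - 1) // gw)
    return np.tile(group, reps)[:rows, :cols]

def color_proportions(group: np.ndarray) -> dict:
    """Report the share of each filter color within one group."""
    colors, counts = np.unique(group, return_counts=True)
    return {c: n / group.size for c, n in zip(colors, counts)}

cfa = tile_cfa(FILTER_UNIT_GROUP, rows=8, cols=8)
print(color_proportions(FILTER_UNIT_GROUP))  # {'B': 0.125, 'G': 0.25, 'R': 0.125, 'W': 0.5}
```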
  • The distribution of the microlens array 210 can be set corresponding to the filter unit array 220 below it; for example, each microlens in the microlens array 210 can correspond to one or more filter units in the filter unit array 220 below.
  • Optionally, as one embodiment, the microlenses in the microlens array 210 and the filter units in the filter unit array 220 may correspond one-to-one.
  • Specifically, as shown in Fig. 3, the microlens array 210 includes a plurality of first microlenses 211, and each first microlens 211 corresponds to one filter unit and also to one photosensitive unit; that is to say, the image sensor 200 has a pixel array in which each pixel includes a first microlens 211, a filter unit, and a photosensitive unit.
  • Optionally, as another embodiment, the microlens array 210 may include a plurality of second microlenses 212, and each second microlens 212 corresponds to a plurality of filter units, for example 2*2 filter units, where each filter unit corresponds to one photosensitive unit.
  • For another example, the microlens array 210 may also include at least one first microlens 211 and at least one second microlens 212, where each first microlens 211 corresponds to one filter unit in the filter unit array 220 and each second microlens 212 corresponds to a plurality of filter units in the filter unit array 220.
  • For example, as shown in Fig. 4, the microlens array 210 includes a plurality of first microlenses 211 corresponding one-to-one to filter units, and also includes at least one second microlens 212 corresponding to 2*2 filter units.
  • the number of filter units corresponding to the second microlens 212 can be set according to actual applications, and can be set to any value.
  • the second microlens 212 may correspond to 2*2 filter units or 1*2 filter units, and the embodiment of the present application is not limited thereto.
  • In addition, the plurality of filter units corresponding to the same second microlens 212 may have the same or different colors.
  • For example, as shown in Fig. 4, the 2*2 filter units corresponding to a second microlens 212 may all be white filter units, where the 2*2 white filter units may belong to the same filter unit group or may not belong to the same filter unit group; for example, the 2*2 filter units corresponding to the second microlens 212 may also belong to two or more adjacent filter unit groups, and the embodiment of the present application is not limited to this.
  • When a plurality of mixed-color filter units are arranged to correspond to the same second microlens, the electrical signals converted from the mixed light exiting from different exit areas of that second microlens can be used to calculate a phase difference between the electrical signals, so that the focus of the image sensor can be adjusted according to the phase difference.
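  • As a rough illustration of how such a phase difference could be estimated, the sketch below cross-correlates the signals read from the two halves of the sub-pixels that share a microlens; this is a generic phase-detection estimate under assumed one-dimensional signals, not the calculation prescribed by the present application.

```python
# Minimal sketch, assuming 1-D rows of left/right sub-pixel signals taken from
# the white pixels sharing second microlenses; generic correlation search only.
import numpy as np

def estimate_phase_shift(left: np.ndarray, right: np.ndarray, max_shift: int = 8) -> int:
    """Return the integer shift (in pixels) that best aligns the two signals."""
    best_shift, best_score = 0, -np.inf
    for s in range(-max_shift, max_shift + 1):
        shifted = np.roll(right, s)
        score = np.dot(left - left.mean(), shifted - shifted.mean())  # correlation
        if score > best_score:
            best_shift, best_score = s, score
    return best_shift  # a defocused lens yields a nonzero shift to correct

# Example: the right signal lags the left by 3 samples when out of focus.
x = np.linspace(0, 4 * np.pi, 128)
left_signal = np.sin(x)
right_signal = np.roll(left_signal, -3)
print(estimate_phase_shift(left_signal, right_signal))  # -> 3
```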
  • For the filter unit array 220 in the embodiment of the present application, since at least one mixed-color filter unit is provided in it, the way the target image of the photographed object is determined from the incoming light signals also differs from that of an image sensor that includes only monochromatic filter units (for example, an image sensor using a filter with a Bayer array).
  • Specifically, still taking a filter unit array 220 including filter units of the four colors white, red, blue and green as an example, Fig. 15 shows a schematic diagram of the process of determining the corresponding target image.
  • The process may be executed by a processor, for example by a processor included in the electronic device where the image sensor 200 is located; or, taking Fig. 1 as an example, it may also be executed by the processor 103 in the image processing apparatus.
  • Light passes through the filter unit array 220 of the embodiments of the present application, for example a filter unit array 220 having any one of the filter unit groups shown in Figs. 5-14; here a filter unit array 220 composed of the first filter unit group in Fig. 5 is taken as an example. The obtained image data includes white (W) pixels corresponding to the white filter units, R pixels corresponding to the red filter units, G pixels corresponding to the green filter units, and B pixels corresponding to the blue filter units, as shown in the first image in Fig. 15.
  • First, the W pixels can be separated from the other pixels to obtain two images: the monochromatic light signals that passed through the various monochromatic filter units are used to generate one image, for example the image on the right of the second row in Fig. 15, which includes the R pixels, G pixels, and B pixels; the mixed-color light signals that passed through the mixed-color filter units are used to generate another image, for example the image on the left of the second row in Fig. 15, which includes only the W pixels.
  • For the separated image corresponding to the monochromatic filter units, as shown in Fig. 15, it can be converted into an RGGB combination, that is, first image data is generated, and the first image data is color image data. For example, a customized remosaic algorithm can be used to convert the image on the right of the second row in Fig. 15 into the first image data corresponding to the image on the right of the third row.
  • For the separated image corresponding to the mixed-color filter units, the positions where W pixels are missing can be filled by an interpolation algorithm to obtain second image data with full pixel coverage; the first image data and the second image data have the same resolution, and the second image data is a grayscale image.
  • The first image data and the second image data are then synthesized to obtain the target image corresponding to the shot.
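  • The following is a minimal sketch of the separation and interpolation steps of this flow, assuming a simple checkerboard layout of W pixels (as in the illustrative group sketched earlier); the customized remosaic and the final synthesis are not specified in the text and are therefore only stubbed as hypothetical function names.

```python
# Minimal sketch of the Fig. 15 flow: split the raw mosaic into a W image and an
# RGB mosaic, then fill the missing W positions by interpolation. The 50%-white
# checkerboard mask is an assumption for illustration; remosaic() and fuse()
# below are hypothetical stubs, not functions described by the patent.
import numpy as np
from scipy.signal import convolve2d

def split_channels(raw: np.ndarray, w_mask: np.ndarray):
    """Separate W samples from the monochromatic (R/G/B) samples."""
    return raw * w_mask, raw * (1 - w_mask)

def interpolate_white(w_samples: np.ndarray, w_mask: np.ndarray) -> np.ndarray:
    """Fill missing W positions with the average of valid 4-neighbors."""
    kernel = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], dtype=float)
    summed = convolve2d(w_samples, kernel, mode="same", boundary="symm")
    counts = convolve2d(w_mask.astype(float), kernel, mode="same", boundary="symm")
    filled = np.divide(summed, counts, out=np.zeros_like(summed), where=counts > 0)
    return np.where(w_mask > 0, w_samples, filled)  # second image data (grayscale)

h, w = 8, 8
raw = np.random.rand(h, w)
yy, xx = np.mgrid[0:h, 0:w]
w_mask = ((yy + xx) % 2 == 0).astype(float)         # assumed checkerboard W layout
w_img, rgb_mosaic = split_channels(raw, w_mask)
second_image = interpolate_white(w_img, w_mask)      # full-resolution grayscale image
# first_image = remosaic(rgb_mosaic)                 # customized remosaic, not specified
# target_image = fuse(first_image, second_image)     # final synthesis, not specified
```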
  • Optionally, the distribution and proportion of the mixed-color filter units in the filter unit array 220 can be set according to the various factors in the above process of determining the target image. For example, when generating the first image data, if the number of monochromatic filter units is too small, that is, the number of RGB pixels is too small, there will be a serious color difference between the final target image and the actual photographed object, so the proportion of monochromatic filter units in the filter unit array 220 should not be too low; for another example, since adding mixed-color filter units increases the overall light input of the image sensor, the proportion of mixed-color filter units should not be too low either.
  • Taking these factors together, a proportion of about 50% white filter units is usually a suitable choice. For example, the two filter unit groups shown in Fig. 5 can be used, in which the proportion of white filter units is 50% and their distribution is relatively uniform, and the distribution of red, green and blue filter units is also relatively even, so the image obtained for the corresponding shot has a better effect.
  • Therefore, in the image sensor of the embodiments of the present application, mixed-color filter units are added to the filter unit array, for example white filter units, which can effectively increase the overall light input of the image sensor. For example, adding 25% white filter units increases the light input of the entire sensor by about 30%, and adding 50% white filter units increases it by about 60%.
  • In addition, by reasonably setting the distribution of the filter units of each color in the filter unit array, mixed-color filter units such as white can be given a higher spatial sampling rate, which is beneficial to subsequently obtaining better high-resolution grayscale images; it also ensures that the R pixels, G pixels, and B pixels have a relatively even spatial sampling rate, which is beneficial to the subsequent remosaic algorithm for obtaining color images.
  • Those skilled in the art can understand that the disclosed system, device, and method can be implemented in other ways.
  • The device embodiments described above are merely illustrative; for example, the division of the units is only a logical function division, and there may be other divisions in actual implementation, for example, multiple units or components may be combined or integrated into another system, or some features may be ignored or not implemented.
  • The mutual coupling or direct coupling or communication connection shown or discussed may be indirect coupling or communication connection through some interfaces, devices or units, and may be in electrical, mechanical or other forms.
  • The units described as separate components may or may not be physically separated, and the components displayed as units may or may not be physical units; that is, they may be located in one place or distributed over multiple network units. Some or all of the units may be selected according to actual needs to achieve the objectives of the solutions of the embodiments.
  • In addition, the functional units in the various embodiments of the present application may be integrated into one processing unit, each unit may exist alone physically, or two or more units may be integrated into one unit.
  • If the functions are implemented in the form of a software functional unit and sold or used as an independent product, they can be stored in a computer-readable storage medium. Based on this understanding, the technical solution of the present application, in essence or in the part that contributes to the existing technology, or a part of the technical solution, can be embodied in the form of a software product; the computer software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) to execute all or part of the steps of the methods described in the various embodiments of the present application.
  • The aforementioned storage media include media that can store program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Color Television Image Signal Generators (AREA)
  • Solid State Image Pick-Up Elements (AREA)

Abstract

The embodiments of the present application relate to an image sensor and an electronic device. The image sensor includes: a microlens array; a filter unit array located below the microlens array, where each microlens corresponds to at least one filter unit, the filter unit array includes a plurality of filter unit groups with the same color distribution, and each filter unit group includes at least one mixed-color filter unit and at least one monochromatic filter unit; and a photosensitive unit array located below the filter unit array, where the photosensitive units correspond one-to-one to the filter units, the photosensitive unit corresponding to a monochromatic filter unit is used to receive the monochromatic light signal passing through that monochromatic filter unit, the photosensitive unit corresponding to a mixed-color filter unit is used to receive the mixed-color light signal passing through that mixed-color filter unit, and the monochromatic light signals and the mixed-color light signals are used to generate a target image of the photographed object. The image sensor and electronic device of the embodiments of the present application can improve photographing performance in a low-light environment.

Description

Image Sensor and Electronic Device
This application claims priority to the Chinese invention application No. 202010410639.2, entitled "图像传感器和电子设备" (Image Sensor and Electronic Device), filed with the Chinese Patent Office on May 15, 2020, the entire contents of which are incorporated into this application by reference.
Technical Field
The embodiments of the present application relate to the field of imaging, and more specifically, to an image sensor and an electronic device.
Background
Imaging systems in electronic devices usually rely on image sensors to establish an electronic display of visual images. Examples of such image sensors include charge-coupled device (CCD) image sensors and active pixel sensor (APS) devices; because APS devices can be manufactured in a Complementary Metal Oxide Semiconductor (CMOS) process, APS devices are often also called CMOS sensors.
These image sensors include multiple photosensitive pixels, often arranged in a regular pattern of rows and columns. In order to capture color images, light signals of specific wavelengths need to be accumulated at different pixels, that is, each pixel receives a signal of a specific color, so color filters are installed in the image sensor. For example, a filter with a Bayer array configured to include each of the colors red, green and blue (RGB) is generally used.
In order to make different pixels in the pixel array sensitive to only part of the visible spectrum, the color filters need to be set to different colors so that only light of the corresponding color passes; this reduces the amount of light reaching each photosensitive pixel and therefore the light sensitivity of each photosensitive pixel. In addition, when used in mobile devices, the size of the image sensor is usually limited and the photosensitive area of the corresponding pixel array is also limited, so photographing performance in a low-light environment is limited.
Summary
The present application provides an image sensor and an electronic device, which can improve photographing performance in a low-light environment.
In a first aspect, an image sensor is provided, including: a microlens array for converging light signals returned from a photographed object onto a filter unit array; the filter unit array, located below the microlens array, where each microlens in the microlens array corresponds to at least one filter unit, the filter unit array includes a plurality of filter unit groups with the same color distribution, and each of the plurality of filter unit groups includes at least one mixed-color filter unit and at least one monochromatic filter unit; and a photosensitive unit array, located below the filter unit array, where the photosensitive units in the photosensitive unit array correspond one-to-one to the filter units in the filter unit array, the photosensitive unit corresponding to a monochromatic filter unit in the photosensitive unit array is used to receive the monochromatic light signal passing through that monochromatic filter unit, the photosensitive unit corresponding to a mixed-color filter unit in the photosensitive unit array is used to receive the mixed-color light signal passing through that mixed-color filter unit, and the monochromatic light signals and the mixed-color light signals are used to generate a target image of the photographed object.
Therefore, in the image sensor of the embodiments of the present application, mixed-color filter units may be provided in the filter unit array; a mixed-color filter unit allows polychromatic light formed by mixing two or more monochromatic lights to pass through. For example, the mixed-color filter unit may be a white filter unit. Compared with a filter unit array provided with only monochromatic filter units, a filter unit array including such mixed-color filter units greatly increases the amount of incoming light; accordingly, the amount of light entering the entire image sensor is increased, so that the performance of the image sensor is not affected even in a low-light environment.
With reference to the first aspect, in an implementation of the first aspect, each filter unit group includes at least one white filter unit, at least one red filter unit, at least one green filter unit and at least one blue filter unit.
With reference to the first aspect and the foregoing implementations, in another implementation of the first aspect, the proportion of white filter units in each filter unit group is greater than or equal to 25%.
With reference to the first aspect and the foregoing implementations, in another implementation of the first aspect, the proportion of white filter units in each filter unit group is less than or equal to 75%.
With reference to the first aspect and the foregoing implementations, in another implementation of the first aspect, the proportion of white filter units in each filter unit group is 25%, 37.5%, 50%, 62.5% or 75%.
With reference to the first aspect and the foregoing implementations, in another implementation of the first aspect, the ratio of red filter units, green filter units and blue filter units in each filter unit group is 1:2:1 or 1:1:1.
With reference to the first aspect and the foregoing implementations, in another implementation of the first aspect, each filter unit group includes 4*4, 4*8, 6*6 or 8*8 filter units.
With reference to the first aspect and the foregoing implementations, in another implementation of the first aspect, the microlenses in the microlens array correspond one-to-one to the filter units in the filter unit array.
With reference to the first aspect and the foregoing implementations, in another implementation of the first aspect, the microlens array includes at least one first microlens and at least one second microlens, the first microlens corresponds to one filter unit in the filter unit array, and the second microlens corresponds to a plurality of filter units in the filter unit array.
With reference to the first aspect and the foregoing implementations, in another implementation of the first aspect, the plurality of filter units are 2*2 filter units.
With reference to the first aspect and the foregoing implementations, in another implementation of the first aspect, the plurality of filter units have the same color.
With reference to the first aspect and the foregoing implementations, in another implementation of the first aspect, the plurality of filter units are all white filter units.
With reference to the first aspect and the foregoing implementations, in another implementation of the first aspect, the plurality of filter units belong to the same filter unit group.
With reference to the first aspect and the foregoing implementations, in another implementation of the first aspect, the monochromatic light signals are used to generate first image data of the photographed object, the mixed-color light signals are used to generate second image data of the photographed object, and the first image data and the second image data are used to synthesize the target image of the photographed object.
With reference to the first aspect and the foregoing implementations, in another implementation of the first aspect, the first image data and the second image data have the same resolution.
With reference to the first aspect and the foregoing implementations, in another implementation of the first aspect, the second image data is generated through an interpolation algorithm.
In a second aspect, an electronic device is provided, including the image sensor in the first aspect or any possible implementation of the first aspect.
With reference to the second aspect, in an implementation of the second aspect, the electronic device further includes: a processing unit configured to generate the target image of the photographed object according to the monochromatic light signals and the mixed-color light signals.
With reference to the second aspect and the foregoing implementation, in another implementation of the second aspect, the processing unit is configured to: generate first image data of the photographed object according to the monochromatic light signals; generate second image data of the photographed object according to the mixed-color light signals; and synthesize the first image data and the second image data into the target image of the photographed object.
With reference to the second aspect and the foregoing implementations, in another implementation of the second aspect, the first image data and the second image data have the same resolution.
With reference to the second aspect and the foregoing implementations, in another implementation of the second aspect, the processing unit is configured to generate the second image data of the photographed object through an interpolation algorithm according to the mixed-color light signals.
Therefore, in the image sensor and the electronic device of the embodiments of the present application, mixed-color filter units, for example white filter units, are added to the filter unit array, which can effectively increase the overall light input of the image sensor. For example, adding 25% white filter units increases the light input of the entire sensor by about 30%, and adding 50% white filter units increases it by about 60%. In addition, by reasonably setting the distribution of the filter units of each color in the filter unit array 220, mixed-color filter units such as white can be given a relatively high spatial sampling rate, which is beneficial to subsequently obtaining better high-resolution grayscale images; it can also ensure that the R pixels, G pixels and B pixels have a relatively even spatial sampling rate, which is conducive to subsequently obtaining color images.
Brief Description of the Drawings
Fig. 1 is a schematic block diagram of an image processing apparatus according to an embodiment of the present application.
Fig. 2 is a schematic diagram of an image sensor according to an embodiment of the present application.
Fig. 3 is a schematic diagram of an image sensor according to another embodiment of the present application.
Fig. 4 is a schematic diagram of an image sensor according to still another embodiment of the present application.
Figs. 5-14 are schematic diagrams of the color distribution of different filter unit groups according to embodiments of the present application.
Fig. 15 is a schematic diagram of determining a target image of a photographed object according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described below with reference to the accompanying drawings.
An image processing apparatus uses the photoelectric conversion function of a pixel array to convert the light image of an imaging object into an electrical signal proportional to the light image, thereby obtaining an image of the imaging object. Fig. 1 shows a schematic block diagram of an image processing apparatus 100. The image processing apparatus 100 may refer to any electronic device; for example, the image processing apparatus 100 may be a mobile phone, or the image processing apparatus 100 may be a part of an electronic device, for example a camera module in an electronic device, and the embodiment of the present application is not limited thereto.
As shown in Fig. 1, the image processing apparatus 100 generally includes a pixel array 101 (which may also be referred to as a photoelectric conversion unit 101 or an image sensor 101), a signal reading circuit 102, a signal processor 103, a controller 104, an interface circuit 105 and a power supply 106. The electrical signal output terminal of the pixel array 101 is connected to the input terminal of the signal reading circuit 102, the control terminal of the pixel array 101 is connected to the output terminal of the controller 104, and the output terminal of the signal reading circuit 102 is connected to the input terminal of the signal processor 103; the power supply 106 is used to supply power to the signal reading circuit 102, the signal processor 103, the controller 104 and the interface circuit 105.
The pixel array 101 is used to collect the light signal returned via the imaging object and convert the light signal into an electrical signal, reflecting the light image of the imaging object through the strength of the electrical signal. The signal reading circuit 102 is used to read the electrical signal output by each pixel. The signal processor 103 is used to perform analog-to-digital conversion on the electrical signals output by the pixel array and output image data of the imaging object. The interface circuit 105 is used to transmit the image data to the outside. The controller 104 is used to output a control signal, which is used to control the pixels in the pixel array to work together.
The core component of the image processing apparatus 100 is the pixel array 101. Each pixel in the pixel array 101 has a similar structure; typically, each pixel structure may include a lens (or microlens), a color filter, and a photosensitive element, where the lens is located above the filter and the filter is located above the photosensitive element. The light returned from the imaging object is focused by the lens, exits from the lens exit area, is filtered by the filter, and then enters a photosensitive element such as a photodiode (PD), which converts the light signal into an electrical signal. Depending on the type of light that different filters can pass, the pixels may include red pixels (hereinafter referred to as R pixels), green pixels (hereinafter referred to as G pixels) and blue pixels (hereinafter referred to as B pixels). An R pixel means that only red light enters the photosensitive element after being filtered by the filter; the principles of the G pixel and the B pixel are the same as those of the R pixel and will not be repeated here.
The principle by which the image sensor generates color image data is as follows: each pixel in the pixel array can convert only one type of light signal into an electrical signal, and an interpolation operation combining the light signals collected by surrounding pixels of other types can then restore the image color of the area sampled by the current pixel. This process is also called demosaicing and is usually completed in the processor. For example, if the current pixel is an R pixel, the R pixel can convert only the red light signal into an electrical signal; the electrical signals collected by the surrounding B pixels and G pixels can then be combined to restore the blue and green light intensity of the current pixel and determine the image color of the current pixel.
Therefore, in order to capture color images, a color filter with a specific color arrangement, also called a color filter array (CFA), needs to be arranged above the photosensitive element array included in the pixel array. At present, for most pixel arrays, such as CCD and CMOS image sensors, the CFA adopts the Bayer format based on the three primary colors RGB. The characteristic of the Bayer format is that its basic unit is a 2×2 four-pixel array, including one red pixel R, one blue pixel B and two green pixels G, where the two green pixels G are arranged diagonally adjacent with a common vertex. Since any pixel can actually obtain only the signal of one of the RGB colors, restoring the complete color information must be realized through a specific image processing algorithm.
Such a pure-RGB Bayer layout allows only light of specific colors to pass, that is, it blocks most of the photons, so in a low-light environment the image may not be restored accurately. Therefore, an embodiment of the present application proposes an image sensor that can solve this problem.
Fig. 2 shows a schematic diagram of an image sensor 200 according to an embodiment of the present application. For example, the image sensor 200 may correspond to the pixel array 101 in Fig. 1, and the image sensor 200 is suitable for a CCD or CMOS structure. Specifically, as shown in Fig. 2, the image sensor 200 may include a microlens array 210, a filter unit array 220 and a photosensitive unit array 230.
The microlens array 210 is used to converge the light signal returned by the photographed object onto the filter unit array 220; that is, light irradiates the photographed object to produce returned light, and the microlenses in the microlens array 210 converge the returned light signal onto the filter unit array 220 below.
The filter unit array 220 is located below the microlens array 210, that is, between the photosensitive unit array 230 and the microlens array 210; for example, the filter unit array 220 may be located on the lower surface of the microlens array 210. Each microlens in the microlens array 210 corresponds to at least one filter unit in the filter unit array 220. The filter unit array 220 may include filter units of multiple colors; for example, the filter units included in the filter unit array 220 may be divided into multiple groups according to their color distribution, that is, the filter unit array 220 includes a plurality of filter unit groups, where the plurality of filter unit groups have the same color distribution.
Specifically, each of the plurality of filter unit groups includes at least one mixed-color filter unit and at least one monochromatic filter unit. A monochromatic filter unit is used to filter the light passing through the microlens and output the corresponding monochromatic light; a mixed-color filter unit is used to filter the light passing through the microlens and output the corresponding mixed-color light.
It should be understood that monochromatic light in the embodiments of the present application refers to the spectral colored light separated from white light or sunlight by the refraction of a triangular prism, for example the seven colors red, orange, yellow, green, blue, indigo and violet. The separated spectral colored light is not decomposed into other colored light when it passes through a prism again; such colored light that can no longer be decomposed is monochromatic light, whereas mixed light refers to polychromatic light formed by mixing any two or more monochromatic lights. For example, the monochromatic light in the embodiments of the present application may be red light, green light and blue light, and the mixed light may be white light. Correspondingly, the red filter unit allows only red light to pass through and therefore outputs red light; similarly, the green filter unit is used to output green light, the blue filter unit is used to output blue light, and the white filter unit is used to output white light, but the embodiments of the present application are not limited to this.
The photosensitive unit array 230 is located below the filter unit array 220, and the photosensitive units in the photosensitive unit array 230 correspond one-to-one to the filter units in the filter unit array 220. The photosensitive unit corresponding to a monochromatic filter unit in the photosensitive unit array is used to receive the monochromatic light signal passing through that monochromatic filter unit and correspondingly output a monochromatic pixel; the photosensitive unit corresponding to a mixed-color filter unit in the photosensitive unit array 230 is used to receive the mixed-color light signal passing through that mixed-color filter unit and correspondingly output a mixed-color pixel. The monochromatic light signals and the mixed-color light signals are jointly used to generate the target image of the photographed object. For example, the photosensitive unit corresponding to a red filter unit receives the red light signal, and the correspondingly output pixel may be called a red pixel; the photosensitive unit corresponding to a white filter unit receives the white light signal, and the correspondingly output pixel may be called a white pixel.
It should be understood that the image sensor 200 in the embodiments of the present application may also include other parts. For example, Figs. 3 and 4 show other schematic diagrams of the image sensor 200. As shown in Figs. 3 and 4, a dielectric layer 240 may also be included between the filter unit array 220 and the photosensitive unit array 230.
As shown in Figs. 3 and 4, the filter unit array 220 may also include a surrounding dielectric 225 and a reflective grid 226; the photosensitive unit array 230 may include a semiconductor substrate 231 and photosensitive elements 232, where the photosensitive elements 232 are located in the semiconductor substrate 231 and each photosensitive element 232 may be a PD. Optionally, the photosensitive unit array 230 may also include an isolation region 233 between two photosensitive elements 232, but the embodiment of the present application is not limited thereto.
The filter unit array 220 in the image sensor 200 of the embodiment of the present application may include mixed-color filter units, which allow polychromatic light formed by a mixture of two or more monochromatic lights to pass; for example, the mixed-color filter unit may be a white filter unit. Compared with a filter unit array provided with only monochromatic filter units, a filter unit array 220 including such mixed-color filter units greatly increases the amount of incoming light; accordingly, the amount of light entering the entire image sensor 200 is increased, so that the performance of the image sensor 200 is not affected even in a low-light environment.
The following is a detailed description from the perspective of the filter unit array 220.
Specifically, the filter unit array 220 in the embodiment of the present application includes multiple filter unit groups, and the color distributions of the multiple filter unit groups are the same; that is, the filter unit array 220 includes multiple repeated filter unit groups, and these repeated filter unit groups are tiled so as to cover the entire surface of the photosensitive unit array in the image sensor 200. For ease of description, the filter unit group described below takes the smallest repeating unit group in the filter unit array 220 as an example; the smallest repeating unit group means that no repeating unit group of another size has fewer filter units. A plurality of smallest repeating unit groups are tiled so as to cover the entire surface of the photosensitive unit array in the image sensor 200, but the embodiment of the present application is not limited thereto.
Figs. 5 to 14 show schematic diagrams of filter unit groups according to embodiments of the present application, in which filter units of the same color are marked in the same way in Figs. 5-14: squares filled with diagonal lines indicate red, squares filled with cross lines indicate green, and squares filled with dots indicate green; moreover, in Figs. 5-14 each small square represents a filter unit and each large square represents a filter unit group. It should be understood that other structures may also be provided between adjacent filter units in the embodiments of the present application; for example, as shown in Figs. 3 and 4, a dielectric 225 and a reflective grid 226 may be provided between adjacent filter units, but in Figs. 5-14 the distance between adjacent filter units is ignored, that is, Figs. 5-14 mainly show the color distribution of the filter unit groups included in the filter unit array 220.
Specifically, the size of each filter unit group in the embodiments of the present application, that is, the number of filter units included in each filter unit group, can be set according to the actual application and can be set to any size. For example, as shown in Figs. 5 to 10, the size of each filter unit group may be 4*4; or, as shown in Fig. 11, the size of each filter unit group may be 4*8; or, as shown in Fig. 12, the size of each filter unit group may be 6*6; or, as shown in Figs. 13 and 14, the size of each filter unit group may also be 8*8. That is, each filter unit group may include 4*4, 4*8, 6*6 or 8*8 filter units, but the embodiment of the present application is not limited thereto.
It should be understood that each filter unit group in the embodiments of the present application may include at least one mixed-color filter unit and at least one monochromatic filter unit. Optionally, the following description takes the mixed-color filter unit being a white filter unit and the monochromatic filter units including a red filter unit, a green filter unit and a blue filter unit as an example; that is, each filter unit group is described as including at least one white filter unit, at least one red filter unit, at least one green filter unit and at least one blue filter unit, but the embodiment of the present application is not limited to this.
For example, Fig. 5 shows two filter unit groups, left and right; for either of these two filter unit groups, the filter unit group includes a red filter unit 221, a blue filter unit 222, a green filter unit 223 and a white filter unit 224. Alternatively, as shown in Figs. 6-14, each filter unit group is configured to include filter units of the four colors, namely the red filter unit 221, the blue filter unit 222, the green filter unit 223 and the white filter unit 224, and different distributions of the four color filter units are listed.
It should be understood that the proportions of filter units of different colors in each filter unit group in the embodiments of the present application can be set according to the actual application and can be set to any value. For example, taking the white filter unit as an example, the proportion of white filter units in each filter unit group can be set to be greater than or equal to 25%, and can also be set to be less than or equal to 75%. Specifically, still taking Fig. 5 as an example, the proportion of white filter units in each filter unit group can be set to 50%; or, as shown in Fig. 6, the proportion can also be set to 25%; as shown in Fig. 7, the proportion can also be set to 37.5%; as shown in Fig. 8 or Figs. 10-14, the proportion can also be set to 50%; as shown in Fig. 9, the proportion can also be set to 62.5%; as shown in Fig. 10, the proportion can also be set to 75%; the embodiment of the present application is not limited to this.
It should be understood that the ratio between the monochromatic filter units of different colors included in each filter unit group in the embodiments of the present application can also be set according to the actual application and can be set to any value. For example, the ratio of red filter units, green filter units and blue filter units in each filter unit group can usually be 1:2:1 (for example, Fig. 5 or Fig. 10) or 1:1:1 (as shown in some of the filter unit groups in Figs. 6-9 and 11-14); the embodiment of the present application is not limited to this.
It should be understood that Figs. 5-14 in the embodiments of the present application illustrate the distribution of filter units of different colors in filter unit groups of different sizes by way of example, and other possible color distributions are not excluded. For example, taking a 4*4 filter unit group as an example, as shown in Fig. 6, when the proportion of white filter units is 25%, the color distribution of the filter unit group can be as shown in any one of the four filter unit groups in Fig. 6, or other distributions can also be used. For another example, as shown in Figs. 5 and 8, when the proportion of white filter units is 50%, the color distribution of the filter unit group can be as shown in any filter unit group in Fig. 5 or in Fig. 8, or other distributions can also be used. As shown in Figs. 5-14, the cases of filter unit groups of other sizes and other proportions of white filter units can be deduced by analogy and are not repeated here.
In the embodiments of the present application, the distribution of the microlens array 210 can be set corresponding to the filter unit array 220 located below it; for example, each microlens in the microlens array 210 can correspond to one or more filter units in the filter unit array 220 below.
Optionally, as one embodiment, the microlenses in the microlens array 210 and the filter units in the filter unit array 220 may correspond one-to-one. Specifically, as shown in Fig. 3, the microlens array 210 includes a plurality of first microlenses 211, and each first microlens 211 corresponds to one filter unit and also to one photosensitive unit; that is to say, the image sensor 200 has a pixel array in which each pixel includes a first microlens 211, a filter unit and a photosensitive unit.
Optionally, as another embodiment, at least one microlens in the microlens array 210 may also correspond to a plurality of filter units in the filter unit array 220. For example, the microlens array 210 may include a plurality of second microlenses 212, each second microlens 212 corresponding to a plurality of filter units; for example, each second microlens 212 may correspond to 2*2 filter units, and each filter unit corresponds to one photosensitive unit.
For another example, the microlens array 210 may also include at least one first microlens 211 and at least one second microlens 212, where each first microlens 211 corresponds to one filter unit in the filter unit array 220, and each second microlens 212 corresponds to a plurality of filter units in the filter unit array 220. For example, as shown in Fig. 4, the microlens array 210 includes a plurality of first microlenses 211 corresponding one-to-one to filter units, and also includes at least one second microlens 212 corresponding to 2*2 filter units.
For a second microlens 212 corresponding to a plurality of filter units, the number of filter units corresponding to the second microlens 212 can be set according to the actual application and can be set to any value. For example, the second microlens 212 may correspond to 2*2 filter units or 1*2 filter units; the embodiment of the present application is not limited thereto.
In addition, the plurality of filter units corresponding to the same second microlens 212 may have the same or different colors. For example, as shown in Fig. 4, the 2*2 filter units corresponding to a second microlens 212 may all be white filter units, where the 2*2 white filter units may belong to the same filter unit group or may not belong to the same filter unit group; for example, the 2*2 filter units corresponding to the second microlens 212 may also belong to two or more adjacent filter unit groups. The embodiment of the present application is not limited to this.
When a plurality of mixed-color filter units are arranged to correspond to the same second microlens, the electrical signals converted from the mixed light exiting from different exit areas of that second microlens can be used to calculate a phase difference between the electrical signals, so that the focal length of the image sensor can be adjusted according to the phase difference.
It should be understood that, for the filter unit array 220 in the embodiment of the present application, since at least one mixed-color filter unit is provided in it, the way the target image of the photographed object is determined from the input light signals also differs from that of an image sensor that includes only monochromatic filter units (for example, an image sensor using a filter with a Bayer array).
Specifically, still taking the filter unit array 220 of the embodiment of the present application including filter units of the four colors white, red, blue and green as an example, Fig. 15 shows a schematic diagram of the process of determining the target image corresponding to the shot. The process may be executed by a processor, for example by a processor included in the electronic device where the image sensor 200 is located; or, taking Fig. 1 as an example, it may also be executed by the processor 103 in the image processing apparatus. As shown in Fig. 15, light passes through the filter unit array 220 of the embodiment of the present application, for example a filter unit array 220 having any one of the filter unit groups shown in Figs. 5-14; here a filter unit array 220 composed of the first filter unit group in Fig. 5 is taken as an example. The obtained image data includes white (W) pixels corresponding to the white filter units, R pixels corresponding to the red filter units, G pixels corresponding to the green filter units, and B pixels corresponding to the blue filter units, as shown in the first image in Fig. 15.
First, as shown in Fig. 15, the W pixels can be separated from the other pixels to obtain two images. That is, the monochromatic light signals that passed through the various monochromatic filter units are used to generate one image, for example the image on the right of the second row in Fig. 15, which includes R pixels, G pixels and B pixels; the mixed-color light signals that passed through the mixed-color filter units are used to generate another image, for example the image on the left of the second row in Fig. 15, which includes only W pixels.
For the separated image corresponding to the monochromatic filter units, as shown in Fig. 15, it can be converted into an RGGB combination, that is, first image data is generated, and the first image data is color image data. For example, a customized remosaic algorithm can be used to convert the image on the right of the second row in Fig. 15 into the first image data corresponding to the image on the right of the third row.
For the separated image corresponding to the mixed-color filter units, as shown in Fig. 15, the positions where W pixels are missing can be filled by an interpolation algorithm to obtain second image data with full pixel coverage; the first image data and the second image data have the same resolution, and the second image data is a grayscale image.
The first image data and the second image data are synthesized to obtain the target image corresponding to the shot.
Optionally, the distribution and proportion of the mixed-color filter units in the filter unit array 220 can be set according to the various factors in the above process of determining the target image. For example, when considering the determination of the first image data, if the number of monochromatic filter units is too small, that is, the number of RGB pixels is too small, there will be a serious color difference between the final target image and the actual photographed object, so the proportion of monochromatic filter units in the filter unit array 220 should not be too low; for another example, considering that adding mixed-color filter units can increase the overall light input of the image sensor, the proportion of mixed-color filter units should not be too low either. On the whole, a proportion of about 50% white filter units is usually a suitable choice. For example, the two filter unit groups shown in Fig. 5 can be used, in which the proportion of white filter units is 50% and the distribution is relatively uniform, and the distribution of red, green and blue filter units is also relatively even, so the image obtained for the corresponding shot has a better effect.
Therefore, in the image sensor of the embodiments of the present application, mixed-color filter units, for example white filter units, are added to the filter unit array, which can effectively increase the overall light input of the image sensor. For example, adding 25% white filter units increases the light input of the entire sensor by about 30%, and adding 50% white filter units increases it by about 60%. In addition, by reasonably setting the distribution of the filter units of each color in the filter unit array 220, mixed-color filter units such as white can be given a relatively high spatial sampling rate, which is beneficial to subsequently obtaining better high-resolution grayscale images; it can also ensure that the R pixels, G pixels and B pixels have a relatively even spatial sampling rate, which is beneficial to the subsequent remosaic algorithm for obtaining color images.
Those of ordinary skill in the art may realize that the units and algorithm steps of the examples described in connection with the embodiments disclosed herein can be implemented in electronic hardware, or in a combination of computer software and electronic hardware. Whether these functions are executed in hardware or software depends on the specific application and design constraints of the technical solution. Professionals may use different methods for each specific application to implement the described functions, but such implementations should not be considered beyond the scope of the present application.
Those skilled in the art can clearly understand that, for convenience and brevity of description, the specific working processes of the systems, devices and units described above may refer to the corresponding processes in the foregoing method embodiments and are not repeated here.
In the several embodiments provided in the present application, it should be understood that the disclosed systems, devices and methods can be implemented in other ways. For example, the device embodiments described above are merely illustrative; for example, the division of the units is only a logical function division, and there may be other divisions in actual implementation, for example, multiple units or components may be combined or integrated into another system, or some features may be ignored or not executed. In addition, the mutual coupling or direct coupling or communication connection shown or discussed may be indirect coupling or communication connection through some interfaces, devices or units, and may be in electrical, mechanical or other forms.
The units described as separate components may or may not be physically separated, and the components displayed as units may or may not be physical units; that is, they may be located in one place or distributed over multiple network units. Some or all of the units may be selected according to actual needs to achieve the objectives of the solutions of the embodiments.
In addition, the functional units in the various embodiments of the present application may be integrated into one processing unit, each unit may exist alone physically, or two or more units may be integrated into one unit.
If the functions are implemented in the form of a software functional unit and sold or used as an independent product, they can be stored in a computer-readable storage medium. Based on this understanding, the technical solution of the present application, in essence or in the part that contributes to the existing technology, or a part of the technical solution, can be embodied in the form of a software product; the computer software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) to execute all or part of the steps of the methods described in the various embodiments of the present application. The aforementioned storage media include media that can store program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk or an optical disk.
The above are only specific implementations of the present application, but the protection scope of the present application is not limited thereto. Any person skilled in the art can easily think of changes or substitutions within the technical scope disclosed in the present application, and these should all be covered within the protection scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (21)

  1. An image sensor, characterized by comprising:
    a microlens array for converging light signals returned from a photographed object onto a filter unit array;
    the filter unit array, located below the microlens array, wherein each microlens in the microlens array corresponds to at least one filter unit, the filter unit array comprises a plurality of filter unit groups with the same color distribution, and each of the plurality of filter unit groups comprises at least one mixed-color filter unit and at least one monochromatic filter unit; and
    a photosensitive unit array, located below the filter unit array, wherein the photosensitive units in the photosensitive unit array correspond one-to-one to the filter units in the filter unit array, the photosensitive unit corresponding to a monochromatic filter unit in the photosensitive unit array is configured to receive the monochromatic light signal passing through the monochromatic filter unit, the photosensitive unit corresponding to a mixed-color filter unit in the photosensitive unit array is configured to receive the mixed-color light signal passing through the mixed-color filter unit, and the monochromatic light signal and the mixed-color light signal are used to generate a target image of the photographed object.
  2. The image sensor according to claim 1, characterized in that each filter unit group comprises at least one white filter unit, at least one red filter unit, at least one green filter unit and at least one blue filter unit.
  3. The image sensor according to claim 2, characterized in that the proportion of white filter units in each filter unit group is greater than or equal to 25%.
  4. The image sensor according to claim 3, characterized in that the proportion of white filter units in each filter unit group is less than or equal to 75%.
  5. The image sensor according to claim 3 or 4, characterized in that the proportion of white filter units in each filter unit group is 25%, 37.5%, 50%, 62.5% or 75%.
  6. The image sensor according to any one of claims 2 to 5, characterized in that the ratio of red filter units, green filter units and blue filter units in each filter unit group is 1:2:1 or 1:1:1.
  7. The image sensor according to any one of claims 1 to 6, characterized in that each filter unit group comprises 4*4, 4*8, 6*6 or 8*8 filter units.
  8. The image sensor according to any one of claims 1 to 7, characterized in that the microlenses in the microlens array correspond one-to-one to the filter units in the filter unit array.
  9. The image sensor according to any one of claims 1 to 7, characterized in that the microlens array comprises at least one first microlens and at least one second microlens,
    the first microlens corresponds to one filter unit in the filter unit array, and
    the second microlens corresponds to a plurality of filter units in the filter unit array.
  10. The image sensor according to claim 9, characterized in that the plurality of filter units are 2*2 filter units.
  11. The image sensor according to claim 9 or 10, characterized in that the plurality of filter units have the same color.
  12. The image sensor according to claim 11, characterized in that the plurality of filter units are all white filter units.
  13. The image sensor according to claim 12, characterized in that the plurality of filter units belong to the same filter unit group.
  14. The image sensor according to any one of claims 1 to 13, characterized in that the monochromatic light signal is used to generate first image data of the photographed object, the mixed-color light signal is used to generate second image data of the photographed object, and the first image data and the second image data are used to synthesize the target image of the photographed object.
  15. The image sensor according to claim 14, characterized in that the first image data and the second image data have the same resolution.
  16. The image sensor according to claim 15, characterized in that the second image data is generated through an interpolation algorithm.
  17. An electronic device, characterized by comprising: the image sensor according to any one of claims 1-16.
  18. The electronic device according to claim 17, characterized in that the electronic device further comprises:
    a processing unit configured to generate the target image of the photographed object according to the monochromatic light signal and the mixed-color light signal.
  19. The electronic device according to claim 18, characterized in that the processing unit is configured to:
    generate first image data of the photographed object according to the monochromatic light signal;
    generate second image data of the photographed object according to the mixed-color light signal; and
    synthesize the first image data and the second image data into the target image of the photographed object.
  20. The electronic device according to claim 19, characterized in that the first image data and the second image data have the same resolution.
  21. The electronic device according to claim 20, characterized in that the processing unit is configured to:
    generate the second image data of the photographed object through an interpolation algorithm according to the mixed-color light signal.
PCT/CN2020/103268 2020-05-15 2020-07-21 Image sensor and electronic device WO2021227250A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202010410639 2020-05-15
CN202010410639.2 2020-05-15

Publications (1)

Publication Number Publication Date
WO2021227250A1 true WO2021227250A1 (zh) 2021-11-18

Family

ID=72202804

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/103268 WO2021227250A1 (zh) 2020-05-15 2020-07-21 Image sensor and electronic device

Country Status (2)

Country Link
CN (11) CN111756974A (zh)
WO (1) WO2021227250A1 (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114823985A (zh) * 2022-05-31 2022-07-29 深圳市聚飞光电股份有限公司 一种光电传感器及其封装方法

Families Citing this family (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114845015A (zh) * 2020-10-15 2022-08-02 Oppo广东移动通信有限公司 图像传感器、控制方法、成像装置、终端及可读存储介质
CN112312097B (zh) * 2020-10-29 2023-01-24 维沃移动通信有限公司 一种传感器
CN112822466A (zh) * 2020-12-28 2021-05-18 维沃移动通信有限公司 图像传感器、摄像模组和电子设备
CN113037980A (zh) * 2021-03-23 2021-06-25 北京灵汐科技有限公司 像素传感阵列和视觉传感器
CN115225832A (zh) * 2021-04-21 2022-10-21 海信集团控股股份有限公司 一种图像采集设备及图像加密处理方法、设备和介质
CN113540138B (zh) * 2021-06-03 2024-03-12 奥比中光科技集团股份有限公司 一种多光谱图像传感器及其成像模块
CN113676652B (zh) * 2021-08-25 2023-05-26 维沃移动通信有限公司 图像传感器、控制方法、控制装置、电子设备和存储介质
CN113676651B (zh) * 2021-08-25 2023-05-26 维沃移动通信有限公司 图像传感器、控制方法、控制装置、电子设备和存储介质
CN113852797A (zh) * 2021-09-24 2021-12-28 昆山丘钛微电子科技股份有限公司 色彩滤镜阵列、图像传感器以及摄像模组
CN114125318A (zh) * 2021-11-12 2022-03-01 Oppo广东移动通信有限公司 图像传感器、摄像模组、电子设备、图像生成方法和装置
CN114125240A (zh) * 2021-11-30 2022-03-01 维沃移动通信有限公司 图像传感器、摄像模组、电子设备及拍摄方法
CN114363486A (zh) * 2021-12-14 2022-04-15 Oppo广东移动通信有限公司 图像传感器、摄像模组、电子设备、图像生成方法和装置
CN114157795A (zh) * 2021-12-14 2022-03-08 Oppo广东移动通信有限公司 图像传感器、摄像模组、电子设备、图像生成方法和装置
CN115696078B (zh) * 2022-08-01 2023-09-01 荣耀终端有限公司 色彩滤波阵列、图像传感器、摄像头模组和电子设备

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105430359A (zh) * 2015-12-18 2016-03-23 广东欧珀移动通信有限公司 成像方法、图像传感器、成像装置及电子装置
CN105516697A (zh) * 2015-12-18 2016-04-20 广东欧珀移动通信有限公司 图像传感器、成像装置、移动终端及成像方法
US20160119559A1 (en) * 2014-10-27 2016-04-28 Novatek Microelectronics Corp. Color Filter Array and Manufacturing Method thereof
CN105578078A (zh) * 2015-12-18 2016-05-11 广东欧珀移动通信有限公司 图像传感器、成像装置、移动终端及成像方法
CN105578071A (zh) * 2015-12-18 2016-05-11 广东欧珀移动通信有限公司 图像传感器的成像方法、成像装置和电子装置
CN107105140A (zh) * 2017-04-28 2017-08-29 广东欧珀移动通信有限公司 双核对焦图像传感器及其对焦控制方法和成像装置

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9479745B2 (en) * 2014-09-19 2016-10-25 Omnivision Technologies, Inc. Color filter array with reference pixel to reduce spectral crosstalk
CN105282529B (zh) * 2015-10-22 2018-01-16 浙江宇视科技有限公司 一种基于raw空间的数字宽动态方法及装置
US10313612B2 (en) * 2015-12-18 2019-06-04 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Image sensor, control method, and electronic device
CN105516700B (zh) * 2015-12-18 2018-01-19 广东欧珀移动通信有限公司 图像传感器的成像方法、成像装置和电子装置

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160119559A1 (en) * 2014-10-27 2016-04-28 Novatek Microelectronics Corp. Color Filter Array and Manufacturing Method thereof
CN105430359A (zh) * 2015-12-18 2016-03-23 广东欧珀移动通信有限公司 成像方法、图像传感器、成像装置及电子装置
CN105516697A (zh) * 2015-12-18 2016-04-20 广东欧珀移动通信有限公司 图像传感器、成像装置、移动终端及成像方法
CN105578078A (zh) * 2015-12-18 2016-05-11 广东欧珀移动通信有限公司 图像传感器、成像装置、移动终端及成像方法
CN105578071A (zh) * 2015-12-18 2016-05-11 广东欧珀移动通信有限公司 图像传感器的成像方法、成像装置和电子装置
CN107105140A (zh) * 2017-04-28 2017-08-29 广东欧珀移动通信有限公司 双核对焦图像传感器及其对焦控制方法和成像装置

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114823985A (zh) * 2022-05-31 2022-07-29 深圳市聚飞光电股份有限公司 一种光电传感器及其封装方法
CN114823985B (zh) * 2022-05-31 2024-04-09 深圳市聚飞光电股份有限公司 一种光电传感器及其封装方法

Also Published As

Publication number Publication date
CN212752389U (zh) 2021-03-19
CN111654615A (zh) 2020-09-11
CN111756972A (zh) 2020-10-09
CN212435794U (zh) 2021-01-29
CN111614886A (zh) 2020-09-01
CN111756973A (zh) 2020-10-09
CN111756974A (zh) 2020-10-09
CN111614886B (zh) 2021-10-19
CN212752379U (zh) 2021-03-19
CN212785522U (zh) 2021-03-23
CN111629140A (zh) 2020-09-04
CN212435793U (zh) 2021-01-29

Similar Documents

Publication Publication Date Title
WO2021227250A1 (zh) 图像传感器和电子设备
JP5118047B2 (ja) 高機能カラーフィルタモザイクアレイのためのシステムおよび方法
CN205792895U (zh) 成像系统
CN206727071U (zh) 图像传感器
CN102339839B (zh) 具有改良的光电二极管区域分配的cmos图像传感器
CN206759600U (zh) 成像系统
CN102365861B (zh) 在产生数字图像时曝光像素组
KR101442313B1 (ko) 카메라 센서 교정
CN100596168C (zh) 多透镜成像系统和方法
US8294797B2 (en) Apparatus and method of generating a high dynamic range image
EP1871091A2 (en) Camera Module
CN110649057B (zh) 图像传感器、摄像头组件及移动终端
US9159758B2 (en) Color imaging element and imaging device
US11659289B2 (en) Imaging apparatus and method, and image processing apparatus and method
CN102461175A (zh) 用于四通道彩色滤光片阵列的内插
TWI600927B (zh) 用於減少顏色混疊之彩色濾光器陣列圖案
KR20160065464A (ko) 컬러 필터 어레이, 이를 포함하는 이미지 센서 및 이를 이용한 적외선 정보 획득 방법
CN210143059U (zh) 图像传感器集成电路、图像传感器以及成像系统
EP2502422A1 (en) Sparse color pixel array with pixel substitutes
CN111818314A (zh) 一种滤波器阵列及图像传感器
CN104412581A (zh) 彩色摄像元件及摄像装置
CN111818283A (zh) 三角形像素的图像传感器、电子设备及成像方法
CN207251823U (zh) 成像设备和成像系统
CN212785636U (zh) 滤波器阵列、图像传感器及其应用设备
CN111989916B (zh) 成像设备和方法、图像处理设备和方法以及成像元件

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20935408

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20935408

Country of ref document: EP

Kind code of ref document: A1