CN111629140A - Image sensor and electronic device - Google Patents

Image sensor and electronic device

Info

Publication number
CN111629140A
Authority
CN
China
Prior art keywords
row, column, filter, units, pixel
Prior art date
Legal status
Pending
Application number
CN202010724148.5A
Other languages
Chinese (zh)
Inventor
程祥
王迎磊
宋锐男
张玮
Current Assignee
Shenzhen Goodix Technology Co Ltd
Original Assignee
Shenzhen Goodix Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Shenzhen Goodix Technology Co Ltd
Publication of CN111629140A

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/50: Constructional details
    • H04N 23/54: Mounting of pick-up tubes, electronic image sensors, deviation or focusing coils
    • H04N 23/55: Optical parts specially adapted for electronic image sensors; Mounting thereof
    • H04N 23/80: Camera processing pipelines; Components thereof

Abstract

The present application provides an image sensor and an electronic device that enable the finally generated image to be closer to the real effect. The image sensor includes a filter unit array and a pixel unit array. The filter unit array comprises a plurality of filter unit groups, and each of the plurality of filter unit groups comprises white filter units and color filter units. The pixel unit array is located below the filter unit array, and its pixel units correspond one to one with the filter units in the filter unit groups: the white pixel units receive the optical signals passing through their corresponding white filter units, and the color pixel units receive the optical signals passing through their corresponding color filter units. The light intensity information of the optical signals sensed by the pixel unit array is used to determine the generation mode in which the pixel values of the color pixel units and the pixel values of the white pixel units generate a target image of the photographed object.

Description

Image sensor and electronic device
The present application claims priority to Chinese patent application No. 202010410639.2, entitled "image sensor and electronic device", filed with the Chinese Patent Office on May 15, 2020, the entire contents of which are incorporated herein by reference.
Technical Field
The embodiments of the present application relate to the field of images, and more particularly, to an image sensor and an electronic device.
Background
Imaging systems in electronic devices typically rely on an image sensor to create an electronic representation of a viewable image. Examples of such image sensors include charge-coupled device (CCD) image sensors and Active Pixel Sensor (APS) devices, which are often referred to as CMOS sensors because they can be fabricated in a Complementary Metal Oxide Semiconductor (CMOS) process.
These image sensors include a plurality of light-sensitive pixels, often arranged in a regular pattern of rows and columns. To capture a color image, optical signals of specific wavelengths, that is, signals corresponding to specific colors, must be accumulated on different pixels, so a color filter is installed in the image sensor. For example, a filter with a Bayer array, which includes each of the colors red, green, and blue (RGB), is commonly used.
To make different pixels in the pixel array sensitive to only part of the visible spectrum, the color filters must be set to different colors so that each passes only the optical signal of the corresponding color. This reduces the amount of light reaching each light-sensitive pixel and thereby reduces its light sensitivity. In addition, when used in a mobile device, the image sensor is generally limited in size, so the photosensitive area of the pixel array is also limited, which limits photographing performance in a low-light environment.
Disclosure of Invention
The present application provides an image sensor and an electronic device that enable the finally generated image to be closer to the real effect.
In a first aspect, an image sensor is provided. The image sensor includes: a filter unit array comprising a plurality of filter unit groups, each of the plurality of filter unit groups comprising white filter units and color filter units; and a pixel unit array located below the filter unit array, where the pixel units in the pixel unit array correspond one to one with the filter units in the plurality of filter unit groups, the white pixel units in the pixel unit array are configured to receive the optical signals passing through their corresponding white filter units, the color pixel units in the pixel unit array are configured to receive the optical signals passing through their corresponding color filter units, and the light intensity information of the optical signals sensed by the pixel unit array is used to determine the generation mode in which the pixel values of the color pixel units and the pixel values of the white pixel units generate a target image of the photographed object.
Based on the above technical solution, the image sensor of the present application may include white filter units. Compared with a filter unit array provided with only monochromatic filter units, the amount of incident light is greatly increased, and accordingly the amount of light entering the entire image sensor also increases, so that the performance of the image sensor is maintained even in a low-light environment.
In addition, the image sensor can generate the target image with different fusion modes under different ambient light conditions, so that the finally generated target image is closer to the real effect.
In one possible implementation manner, each of the plurality of filter unit groups includes 4 × 4 filter units, a ratio of white filter units to color filter units in each of the filter unit groups is 1:1, and the white filter units and the color filter units are alternately arranged.
A 50% proportion of white filter units ensures that the white pixel units have a relatively high spatial sampling rate, which helps a subsequent Remosaic algorithm obtain a better high-resolution grayscale image.
In addition, the white filter units and the color filter units may be alternately arranged, that is, no two white filter units and no two color filter units are adjacent in any row or column, so that crosstalk is relatively evenly averaged. Moreover, because the white filter units and the color filter units are distributed uniformly, a high color spatial sampling rate can be achieved, which facilitates the restoration of subsequent images.
In one possible implementation, the color filter units include 2 red filter units, 2 blue filter units, and 4 green filter units.
In one possible implementation, the filter units on one diagonal of each filter unit group are white filter units, and the filter units on the other diagonal are 2 red filter units and 2 blue filter units.
In a possible implementation, the 2 red filter units on the other diagonal share a common vertex, and the 2 blue filter units also share a common vertex.
In one possible implementation, the white filter units are located in the first row and first column, the first row and third column, the second row and second column, the second row and fourth column, the third row and first column, the third row and third column, the fourth row and second column, and the fourth row and fourth column of each filter unit group; the red filter units are located in the third row and second column and in the fourth row and first column; the blue filter units are located in the first row and fourth column and in the second row and third column; and the green filter units are located in the first row and second column, the second row and first column, the third row and fourth column, and the fourth row and third column.
In a possible implementation, the 2 red filter units on the other diagonal share a common vertex and the 2 blue filter units are arranged separately; or the 2 blue filter units on the other diagonal share a common vertex and the 2 red filter units are arranged separately.
In one possible implementation, the white filter units are located in the first row and first column, the first row and third column, the second row and second column, the second row and fourth column, the third row and first column, the third row and third column, the fourth row and second column, and the fourth row and fourth column of each filter unit group; the red filter units are located in the first row and fourth column and in the fourth row and first column; the blue filter units are located in the second row and third column and in the third row and second column; and the green filter units are located in the first row and second column, the second row and first column, the third row and fourth column, and the fourth row and third column.
In one possible implementation, the white filter units are located in the first row and first column, the first row and third column, the second row and second column, the second row and fourth column, the third row and first column, the third row and third column, the fourth row and second column, and the fourth row and fourth column of each filter unit group; the red filter units are located in the second row and third column and in the third row and second column; the blue filter units are located in the first row and fourth column and in the fourth row and first column; and the green filter units are located in the first row and second column, the second row and first column, the third row and fourth column, and the fourth row and third column.
In one possible implementation, the red filter units and the blue filter units on the other diagonal are alternately arranged.
In one possible implementation, the white filter units are located in the first row and first column, the first row and third column, the second row and second column, the second row and fourth column, the third row and first column, the third row and third column, the fourth row and second column, and the fourth row and fourth column of each filter unit group; the red filter units are located in the second row and third column and in the fourth row and first column; the blue filter units are located in the first row and fourth column and in the third row and second column; and the green filter units are located in the first row and second column, the second row and first column, the third row and fourth column, and the fourth row and third column.
In a possible implementation manner, the filter units on one diagonal line in each filter unit group are white filter units, and the filter units on the other diagonal line are green filter units.
In one possible implementation, the white filter units are located in the first row and first column, the first row and third column, the second row and second column, the second row and fourth column, the third row and first column, the third row and third column, the fourth row and second column, and the fourth row and fourth column of each filter unit group; the red filter units are located in the second row and first column and in the fourth row and third column; the blue filter units are located in the first row and second column and in the third row and fourth column; and the green filter units are located in the first row and fourth column, the second row and third column, the third row and second column, and the fourth row and first column.
In one possible implementation, the image sensor further includes a processor configured to: determine light intensity information according to the optical signals sensed by the pixel unit array; and determine, according to the light intensity information, a generation mode in which the pixel values of the color pixel units and/or the pixel values of the white pixel units generate a target image of the photographed object.
In one possible implementation, the image sensor further includes a processor configured to: determine the light intensity according to the optical signals sensed by the pixel unit array; and, when the light intensity is greater than or equal to a first preset threshold, generate the target image using the pixel values of the color pixel units.
In one possible implementation, the processor is further configured to: when the light intensity is less than the first preset threshold and greater than or equal to a second preset threshold, determine texture information according to the pixel values of the white pixel units, determine the color information and pixel value at the position of each white pixel unit according to the pixel value of the white pixel unit and the colors of the surrounding pixel units, and generate the target image from the color information and pixel values at the positions of the white pixel units together with the pixel values of the color pixel units.
In one possible implementation, the processor is further configured to: when the light intensity is less than the second preset threshold and greater than or equal to a third preset threshold, generate first image data according to the pixel values of the white pixel units, generate second image data according to the pixel values of the color pixel units, and correct the first image data with a correction coefficient alpha to obtain corrected first image data; and fuse the corrected first image data with the second image data to obtain the target image, where the correction coefficient alpha is determined according to the light intensity and 0 < alpha < 1.
In one possible implementation, the processor is further configured to: when the light intensity is less than the third preset threshold, generate first image data according to the pixel values of the white pixel units, generate second image data according to the pixel values of the color pixel units, and fuse the first image data with the second image data to obtain the target image.
The fusion mode can ensure that the generated target image can be close to a real effect no matter in a strong light environment or a weak light environment.
In one possible implementation, the image sensor further includes a microlens array comprising a plurality of microlenses. The microlens array is located above the filter unit array and is configured to converge the optical signals returned by the photographed object onto the filter unit array, where one microlens of the plurality of microlenses corresponds to at least one filter unit in the filter unit array.
In one possible implementation, the microlenses in the microlens array correspond one to one with the filter units in the filter unit array.
In a second aspect, an electronic device is provided, comprising: the image sensor of the first aspect or any one of the possible implementations of the first aspect.
Drawings
Fig. 1 is a schematic block diagram of an image processing apparatus provided in an embodiment of the present application.
Fig. 2 is a schematic diagram of the color distribution of a conventional set of filter cells.
Fig. 3 is a schematic top view of another image sensor provided in an embodiment of the present application.
Fig. 4 is a schematic cross-sectional view of the image sensor of fig. 3 along A-A'.
Fig. 5 is a schematic illustration of images taken under two different filter configurations.
Fig. 6 to fig. 11 are schematic diagrams illustrating arrangement of filter units in a filter unit group according to an embodiment of the present application.
Fig. 12-13 are schematic structural diagrams of an image sensor provided in an embodiment of the present application.
Fig. 14-18 are schematic flow charts of a fusion process provided by an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described below with reference to the accompanying drawings.
An image processing apparatus uses the photoelectric conversion function of a pixel array to convert an optical image of a photographic subject into an electrical signal proportional to the optical image, thereby obtaining an image of the subject. Fig. 1 shows a schematic block diagram of an image processing apparatus 100. The image processing apparatus 100 may be any electronic device, for example a mobile phone; alternatively, the image processing apparatus 100 may be a part of an electronic device, for example a camera module in the electronic device. The embodiment of the present application is not limited thereto.
As shown in fig. 1, the image processing apparatus 100 generally includes a pixel array 101 (which may also be referred to as a photoelectric conversion unit 101 or an image sensor 101), a signal reading circuit 102, a memory 103, a controller 104, an image processor 105, an output interface 106, and a power supply 107. The electrical-signal output of the pixel array 101 is connected to the input of the signal reading circuit 102, the control end of the pixel array 101 is connected to the input of the controller 104, the output of the signal reading circuit 102 is connected to the inputs of the memory 103 and the controller 104, the output of the controller 104 is connected to the input of the image processor 105, the output of the image processor 105 is connected to the input of the output interface 106, and the power supply 107 supplies power to the above modules.
The pixel array 101 may use either of two different semiconductor structures, CCD or CMOS, to capture light and perform photoelectric conversion. The pixel array 101 may be configured to collect the optical signal returned by the imaged object, convert the optical signal into an electrical signal, and reflect the optical image of the imaged object in the intensity of the electrical signal. The signal reading circuit 102 is used to read the electrical signal output by each pixel and may be an A/D converter implementing analog-to-digital conversion. The memory 103 may be an internal memory for directly exchanging data, for example a Random Access Memory (RAM) for storing required data. The controller 104 may be a Complex Programmable Logic Device (CPLD) capable of satisfying the logic operation and timing control of the sensor; it is configured to output control signals that make the pixels in the pixel array work in cooperation. The image processor 105 is used to pre-process the read-out data and may perform different algorithmic processing for different filter patterns. The output interface 106 serves as the external data-interaction interface for transmitting image data to the outside.
The core component of the image processing apparatus 100 is the pixel array 101. The photosensitive structures in the pixel array 101 are all similar; generally, each pixel structure may include a lens (or microlens), a filter (or color filter), and a photosensitive element (or pixel). The lens is positioned above the filter, and the filter is positioned above the photosensitive element. Light returned by the subject is focused by the lens, exits from the lens exit area, is filtered by the filter, and is incident on a photosensitive element such as a photodiode (PD), which converts the optical signal into an electrical signal. According to the types of light transmitted by the different filters, the pixels may include red pixels (hereinafter, R pixels), green pixels (hereinafter, G pixels), and blue pixels (hereinafter, B pixels). An R pixel receives the red light signal passed by its filter; the principles of the G pixel and the B pixel are the same and are not repeated here.
The principle by which the image processing apparatus 100 generates color image data is as follows: each pixel in the pixel array can convert only one type of optical signal into an electrical signal, so an interpolation operation is performed that combines the optical signals acquired by the other types of pixels around it to restore the image color of the area sampled by the current pixel. This is called demosaicing and is usually completed in a processor. For example, if the current pixel is an R pixel, it can convert only the red light signal into an electrical signal, so the intensities of blue and green light at the current pixel are restored from the electrical signals collected by the surrounding B and G pixels, and the image color of the current pixel is thereby determined.
Therefore, in order to collect a color image, a color filter with a specific color arrangement, also referred to as a Color Filter Array (CFA), needs to be disposed above the photosensitive element array. Currently, for most photosensitive arrays, such as CCD and CMOS image sensors, the CFA uses a Bayer format based on the three primary colors RGB. The Bayer pattern is characterized by a basic unit of a 2 × 2 four-pixel array including 1 red pixel R, 1 blue pixel B, and 2 green pixels G, where the two green pixels G are disposed adjacently at a common vertex, as shown in fig. 2. Since any pixel can obtain a signal of only one of the RGB colors, complete color information must be restored by a specific image processing algorithm.
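To make the interpolation concrete, the following is a minimal bilinear-demosaicing sketch in Python with NumPy. It assumes the 2 × 2 Bayer unit just described; the function names and the zero-padded 3 × 3 averaging are illustrative choices, not the patent's algorithm.

```python
import numpy as np

def box3_sum(a):
    # Sum of each 3x3 neighborhood, zero-padded at the borders.
    p = np.pad(a, 1)
    h, w = a.shape
    return sum(p[i:i + h, j:j + w] for i in range(3) for j in range(3))

def bilinear_demosaic(raw, cfa):
    # raw: HxW mosaic readings; cfa: HxW array of 'R'/'G'/'B' labels.
    rgb = np.zeros(raw.shape + (3,))
    for ch, name in enumerate("RGB"):
        mask = cfa == name
        # A missing sample is estimated as the mean of nearby same-color samples.
        rgb[..., ch] = box3_sum(raw * mask) / np.maximum(box3_sum(mask.astype(float)), 1e-9)
        rgb[..., ch][mask] = raw[mask]  # measured values are kept as-is
    return rgb

cfa = np.tile(np.array([["R", "G"], ["G", "B"]]), (2, 2))  # Bayer basic unit tiled
raw = np.random.default_rng(0).random((4, 4))              # stand-in sensor readings
print(bilinear_demosaic(raw, cfa).shape)                   # (4, 4, 3)
```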
In such a pure RGB bayer layout, each pixel allows only light of a specific color to pass, i.e., most of the photons are cut off, and thus the image may not be accurately restored in a low-light environment.
In addition, imaging devices are becoming more miniaturized while integrating ever more pixels, and imaging equipment with high-density pixels is increasingly important for capturing high-resolution images. Thus, as the number of pixels of the image sensor increases while the size of the image sensor shrinks, the photosensitive area of each pixel is reduced, which further weakens the optical signal sensed by each pixel, so that the image sensor cannot accurately restore the image.
Optionally, the image processor 105 may include, but is not limited to, an Image Signal Processor (ISP) for performing linearization, dead-pixel removal, noise removal, color correction, demosaicing, Automatic Exposure Control (AEC), Automatic Gain Control (AGC), Automatic White Balance (AWB), and the like on the digital image.
In the above-described image processing apparatus 100 employing the Bayer-format CFA, a red pixel unit can receive only the red light signal, a green pixel unit only the green light signal, and a blue pixel unit only the blue light signal, so the intensity of the optical signal received by each pixel unit is small. This lowers the signal-to-noise ratio (SNR) of the image and thereby degrades image quality.
In addition, in an image sensor with a Bayer CFA, the high-frequency luminance information and the chrominance information in an image easily overlap, causing color aliasing, and color moire is likely to appear.
Based on this, the embodiment of the application provides an image sensor, which is beneficial to accurately restoring an image under a low-light environment.
Fig. 3 is a schematic top view of an image sensor 200 according to an embodiment of the present disclosure, and fig. 4 is a schematic cross-sectional view of the image sensor 200 along the direction A-A'.
As shown in fig. 3 and 4, the image sensor 200 includes a filter cell array 210 and a pixel cell array 220. The filter cell array 210 may include a plurality of filter cell groups 211, and each filter cell group 211 may include a white filter cell and a color filter cell. The pixel unit array 220 may be disposed below the filter unit array 210, and the white pixel units in the pixel unit array 220 are configured to receive the optical signals passing through the corresponding white filter units, and the color pixel units are configured to receive the optical signals filtered by the corresponding color filter units.
In the image sensor of the embodiment of the present application, white (W) filter units are added to the CFA: the color pixel units in the pixel unit array receive color light signals, while the white pixel units receive white light signals, so the overall optical signal intensity received by the pixel unit array is increased. In a low-light environment, this scheme can therefore accurately restore the image of the object.
However, once white filter units are added to the CFA, if the pixel values of the white pixel units are directly fused into the color pixel units, then when the light is sufficient the excess brightness information makes the final image too bright, so that it deviates from the real effect.
As shown in fig. 5, when the light is sufficient, the image generated with the RGGB structure, shown in (a), is relatively close to the real effect; the image generated with the RGBW structure, shown in (b), is too bright and deviates from the real effect, because the brightness information of the W pixel units has been fused into it.
Based on this, the embodiment of the application also provides an image sensor, which can generate a target image by using different fusion modes under different ambient light conditions, so that the finally generated target image is closer to a real effect.
The image sensor includes a filter unit array and a pixel unit array. The filter unit array includes a plurality of filter unit groups, each of which includes white filter units and color filter units. The pixel unit array is located below the filter unit array; the white pixel units in the pixel unit array are configured to receive the optical signals passing through their corresponding white filter units, the color pixel units are configured to receive the optical signals passing through their corresponding color filter units, and the light intensity information of the optical signals sensed by the pixel unit array is used to determine the generation mode in which the pixel values of the color pixel units and/or the pixel values of the white pixel units generate the target image of the photographed object.
The white pixel unit is a pixel unit corresponding to the white filter unit, and the color pixel unit is a pixel unit corresponding to the color filter unit.
It should be noted that, in the present application, the white filter unit refers to a filter or a filter material for transmitting white light, and in some embodiments, the white filter unit may also be a transparent material or an air gap for transmitting all optical signals including white light in the environment. In particular, the white light may be a mixture of colored lights. For example, light of three primary colors in the spectrum: blue, red and green, mixed in a certain proportion to obtain white light, or the mixture of all visible light in the spectrum is also white light.
The color filter units in the embodiments of the present application may include at least one red filter unit, at least one blue filter unit, and at least one green filter unit, so that the color integrity of the RGB space can be ensured.
The number of the filter units included in the filter unit group is not specifically limited in the embodiment of the present application. For example, one filtering unit group may include 2 × 2 filtering units, or one filtering unit group may include 3 × 3 filtering units, or one filtering unit group may include 4 × 4 filtering units, or one filtering unit group may include 6 × 6 filtering units, or one filtering unit group may include 8 × 8 filtering units, and the like.
The proportion of the white filter units in each filter unit group can be between 25% and 75%.
Preferably, the proportion of white filter units may be 50%, that is, the ratio of white filter units to color filter units in one filter unit group may be 1:1, which improves the light-sensing capability of the whole sensor. The white filter units can then provide sufficient brightness in a low-light environment without causing excessive brightness, so the final image can be close to the real effect.
Second, a 50% proportion of white filter units ensures that the white pixel units have a relatively high spatial sampling rate, which helps a subsequent reconstruction mosaic (Remosaic) algorithm obtain a better high-resolution grayscale image.
In addition, the white filter units and the color filter units may be alternately arranged, that is, no two white filter units and no two color filter units are adjacent in any row or column, so that crosstalk is relatively evenly averaged. Moreover, because the white filter units and the color filter units are distributed uniformly, a high color spatial sampling rate can be achieved, which facilitates the restoration of subsequent images.
The following describes an arrangement of the filter units in the embodiment of the present application, taking an example that each filter unit group includes 4 × 4 filter units.
One filter unit group may include 2 red filter units, 2 blue filter units, and 4 green filter units. Because human eyes are more sensitive to green light than to blue or red light, the number of green filter units is made larger than the numbers of blue and red filter units. This achieves better color restoration, ensures that R, G, and B have relatively even spatial sampling rates, and helps the subsequent Remosaic algorithm obtain a color image.
The filter units on one diagonal of each filter unit group may be white filter units, while the filter units on the other diagonal are 2 red filter units and 2 blue filter units.
As an example, the 2 red filter units may share a common vertex, and the 2 blue filter units may also share a common vertex; that is, the other diagonal contains 2 consecutive red filter units and 2 consecutive blue filter units, as shown in (a) of fig. 6.
In the filter unit group shown in (a) of fig. 6, the white filter units may be located in the first row and first column, the first row and third column, the second row and second column, the second row and fourth column, the third row and first column, the third row and third column, the fourth row and second column, and the fourth row and fourth column of each filter unit group; the red filter units in the third row and second column and the fourth row and first column; the blue filter units in the first row and fourth column and the second row and third column; and the green filter units in the first row and second column, the second row and first column, the third row and fourth column, and the fourth row and third column.
In this structure, filter units of the same color share a common vertex, which simplifies the subsequent fusion process. Taking (a) of fig. 6 as an example, the pixel units corresponding to the 2 green filter units at the upper-left corner may be directly merged into one green pixel unit, the pixel units corresponding to the 2 blue filter units at the upper-right corner into one blue pixel unit, the pixel units corresponding to the 2 red filter units at the lower-left corner into one red pixel unit, and the pixel units corresponding to the 2 green filter units at the lower-right corner into one green pixel unit; the merged pixel units then directly form an RGGB pattern, as illustrated in the sketch below.
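As a concrete illustration, the sketch below encodes the layout of (a) of fig. 6 and performs the 2 × 2 same-color merging described above (positions are 0-indexed; the array literal and helper name are ours, not the patent's).

```python
import numpy as np

# Filter unit group (a) of fig. 6: white on a checkerboard diagonal, with the
# G/B/R/G pairs each sharing a vertex in one corner of the 4x4 group.
GROUP_A = np.array([["W", "G", "W", "B"],
                    ["G", "W", "B", "W"],
                    ["W", "R", "W", "G"],
                    ["R", "W", "G", "W"]])

def merge_same_color(group):
    # group: 4x4 pixel readings captured under GROUP_A. Averaging the two
    # same-color samples in each corner yields one 2x2 Bayer-family cell
    # (G at top-left, B at top-right, R at bottom-left, G at bottom-right).
    pair = lambda a, b: (group[a] + group[b]) / 2
    return np.array([[pair((0, 1), (1, 0)), pair((0, 3), (1, 2))],
                     [pair((2, 1), (3, 0)), pair((2, 3), (3, 2))]])

readings = np.arange(16, dtype=float).reshape(4, 4)  # stand-in pixel values
print(merge_same_color(readings))
```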
As another example, the 2 red filter units on the other diagonal share a common vertex while the 2 blue filter units are arranged separately; or the 2 blue filter units on the other diagonal share a common vertex while the 2 red filter units are arranged separately.
Taking the diagram (b) in fig. 6 as an example, the white filter units are located in the first row and the first column, the first row and the third column, the second row and the second column, the second row and the fourth column, the third row and the first column, the third row and the third column, the fourth row and the second column, and the fourth row and the fourth column in each filter unit group; the red filter unit is positioned in the fourth column of the first row and the first column of the fourth row in each filter unit group; the blue filter units are positioned in the third column of the second row and the second column of the third row in each filter unit group; the green filter units are positioned in the first row, the second column, the second row, the first column, the third row, the fourth column, the fourth row and the third column in each filter unit group.
Taking the diagram (c) in fig. 6 as an example, the white filter units are located in the first row and the first column, the first row and the third column, the second row and the second column, the second row and the fourth column, the third row and the first column, the third row and the third column, the fourth row and the second column, and the fourth row and the fourth column in each filter unit group; the red filter units are positioned in the third column of the second row and the second column of the third row in each filter unit group; the blue filter units are positioned in the fourth column of the first row and the first column of the fourth row in each filter unit group; the green filter units are positioned in the first row, the second column, the second row, the first column, the third row, the fourth column, the fourth row and the third column in each filter unit group.
Although the color filter units of the same color in the filter unit groups shown in (b) and (c) of fig. 6 are not all arranged at opposite corners, arranging the filter unit groups in an array so that they tile the entire surface of the image sensor, as shown in fig. 8, achieves the same effect as the filter unit group shown in (a) of fig. 6: the filter unit group 302 corresponds to the group shown in (a) of fig. 6, the filter unit group 304 to the group shown in (b), and the filter unit group 306 to the group shown in (c). The sketch below verifies this equivalence numerically.
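The tiling equivalence can be checked by tiling layouts (a) and (b) of fig. 6 and confirming that the tiled planes coincide up to a cyclic shift; the specific shift of one row and three columns is an observation of this sketch, not a figure stated in the patent.

```python
import numpy as np

A = np.array([["W", "G", "W", "B"],   # layout (a) of fig. 6
              ["G", "W", "B", "W"],
              ["W", "R", "W", "G"],
              ["R", "W", "G", "W"]])
B = np.array([["W", "G", "W", "R"],   # layout (b) of fig. 6
              ["G", "W", "B", "W"],
              ["W", "B", "W", "G"],
              ["R", "W", "G", "W"]])

tiled_a, tiled_b = np.tile(A, (2, 2)), np.tile(B, (2, 2))
# Tiling layout (b) gives the same plane as tiling layout (a) shifted by (1, 3).
assert np.array_equal(tiled_b, np.roll(tiled_a, (1, 3), axis=(0, 1)))
print("tiled (a) and tiled (b) coincide up to a cyclic shift")
```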
As still another example, the red filter cells and the blue filter cells on the other diagonal line are alternately arranged.
As shown in (d) of fig. 6, the white filter units are located in the first row and the first column, the first row and the third column, the second row and the second column, the second row and the fourth column, the third row and the first column, the third row and the third column, the fourth row and the second column, and the fourth row and the fourth column of each filter unit group; the red filter units are positioned in the third column of the second row and the first column of the fourth row in each filter unit group; the blue filter units are positioned in a fourth column of a first row and a second column of a third row in each filter unit group; the green filter units are positioned in the first row, the second column, the second row, the first column, the third row, the fourth column, the fourth row and the third column in each filter unit group.
Of course, the arrangement of the red filter units and the blue filter units on the other diagonal line is not limited to the above description, for example, the red filter units and the blue filter units on the other diagonal line may also be arranged as shown in fig. 7.
Optionally, the filter units on one diagonal of each filter unit group are white filter units, and the filter units on the other diagonal are green filter units. In this case, the 2 red filter units may be arranged separately, and the 2 blue filter units may also be arranged separately.
For example, as shown in (e) of fig. 6, the white filter units are located in the first row and the first column, the first row and the third column, the second row and the second column, the second row and the fourth column, the third row and the first column, the third row and the third column, the fourth row and the second column, and the fourth row and the fourth column in each filter unit group; the red filter unit is positioned in the first column of the second row and the third column of the fourth row in each filter unit group; the blue filter units are positioned in a first row, a second column, a third row and a fourth column in each filter unit group; the green filter units are positioned in the fourth column of the first row, the third column of the second row, the second column of the third row and the first column of the fourth row in each filter unit group.
As can be seen from fig. 8, the filter unit group 308 in fig. 8 corresponds to the filter unit group shown in fig. 6 (e), that is, the filter unit group shown in fig. 6 (e) can also achieve the effect of the filter unit group shown in fig. 6 (a), (b), and (c).
The filter units may be arranged in a manner as shown in fig. 9, in addition to the manner shown in fig. 6 (e).
In addition, as can be seen from fig. 8, the filter unit group 310 is also an arrangement in which the white filter units lie on a diagonal; there, 3 green filter units are arranged consecutively at opposite corners, and the remaining green filter unit is arranged separately from those 3.
The filter cells may be arranged in the manner shown in fig. 10, in addition to the filter cell group 310.
The filter cells in the filter cell group may be arranged in the manner shown in fig. 11, in addition to the filter cell groups shown in fig. 6 to 10.
As shown in fig. 12 and 13, the image sensor 200 in the embodiment of the present application may further include a microlens array 230, and the microlens array 230 may be disposed above the filter cell array 210, and is configured to converge an optical signal returned by the photographic subject to the filter cell array 210. The microlens array 230 may include a plurality of microlenses, and one of the plurality of microlenses corresponds to at least one filter unit of the filter unit array.
In the embodiment of the present application, the distribution of the microlens array 230 may be set corresponding to the filter unit array 210 located therebelow, for example, each microlens in the microlens array 230 may correspond to one or more of the filter unit arrays 210 located therebelow.
Alternatively, as an embodiment, the microlenses in the microlens array 230 may correspond one to one with the filter units in the filter unit array 210. Specifically, as shown in fig. 12, the microlens array 230 includes a plurality of first microlenses 231, and each first microlens 231 corresponds to one filter unit and hence to one pixel unit.
Optionally, as another embodiment, at least one microlens in the microlens array 230 may correspond to a plurality of filter units in the filter unit array 210. For example, the microlens array 230 may include a plurality of second microlenses 232, each second microlens 232 corresponding to a plurality of filter units, for example 2 × 2 filter units, with each filter unit corresponding to one pixel unit.
For another example, the microlens array 230 may also include at least one first microlens 231 and at least one second microlens 232, where each first microlens 231 corresponds to one filtering unit in the filtering unit array 210, and each second microlens 232 corresponds to a plurality of filtering units in the filtering unit array 210. For example, as shown in fig. 13, the microlens array 230 includes a plurality of first microlenses 231 corresponding to the filter units one by one, and also includes at least one second microlens 232 corresponding to 2 × 2 filter units.
As for the second microlenses 232 corresponding to a plurality of filter units described above, the number of filter units corresponding to the second microlenses 232 can be set according to practical applications, and can be set to any number. For example, the second microlenses 232 may correspond to 3 × 3 filter units or 1 × 2 filter units, and the embodiment of the present application is not limited thereto.
In addition, the plurality of filter units corresponding to the same second microlens 232 may have the same or different colors. For example, as shown in fig. 13, 2 × 2 filter units corresponding to the second microlenses 232 may all be white filter units, where the 2 × 2 white filter units may or may not belong to the same filter unit group, for example, 2 × 2 filter units corresponding to the second microlenses 232 may also belong to two or more adjacent filter unit groups, and the embodiment of the present application is not limited thereto.
When a plurality of white filter units are arranged under the same second microlens, the electrical signals converted from the light exiting different exit areas of the second microlens can be used to calculate a phase difference, and the focal length of the image sensor can then be adjusted according to that phase difference; a sketch of such a computation follows.
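A minimal sketch of how such a phase difference could be computed, assuming the readings under the left and right halves of the 2 × 2 microlenses have already been gathered into two grayscale sub-images; the exhaustive SAD search below is a common technique used here for illustration, not the patent's stated method.

```python
import numpy as np

def phase_difference(left, right, max_shift=4):
    # Exhaustive search for the horizontal shift that minimizes the mean
    # absolute difference between the two sub-images; the winning shift
    # approximates the defocus-induced phase difference.
    w = left.shape[1]
    best_shift, best_err = 0, np.inf
    for s in range(-max_shift, max_shift + 1):
        shifted = np.roll(right, s, axis=1)
        err = np.abs(left[:, max_shift:w - max_shift] -
                     shifted[:, max_shift:w - max_shift]).mean()
        if err < best_err:
            best_shift, best_err = s, err
    return best_shift  # sign indicates front/back focus; magnitude drives the lens

rng = np.random.default_rng(1)
scene = rng.random((16, 32))
left, right = scene, np.roll(scene, 2, axis=1)  # synthetic 2-pixel disparity
print(phase_difference(left, right))            # prints -2
```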
The image sensor 200 in the embodiment of the present application may also include other parts. As shown in fig. 12 and 13, a dielectric layer 240 may be further included between the filter unit array 210 and the pixel array 220.
As shown in fig. 12 and 13, the filter cell array 210 may further include a dielectric 215 and a reflective grid 216 around the periphery thereof; the pixel array 220 may include a semiconductor substrate 221 and a photosensitive element 222, wherein the photosensitive element 222 is located in the semiconductor substrate 221 and the photosensitive element 222 may be a PD. Optionally, the pixel array 220 may also include an isolation region 223 between the two photosensitive elements 222.
The fusion mode provided by the embodiment of the present application is described below with reference to fig. 14 to 18.
According to the embodiment of the application, the generation mode of the generated target image can be determined according to the light intensity information of the light signal sensed by the pixel unit array, that is, if the light intensity is different, the generation mode of the generated target image is also different.
Because the signal-to-noise ratio of the optical signal sensed by the white pixel units is relatively high, the optical signal sensed by the white pixel units can preferentially be used as the basis for judging the light intensity.
The image sensor of the embodiment of the application can further include a processor, and the processor can be used for determining light intensity information according to the light signals sensed by the pixel unit array; and determining a generation mode for generating a target image of the shooting object by the pixel value of the color pixel unit and/or the pixel value of the white pixel unit according to the light intensity information.
In the embodiment of the present application, the light intensity may be divided into different ranges, each range corresponding to a different fusion process, and the fused target image is sent to an Image Signal Processor (ISP) for processing, as shown in fig. 14 and summarized in the sketch below.
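The threshold dispatch of fig. 14 can be summarized as the following sketch; the function name and threshold symbols are placeholders, since the patent does not give numeric values.

```python
def choose_flow(intensity, t1, t2, t3):
    """Select the fusion flow for a measured light intensity, with t1 > t2 > t3."""
    if intensity >= t1:
        return 1  # flow 1: color pixels only; W is saturated or near saturation
    if intensity >= t2:
        return 2  # flow 2: W guides texture and color guessing but is not fused
    if intensity >= t3:
        return 3  # flow 3: partial fusion with correction coefficient 0 < alpha < 1
    return 4      # flow 4: full fusion of the W information (alpha = 1)
```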
When the light intensity is greater than or equal to the first preset threshold, indicating that the light signal is saturated or close to saturation, the processor may generate the target image using flow 1.
When the light intensity is too high, the white pixel units are saturated or close to saturation, so their pixel values are not used; the pixel values of the color pixel units can be processed directly by the reconstruction mosaic (Remosaic) algorithm, and RGB data is output. This mode ensures that the generated target image is not too bright in a strong-light environment and is closer to the real effect.
The processor may generate the target image using flow 2 when the light intensity is less than the first preset threshold and greater than or equal to the second preset threshold.
The processor can determine the texture information and carry out color guessing according to the pixel values of the white pixel units, and generate a target image according to the pixel values of the color pixel units. In the process, the white pixel unit does not participate in the fusion process, and is only used for providing the texture information and color guessing guidance of the target image.
For example, the gray scale information of the target image may be generated according to the pixel value of the white pixel unit, and the processor may be configured to determine the texture information of the target image according to the gray scale information.
In addition, when determining the color information at the position of a white pixel unit, color estimation may be performed according to the pixel value of the white pixel unit and the colors of the pixel units around it, which improves the accuracy of the color restoration of the target image. For example, the color information and the pixel value at the position of the white pixel unit may be determined according to the pixel value of the white pixel unit and the colors of the surrounding pixel units.
The processor may generate the target image using flow 3 when the light intensity is less than the second preset threshold and greater than or equal to a third preset threshold.
The processor can generate first image data according to the pixel value of the white pixel unit, generate second image data according to the pixel value of the color pixel unit, and correct the first image data by using the correction coefficient alpha to obtain corrected first image data; and fusing the corrected first image data and the second image data to obtain the target image.
The correction coefficient alpha is determined according to the light intensity, and 0 < alpha < 1. It can be understood that the greater the light intensity, the smaller alpha and the less W-pixel information needs to be fused; the smaller the light intensity, the larger alpha and the more W-pixel information needs to be fused.
When the light intensity is between the second preset threshold and the third preset threshold, the processor fuses only part of the white pixel information, which mitigates the excessive brightness increase introduced when white pixels are fused, so that the brightness of the generated target image is neither too high nor too low and is closer to the real effect.
When the light intensity is less than the third preset threshold, the processor may generate the target image using flow 4.
The processor may generate first image data according to a pixel value of the white pixel unit, generate second image data according to a pixel value of the color pixel unit, and fuse the first image data and the second image data to obtain the target image.
Under the condition that the light intensity is low, the pixel value of the white pixel unit can be completely fused into the target image, so that the brightness of the target image in a low-illumination environment can be ensured, and the image sensor can accurately restore the image in the low-illumination environment.
The image sensor in the embodiment of the present application may further include a light intensity determining module, where the light intensity determining module may be configured to determine an intensity of the light signal sensed by the pixel unit array, so that the processor determines a generation manner of generating the target image.
The fusion process in the embodiments of the present application will be described below with reference to fig. 15 to 18.
Fig. 15 shows the case of light saturation or near saturation, and the fusion process shown in fig. 15 corresponds to flow 1. In this case, it is possible to directly use the pixel values of the color pixel units and generate a target image of the photographic subject by the reconstruction mosaic algorithm.
Fig. 16 shows the case where the light is sufficient; the fusion process shown in fig. 16 corresponds to flow 2. In this case, the W pixel units may be separated from the pixel unit array, and a 4 × 4 full-W pixel array may be obtained by interpolation. The full-W pixel array provides the texture information of the image and performs color guessing, and this texture and color-guessing information can be referenced while the color pixel units generate the target image through the reconstruction mosaic algorithm.
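A minimal sketch of the W-plane separation and interpolation step, assuming the 50% checkerboard W layout of fig. 6; filling each non-W site with the mean of its four direct neighbors is our assumption, since the patent leaves the interpolation algorithm open.

```python
import numpy as np

def full_w_plane(raw, w_mask):
    # Keep measured W samples; estimate each non-W site from the mean of its
    # four direct neighbors, which are all W sites on a checkerboard CFA.
    w = np.where(w_mask, raw, 0.0)
    p, m = np.pad(w, 1), np.pad(w_mask.astype(float), 1)
    neigh = p[:-2, 1:-1] + p[2:, 1:-1] + p[1:-1, :-2] + p[1:-1, 2:]
    count = m[:-2, 1:-1] + m[2:, 1:-1] + m[1:-1, :-2] + m[1:-1, 2:]
    return np.where(w_mask, w, neigh / np.maximum(count, 1.0))

w_mask = (np.add.outer(np.arange(4), np.arange(4)) % 2) == 0  # W checkerboard
raw = np.random.default_rng(2).random((4, 4))
print(full_w_plane(raw, w_mask))
```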
Fig. 17 shows the case where the light is insufficient; the fusion process shown in fig. 17 corresponds to flow 3, in which part of the W pixel information is fused. The specific process is as follows: the white pixel units in the pixel unit array are separated to obtain discrete W pixel units, which are then interpolated by an interpolation algorithm into a 4 × 4 full-W pixel array. The full-W pixel array provides texture information and performs color guessing while the color pixel units generate the second image through the reconstruction mosaic algorithm. The light intensity determination module determines the light intensity information, which is used to determine the correction coefficient α. The full-W pixel array is used to generate the first image, and the pixel values of the full-W pixel array are multiplied by the correction coefficient α to obtain the corrected first image. The corrected first image and the second image are fused to obtain the target image of the photographed object.
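The partial fusion of flow 3 might look like the sketch below. Both the mapping from light intensity to α and the luminance-blend fusion rule are illustrative assumptions: the patent only requires that α decrease as the light intensity increases and that the α-corrected W image be fused with the RGB image.

```python
import numpy as np

def alpha_from_intensity(intensity, t2, t3):
    # Hypothetical monotone mapping: alpha near 0 at t2 (brighter),
    # near 1 at t3 (darker), matching "larger intensity, smaller alpha".
    return float(np.clip((t2 - intensity) / (t2 - t3), 0.0, 1.0))

def fuse(first_w, second_rgb, alpha):
    # Correct the W (first) image by alpha, then blend it into the luminance
    # of the RGB (second) image while leaving the chrominance untouched.
    corrected_w = alpha * first_w
    luma = second_rgb.mean(axis=-1, keepdims=True)
    fused_luma = (1.0 - alpha) * luma + corrected_w[..., None]
    return second_rgb - luma + fused_luma
```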
Fig. 18 shows the case where the light is low; the fusion process shown in fig. 18 corresponds to flow 4, in which all of the W pixel information is fused. The specific process is as follows: the white pixel units in the pixel unit array are separated to obtain discrete W pixel units, which are then interpolated by an interpolation algorithm into a 4 × 4 full-W pixel array. The full-W pixel array provides texture information and performs color guessing while the color pixel units generate the second image through the reconstruction mosaic algorithm. The light intensity determination module determines the correction coefficient α to be 1. The full-W pixel array is used to generate the first image, and the first image and the second image are fused to obtain the target image of the photographed object.
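Under the same illustrative fusion rule, flow 4 is simply the α = 1 case of the sketches above, reusing the hypothetical full_w_plane and fuse helpers and a hypothetical rgb_image array:

```python
# Flow 4: in very low light, fuse the full W image (alpha = 1).
target = fuse(full_w_plane(raw, w_mask), rgb_image, alpha=1.0)
```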
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described systems, apparatuses and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the several embodiments provided in the present application, it should be understood that the disclosed system, apparatus and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the units is only one logical division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present application or portions thereof that substantially contribute to the prior art may be embodied in the form of a software product stored in a storage medium and including instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present application. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes.
The above description covers only specific embodiments of the present application, but the scope of protection of the present application is not limited thereto. Any change or substitution that a person skilled in the art could readily conceive within the technical scope disclosed in the present application shall be covered by the scope of protection of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (20)

1. An image sensor, comprising:
a filter unit array, wherein the filter unit array comprises a plurality of filter unit groups, and each of the plurality of filter unit groups comprises a white filter unit and a color filter unit; and
a pixel unit array located under the filter unit array, wherein pixel units in the pixel unit array correspond one-to-one to filter units in the plurality of filter unit groups, white pixel units in the pixel unit array are configured to receive optical signals passing through the white filter units corresponding to the white pixel units, color pixel units in the pixel unit array are configured to receive optical signals passing through the color filter units corresponding to the color pixel units, and light intensity information of the optical signals sensed by the pixel unit array is used to determine a generation mode in which a target image of a shooting object is generated from pixel values of the color pixel units and pixel values of the white pixel units.
2. The image sensor according to claim 1, wherein each of the plurality of filter unit groups comprises 4 × 4 filter units, the ratio of white filter units to color filter units in each filter unit group is 1:1, and the white filter units and the color filter units are alternately arranged.
3. The image sensor of claim 2, wherein the color filter units comprise 2 red filter units, 2 blue filter units, and 4 green filter units.
4. The image sensor of claim 3, wherein the filter units on one diagonal of each filter unit group are white filter units, and the filter units on the other diagonal are 2 red filter units and 2 blue filter units.
5. The image sensor of claim 4, wherein the 2 red filter units on the other diagonal are arranged so as to share a common vertex, and the 2 blue filter units are arranged so as to share a common vertex.
6. The image sensor according to claim 4 or 5, wherein the white filter units are located in the first row and first column, the first row and third column, the second row and second column, the second row and fourth column, the third row and first column, the third row and third column, the fourth row and second column, and the fourth row and fourth column of each filter unit group;
the red filter units are located in the third row and second column, and the fourth row and first column, of each filter unit group;
the blue filter units are located in the first row and fourth column, and the second row and third column, of each filter unit group; and
the green filter units are located in the first row and second column, the second row and first column, the third row and fourth column, and the fourth row and third column of each filter unit group.
7. The image sensor of claim 4, wherein the 2 red filter units on the other diagonal are arranged so as to share a common vertex while the 2 blue filter units are arranged separately; or
the 2 blue filter units on the other diagonal are arranged so as to share a common vertex while the 2 red filter units are arranged separately.
8. The image sensor according to claim 4 or 7, wherein the white filter units are located in the first row and first column, the first row and third column, the second row and second column, the second row and fourth column, the third row and first column, the third row and third column, the fourth row and second column, and the fourth row and fourth column of each filter unit group;
the red filter units are located in the first row and fourth column, and the fourth row and first column, of each filter unit group;
the blue filter units are located in the second row and third column, and the third row and second column, of each filter unit group; and
the green filter units are located in the first row and second column, the second row and first column, the third row and fourth column, and the fourth row and third column of each filter unit group.
9. The image sensor according to claim 4 or 7, wherein the white filter units are located in the first row and first column, the first row and third column, the second row and second column, the second row and fourth column, the third row and first column, the third row and third column, the fourth row and second column, and the fourth row and fourth column of each filter unit group;
the red filter units are located in the second row and third column, and the third row and second column, of each filter unit group;
the blue filter units are located in the first row and fourth column, and the fourth row and first column, of each filter unit group; and
the green filter units are located in the first row and second column, the second row and first column, the third row and fourth column, and the fourth row and third column of each filter unit group.
10. The image sensor of claim 4, wherein the red filter units and the blue filter units on the other diagonal are alternately arranged.
11. The image sensor according to claim 4 or 10, wherein the white filter units are located in the first row and first column, the first row and third column, the second row and second column, the second row and fourth column, the third row and first column, the third row and third column, the fourth row and second column, and the fourth row and fourth column of each filter unit group;
the red filter units are located in the second row and third column, and the fourth row and first column, of each filter unit group;
the blue filter units are located in the first row and fourth column, and the third row and second column, of each filter unit group; and
the green filter units are located in the first row and second column, the second row and first column, the third row and fourth column, and the fourth row and third column of each filter unit group.
12. The image sensor of claim 3, wherein the filter units on one diagonal of each filter unit group are white filter units, and the filter units on the other diagonal are green filter units.
13. The image sensor of claim 12, wherein the white filter units are located in the first row and first column, the first row and third column, the second row and second column, the second row and fourth column, the third row and first column, the third row and third column, the fourth row and second column, and the fourth row and fourth column of each filter unit group;
the red filter units are located in the second row and first column, and the fourth row and third column, of each filter unit group;
the blue filter units are located in the first row and second column, and the third row and fourth column, of each filter unit group; and
the green filter units are located in the first row and fourth column, the second row and third column, the third row and second column, and the fourth row and first column of each filter unit group.
14. The image sensor of any one of claims 1 to 13, further comprising a processor configured to:
determine light intensity according to the optical signals sensed by the pixel unit array; and
when the light intensity is greater than or equal to a first preset threshold, generate the target image using the pixel values of the color pixel units.
15. The image sensor of any one of claims 1 to 13, further comprising a processor configured to:
when the light intensity is less than the first preset threshold and greater than or equal to a second preset threshold: determine texture information according to the pixel values of the white pixel units; determine color information and pixel values at the positions of the white pixel units according to the pixel values of the white pixel units and the colors of the pixel units surrounding the white pixel units; and generate the target image according to the color information and the pixel values at the positions of the white pixel units, together with the pixel values of the color pixel units.
16. The image sensor of any one of claims 1 to 13, further comprising a processor configured to:
when the light intensity is less than the second preset threshold and greater than or equal to a third preset threshold: generate first image data according to the pixel values of the white pixel units; generate second image data according to the pixel values of the color pixel units; correct the first image data using a correction coefficient α to obtain corrected first image data; and fuse the corrected first image data with the second image data to obtain the target image, wherein the correction coefficient α is determined according to the light intensity and 0 < α < 1.
17. The image sensor of any one of claims 1 to 13, further comprising a processor configured to:
when the light intensity is less than the third preset threshold: generate first image data according to the pixel values of the white pixel units; generate second image data according to the pixel values of the color pixel units; and fuse the first image data with the second image data to obtain the target image.
18. The image sensor of any one of claims 1 to 17, further comprising:
a microlens array comprising a plurality of microlenses, located above the filter unit array and configured to converge the optical signals returned from the shooting object onto the filter unit array, wherein one microlens of the plurality of microlenses corresponds to at least one filter unit in the filter unit array.
19. The image sensor of claim 18, wherein the microlenses in the microlens array correspond one-to-one to the filter units in the filter unit array.
20. An electronic device, comprising:
the image sensor of any one of claims 1 to 19.
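(Editorial illustration; not part of the claims.) The 4 × 4 group recited in claims 2 to 6 can be written out as a small table and the recited properties checked mechanically; the W/R/G/B letter coding is ours.

from collections import Counter

# Claim-6 layout, rows top to bottom, columns left to right.
CLAIM6_GROUP = [
    ["W", "G", "W", "B"],
    ["G", "W", "B", "W"],
    ["W", "R", "W", "G"],
    ["R", "W", "G", "W"],
]

counts = Counter(c for row in CLAIM6_GROUP for c in row)
assert counts == Counter({"W": 8, "G": 4, "R": 2, "B": 2})   # claims 2-3: 1:1 W to color
assert all(CLAIM6_GROUP[i][i] == "W" for i in range(4))      # claim 4: all-W diagonal
assert [CLAIM6_GROUP[i][3 - i] for i in range(4)] == ["B", "B", "R", "R"]  # claims 4-5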
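(Editorial illustration; not part of the claims.) Claims 14 to 17 together describe a four-way dispatch on measured light intensity. A minimal sketch of that decision logic follows; the thresholds t1 > t2 > t3 and the linear α schedule are placeholder assumptions, since the claims leave both open.

def generation_mode(intensity, t1, t2, t3):
    # Select the target-image generation mode per claims 14-17.
    # Assumes t1 > t2 > t3; concrete values are design choices.
    if intensity >= t1:
        return "color-only"            # claim 14: color pixel values alone
    if intensity >= t2:
        return "w-guided-demosaic"     # claim 15: W texture guides color
    if intensity >= t3:
        return "alpha-corrected-fuse"  # claim 16: 0 < alpha < 1
    return "full-fuse"                 # claim 17: fuse W and color images

def alpha_for(intensity, t2, t3):
    # One possible schedule for the claim-16 correction coefficient:
    # ramp from 1 at t3 down to 0 at t2 (our assumption, not the patent's).
    if intensity <= t3:
        return 1.0
    if intensity >= t2:
        return 0.0
    return (t2 - intensity) / (t2 - t3)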
CN202010724148.5A 2020-05-15 2020-07-24 Image sensor and electronic device Pending CN111629140A (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202010410639 2020-05-15
CN2020104106392 2020-05-15

Publications (1)

Publication Number Publication Date
CN111629140A true CN111629140A (en) 2020-09-04

Family

ID=72202804

Family Applications (11)

Application Number Title Priority Date Filing Date
CN202021297708.5U Active CN212435793U (en) 2020-05-15 2020-07-03 Image sensor and electronic device
CN202021303789.5U Active CN212752379U (en) 2020-05-15 2020-07-03 Image sensor and electronic device
CN202010637147.7A Pending CN111756974A (en) 2020-05-15 2020-07-03 Image sensor and electronic device
CN202010635332.2A Pending CN111756972A (en) 2020-05-15 2020-07-03 Image sensor and electronic device
CN202021297709.XU Active CN212435794U (en) 2020-05-15 2020-07-03 Image sensor and electronic device
CN202010636571.XA Pending CN111756973A (en) 2020-05-15 2020-07-03 Image sensor and electronic device
CN202010708333.5A Active CN111614886B (en) 2020-05-15 2020-07-22 Image sensor and electronic device
CN202021510460.6U Active CN212752389U (en) 2020-05-15 2020-07-24 Image sensor and electronic device
CN202021508422.7U Active CN212785522U (en) 2020-05-15 2020-07-24 Image sensor and electronic device
CN202010724148.5A Pending CN111629140A (en) 2020-05-15 2020-07-24 Image sensor and electronic device
CN202010724146.6A Pending CN111654615A (en) 2020-05-15 2020-07-24 Image sensor and electronic device


Country Status (2)

Country Link
CN (11) CN212435793U (en)
WO (1) WO2021227250A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023082766A1 (en) * 2021-11-12 2023-05-19 Oppo广东移动通信有限公司 Image sensor, camera module, electronic device, and image generation method and apparatus

Families Citing this family (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112235494B (en) * 2020-10-15 2022-05-20 Oppo广东移动通信有限公司 Image sensor, control method, imaging device, terminal, and readable storage medium
CN112312097B (en) * 2020-10-29 2023-01-24 维沃移动通信有限公司 Sensor with a sensor element
CN112822466A (en) * 2020-12-28 2021-05-18 维沃移动通信有限公司 Image sensor, camera module and electronic equipment
CN113037980A (en) * 2021-03-23 2021-06-25 北京灵汐科技有限公司 Pixel sensing array and vision sensor
CN115225832A (en) * 2021-04-21 2022-10-21 海信集团控股股份有限公司 Image acquisition equipment, image encryption processing method, equipment and medium
CN113540138B (en) * 2021-06-03 2024-03-12 奥比中光科技集团股份有限公司 Multispectral image sensor and imaging module thereof
CN113676651B (en) * 2021-08-25 2023-05-26 维沃移动通信有限公司 Image sensor, control method, control device, electronic apparatus, and storage medium
CN113676652B (en) * 2021-08-25 2023-05-26 维沃移动通信有限公司 Image sensor, control method, control device, electronic apparatus, and storage medium
CN113852797A (en) * 2021-09-24 2021-12-28 昆山丘钛微电子科技股份有限公司 Color filter array, image sensor and camera module
CN114125240A (en) * 2021-11-30 2022-03-01 维沃移动通信有限公司 Image sensor, camera module, electronic equipment and shooting method
CN114363486A (en) * 2021-12-14 2022-04-15 Oppo广东移动通信有限公司 Image sensor, camera module, electronic equipment, image generation method and device
CN114157795A (en) * 2021-12-14 2022-03-08 Oppo广东移动通信有限公司 Image sensor, camera module, electronic equipment, image generation method and device
CN114823985B (en) * 2022-05-31 2024-04-09 深圳市聚飞光电股份有限公司 Photoelectric sensor and packaging method thereof
CN115696078B (en) * 2022-08-01 2023-09-01 荣耀终端有限公司 Color filter array, image sensor, camera module and electronic equipment

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9479745B2 (en) * 2014-09-19 2016-10-25 Omnivision Technologies, Inc. Color filter array with reference pixel to reduce spectral crosstalk
TWI552594B (en) * 2014-10-27 2016-10-01 聯詠科技股份有限公司 Color filter array for image sensing device and manufacturing method thereof
CN105282529B (en) * 2015-10-22 2018-01-16 浙江宇视科技有限公司 A kind of digital wide dynamic approach and device based on RAW spaces
CN105430359B (en) * 2015-12-18 2018-07-10 广东欧珀移动通信有限公司 Imaging method, imaging sensor, imaging device and electronic device
CN105516697B (en) * 2015-12-18 2018-04-17 广东欧珀移动通信有限公司 Imaging sensor, imaging device, mobile terminal and imaging method
CN105578078B (en) * 2015-12-18 2018-01-19 广东欧珀移动通信有限公司 Imaging sensor, imaging device, mobile terminal and imaging method
CN105578071B (en) * 2015-12-18 2018-03-20 广东欧珀移动通信有限公司 Imaging method, imaging device and the electronic installation of imaging sensor
WO2017101864A1 (en) * 2015-12-18 2017-06-22 广东欧珀移动通信有限公司 Image sensor, control method, and electronic device
CN105516700B (en) * 2015-12-18 2018-01-19 广东欧珀移动通信有限公司 Imaging method, imaging device and the electronic installation of imaging sensor
CN107105140B (en) * 2017-04-28 2020-01-24 Oppo广东移动通信有限公司 Dual-core focusing image sensor, focusing control method thereof and imaging device


Also Published As

Publication number Publication date
CN111614886A (en) 2020-09-01
CN111756973A (en) 2020-10-09
CN212785522U (en) 2021-03-23
CN212752389U (en) 2021-03-19
CN111756974A (en) 2020-10-09
CN111654615A (en) 2020-09-11
CN212435793U (en) 2021-01-29
CN212435794U (en) 2021-01-29
CN111614886B (en) 2021-10-19
CN212752379U (en) 2021-03-19
WO2021227250A1 (en) 2021-11-18
CN111756972A (en) 2020-10-09

Similar Documents

Publication Publication Date Title
CN212785522U (en) Image sensor and electronic device
JP4971323B2 (en) Color and panchromatic pixel processing
JP4421793B2 (en) Digital camera
JP5345944B2 (en) Low resolution image generation
JP5462345B2 (en) Image sensor with improved light sensitivity
US8587681B2 (en) Extended depth of field for image sensor
JP5149279B2 (en) Image sensor with improved light sensitivity
CN102339839B (en) CMOS image sensor with improved photodiode area allocation
US6924841B2 (en) System and method for capturing color images that extends the dynamic range of an image sensor using first and second groups of pixels
US7812869B2 (en) Configurable pixel array system and method
KR100827238B1 (en) Apparatus and method for supporting high quality image
US20090021612A1 (en) Multiple component readout of image sensor
JP2009524988A (en) Image sensor with improved light sensitivity
JP2009506646A (en) Image sensor with improved light sensitivity
JP2008005488A (en) Camera module
WO2009025825A1 (en) Image sensor having a color filter array with panchromatic checkerboard pattern
US9332199B2 (en) Imaging device, image processing device, and image processing method
US20110115954A1 (en) Sparse color pixel array with pixel substitutes
CN210143059U (en) Image sensor integrated circuit, image sensor, and imaging system
CN113141475A (en) Imaging system and pixel merging method
CN113132692B (en) Image sensor, imaging device, electronic apparatus, image processing system, and signal processing method
JP2009303020A (en) Image capturing apparatus and defective pixel correcting method
CN114679551A (en) Solid-state imaging device, signal processing method for solid-state imaging device, and electronic apparatus
CN117751576A (en) Demosaicing-free pixel array, image sensor, electronic device and operation method thereof
JP2011199426A (en) Solid-state image pickup element, image pickup device, and smear correction method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination