CN212435794U - Image sensor and electronic device - Google Patents

Image sensor and electronic device

Publication number: CN212435794U
Application number: CN202021297709.XU
Authority: CN (China)
Original language: Chinese (zh)
Inventors: 程祥, 王迎磊, 张玮
Applicant and current assignee: Shenzhen Goodix Technology Co Ltd
Legal status: Active (granted utility model)


Classifications

    • H04N23/54 — Mounting of pick-up tubes, electronic image sensors, deviation or focusing coils
    • H04N23/55 — Optical parts specially adapted for electronic image sensors; mounting thereof
    • H04N23/80 — Camera processing pipelines; components thereof


Abstract

Provided are an image sensor and an electronic device capable of improving the performance of the image sensor. The image sensor includes: a filter unit array including a plurality of filter unit groups, each of the plurality of filter unit groups including 4 × 4 filter units, where in the 4 × 4 filter units each row and each column includes 2 white filter units and 2 color filter units, and at least one diagonal includes 4 white filter units or 4 color filter units of the same color; and a pixel unit array including a plurality of pixel units, located below the filter unit array, the plurality of pixel units in the pixel unit array corresponding one-to-one to the plurality of filter units in the filter unit array. The scheme of the embodiments of the present application facilitates subsequent image-processing algorithms, aids image color recovery, reduces the loss of image line/detail information, and avoids generating color moiré.

Description

Image sensor and electronic device
The present application claims priority to Chinese patent application No. 202010410639.2, entitled "Image sensor and electronic device," filed with the Chinese Patent Office on May 15, 2020, the entire contents of which are incorporated herein by reference.
Technical Field
The present application relates to the field of sensors, and more particularly, to an image sensor and an electronic device.
Background
An image sensor is an electronic device that converts an optical image into a digital signal, and generally includes a pixel cell array composed of a plurality of pixel cells, each pixel cell in the pixel cell array being used to form one pixel value in the image. In order to enable the image sensor to capture a color image, a Color Filter (CF) may be disposed above the pixel unit so that the pixel unit may receive a light signal of a specific color, forming a pixel value corresponding to the light signal of the specific color.
However, when a color filter is provided, the light received by each pixel unit is reduced, which lowers the signal-to-noise ratio (SNR) of the image and thus affects image quality. Moreover, if the image sensor is applied to a mobile device, its size is limited, and the photosensitive area of the corresponding pixel unit array is limited as well, so image quality is further constrained in low-light environments.
Summary of the Utility Model
The embodiments of the present application provide an image sensor and an electronic device, aiming to solve the problem of reduced image quality in low-illumination environments.
In a first aspect, an image sensor is provided, including: a filter unit array including a plurality of filter unit groups, each of the plurality of filter unit groups including 4 × 4 filter units; in the 4 × 4 filter units, each row and each column includes 2 white filter units and 2 color filter units, and at least one diagonal includes 4 white filter units or 4 color filter units of the same color; and a pixel unit array including a plurality of pixel units, the pixel unit array being located below the filter unit array, and the plurality of pixel units in the pixel unit array corresponding one-to-one to the plurality of filter units in the filter unit array.
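The structural constraints of the first aspect can be checked mechanically. The sketch below is an illustrative aid, not part of the claims; the function name and example pattern are invented, with 'W' marking a white filter unit and other letters marking color filter units:

```python
def is_valid_group(group):
    """Check the claimed 4x4 constraints: each row and each column holds
    exactly 2 white ('W') and 2 color units, and at least one diagonal is
    either 4 white units or 4 color units of the same color."""
    assert len(group) == 4 and all(len(row) == 4 for row in group)
    for i in range(4):
        column = [group[r][i] for r in range(4)]
        if group[i].count("W") != 2 or column.count("W") != 2:
            return False
    main_diag = [group[i][i] for i in range(4)]
    anti_diag = [group[i][3 - i] for i in range(4)]
    return len(set(main_diag)) == 1 or len(set(anti_diag)) == 1

# One arrangement satisfying the constraints (G = first color on a diagonal,
# R/B = second/third colors, W = white):
example = ["GWWR",
           "WGRW",
           "WBGW",
           "BWWG"]
print(is_valid_group(example))  # True
```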
With the scheme of the embodiments of the present application, because in every filter unit group each row and each column includes 2 white filter units and 2 color filter units, and at least one diagonal includes 4 white filter units or 4 color filter units of the same color, subsequent image-processing algorithms are facilitated, image color recovery is aided, the loss of image line/detail information is reduced, and color moiré is avoided.
In some possible embodiments, the filter unit group includes first, second, and third color filter units of different colors, and the number of the first color filter units is equal to the sum of the numbers of the second and third color filter units.
In some possible embodiments, in the filter unit group, the number of the second color filter units is equal to the number of the third color filter units.
In some possible embodiments, in the filter unit group, the color filter units are arranged at diagonal positions, and the white filter units are arranged at off-diagonal positions.
In some possible embodiments, in the set of filter units, 4 filter units on one diagonal are all the first color filter units, and 4 filter units on the other diagonal include 2 second color filter units and 2 third color filter units.
In some possible embodiments, on the other diagonal line, 2 of the second color filter units are disposed adjacent to each other at a common vertex, and 2 of the third color filter units are disposed adjacent to each other at a common vertex.
In some possible embodiments, in the filter unit group, the white filter units are located in the first row second column, first row third column, second row first column, second row fourth column, third row first column, third row fourth column, fourth row second column, and fourth row third column; the second color filter units are located in the first row fourth column and the second row third column; the first color filter units are located in the first row first column, second row second column, third row third column, and fourth row fourth column; and the third color filter units are located in the third row second column and the fourth row first column.
In some possible embodiments, on the other diagonal, the 2 second color filter units and the 2 third color filter units are arranged alternately, spaced apart from each other.
In some possible embodiments, in the filter unit group, the white filter units are located in the first row second column, first row third column, second row first column, second row fourth column, third row first column, third row fourth column, fourth row second column, and fourth row third column; the second color filter units are located in the first row fourth column and the third row second column; the first color filter units are located in the first row first column, second row second column, third row third column, and fourth row fourth column; and the third color filter units are located in the second row third column and the fourth row first column.
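The last two embodiments differ only in how the second and third color units are ordered on the other diagonal. A sketch, using the 1-indexed positions copied from the text (helper names are invented):

```python
def build_group(positions):
    """Place letters into a 4x4 grid from 1-indexed (row, column) lists."""
    g = [[None] * 4 for _ in range(4)]
    for letter, cells in positions.items():
        for r, c in cells:
            g[r - 1][c - 1] = letter
    return g

white = [(1, 2), (1, 3), (2, 1), (2, 4), (3, 1), (3, 4), (4, 2), (4, 3)]
first = [(1, 1), (2, 2), (3, 3), (4, 4)]   # first color (e.g. green), one diagonal

# Embodiment with same-color units adjacent (common vertex) on the other diagonal:
adjacent = build_group({"W": white, "G": first,
                        "R": [(1, 4), (2, 3)], "B": [(3, 2), (4, 1)]})
# Embodiment with second/third color units spaced (alternating) on that diagonal:
spaced = build_group({"W": white, "G": first,
                      "R": [(1, 4), (3, 2)], "B": [(2, 3), (4, 1)]})

anti_diagonal = lambda g: [g[i][3 - i] for i in range(4)]
print(anti_diagonal(adjacent))  # ['R', 'R', 'B', 'B']
print(anti_diagonal(spaced))    # ['R', 'B', 'R', 'B']
```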
In some possible embodiments, in the set of filter units, the white filter units are arranged at diagonal positions, and the color filter units are arranged at off-diagonal positions.
In some possible embodiments, the color filter units in each row and each column in the filter unit group are different in color.
In some possible embodiments, in the filter unit group, color filter units of the same color are disposed adjacent to each other at a common vertex.
In some possible embodiments, in the filter unit group, the white filter units are located in the first row first column, first row fourth column, second row second column, second row third column, third row second column, third row third column, fourth row first column, and fourth row fourth column; the second color filter units are located in the first row third column and the second row fourth column; the first color filter units are located in the first row second column, second row first column, third row fourth column, and fourth row third column; and the third color filter units are located in the third row first column and the fourth row second column.
In some possible embodiments, the set of filter units includes 2 × 2 sets of white filter units and 2 × 2 sets of color filter units, the 2 × 2 sets of white filter units are disposed on a diagonal of the set of filter units, and the 2 × 2 sets of color filter units are disposed on another diagonal of the set of filter units.
In some possible embodiments, each of the 2 × 2 color filter unit sets includes 2 first color filter units arranged at a common vertex, and 1 second color filter unit and 1 third color filter unit arranged at a common vertex.
In some possible embodiments, the relative positions of the first color filter units in the 2 × 2 color filter unit sets are the same.
In some possible embodiments, in the 2 × 2 color filter unit sets, the relative positions of the first color filter units are different, and the relative positions of the third color filter units are the same.
In some possible embodiments, in the filter unit group, the white filter units are located in the first row third column, first row fourth column, second row third column, second row fourth column, third row first column, third row second column, fourth row first column, and fourth row second column; the second color filter units are located in the first row second column and the fourth row third column; the first color filter units are located in the first row first column, second row second column, third row third column, and fourth row fourth column; and the third color filter units are located in the second row first column and the third row fourth column.
In some possible embodiments, the first color filter unit, the second color filter unit, and the third color filter unit are configured to pass light signals of three colors respectively, and the wavelength bands of the light signals of the three colors cover the visible light wavelength band.
In some possible embodiments, the first color filter unit, the second color filter unit, and the third color filter unit are respectively three of the colors red, green, blue, cyan, magenta, and yellow.
In some possible embodiments, the first color filter unit is a green filter unit, the second color filter unit and the third color filter unit are a red filter unit and a blue filter unit, respectively.
In some possible embodiments, the image sensor further includes: a microlens array including a plurality of microlenses, located above the filter unit array, and configured to converge optical signals returned by a photographic subject onto the filter unit array.
In some possible embodiments, the plurality of microlenses in the microlens array correspond to the plurality of filter units in the filter unit array in a one-to-one manner.
In some possible embodiments, the microlens array includes at least one first microlens corresponding to four white filter units in the filter unit array and at least one second microlens corresponding to one color filter unit in the filter unit array.
In some possible embodiments, the pixel values of the white pixel cells in the pixel cell array are used for generating first image data of a photographic subject, the pixel values of the color pixel cells in the pixel cell array are used for generating second image data of the photographic subject, and the first image data and the second image data are used for synthesizing a target image of the photographic subject; the white pixel unit is a pixel unit corresponding to the white filter unit, and the color pixel unit is a pixel unit corresponding to the color filter unit.
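The text does not specify how the first (white) and second (color) image data are synthesized into the target image. One common and minimal approach, shown here purely as an assumption and not as the patent's method, is to keep the chrominance of the color image while taking luminance from the higher-SNR white-pixel image:

```python
import numpy as np

def fuse(first_image, second_rgb):
    """Hypothetical fusion: preserve the chrominance of the color image but
    replace its luminance with the luminance captured by the white pixels.
    first_image: HxW luminance array from the white pixel units
    second_rgb:  HxWx3 demosaiced color image"""
    # Luminance of the color image (Rec. 601 weights)
    y = (0.299 * second_rgb[..., 0] + 0.587 * second_rgb[..., 1]
         + 0.114 * second_rgb[..., 2])
    gain = first_image / (y + 1e-6)       # per-pixel luminance gain
    return second_rgb * gain[..., None]   # scale all channels equally

rgb = np.full((2, 2, 3), 0.5)             # flat gray color image
lum = np.full((2, 2), 0.8)                # brighter white-pixel luminance
fused = fuse(lum, rgb)
```

After fusion, the luminance of `fused` matches `first_image` while the channel ratios (the chrominance) of `second_rgb` are preserved.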
In some possible embodiments, the pixel values of the color pixel cells in the pixel cell array are used to generate an intermediate image through interpolation processing, and the intermediate image is used to generate the second image data in Bayer format through demosaic processing.
In some possible embodiments, of the 2 × 2 pixel values of the intermediate image, 2 pixel values are original pixel values of the color pixel unit, and the other 2 pixel values are pixel values obtained through interpolation processing.
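A minimal sketch of the interpolation step that fills the positions under white filter units with color values. The 3×3 neighbor-averaging kernel and all names here are assumptions; the patent does not specify the interpolation filter:

```python
import numpy as np

def interpolate_intermediate(raw, color_mask):
    """raw: HxW array of sensor values; color_mask: HxW bool array that is
    True where the pixel sits under a color filter unit. Positions under
    white filter units are filled by averaging the color-pixel neighbors
    in a 3x3 window (an assumed kernel)."""
    out = raw.astype(float).copy()
    h, w = raw.shape
    for i in range(h):
        for j in range(w):
            if not color_mask[i, j]:
                vals = [raw[r, c]
                        for r in range(max(0, i - 1), min(h, i + 2))
                        for c in range(max(0, j - 1), min(w, j + 2))
                        if color_mask[r, c]]
                out[i, j] = sum(vals) / len(vals) if vals else 0.0
    return out

raw = np.array([[4.0, 0.0],
                [0.0, 2.0]])
mask = np.array([[True, False],
                 [False, True]])
intermediate = interpolate_intermediate(raw, mask)
print(intermediate)  # originals kept at (0,0) and (1,1); gaps filled with 3.0
```

In this 2 × 2 example, 2 pixel values are the original color pixel values and the other 2 are interpolated, matching the structure described above.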
In some possible embodiments, the first image data and the second image data have the same resolution.
In some possible embodiments, the image sensor is a complementary metal oxide semiconductor CMOS image sensor, or a charge coupled device CCD image sensor.
In a second aspect, an electronic device is provided, comprising: the image sensor of the first aspect or any one of the possible embodiments of the first aspect.
Drawings
Fig. 1 is a schematic structural diagram of an image sensor according to an embodiment of the present disclosure.
Fig. 2 is a schematic top view of another image sensor provided in an embodiment of the present application.
Fig. 3 is a schematic cross-sectional view of the image sensor of fig. 2 along A-A′.
Fig. 4 and 5 are schematic cross-sectional views of two further examples of the image sensor of fig. 2 along A-A′.
Fig. 6 is a schematic arrangement diagram of a filter unit in a filter unit array according to an embodiment of the present application.
Fig. 7 to 14 are schematic arrangement diagrams of filter units in several filter unit groups provided in embodiments of the present application.
Fig. 15 is a schematic flowchart of an image processing method according to an embodiment of the present application.
Fig. 16 is an image schematic diagram of the image processing method of fig. 15.
Detailed Description
The technical solutions in the embodiments of the present application will be described below with reference to the accompanying drawings.
It should be understood that the specific examples are provided herein only to assist those skilled in the art in better understanding the embodiments of the present application and are not intended to limit the scope of the embodiments of the present application.
It should also be understood that the various embodiments described in this specification can be implemented individually or in combination, and the examples in this application are not limited thereto.
The technical solution of the embodiment of the present application may be applied to various image sensors, such as a Complementary Metal Oxide Semiconductor (CMOS) image sensor (CIS) or a Charge Coupled Device (CCD) image sensor, but the embodiment of the present application is not limited thereto.
As a common application scenario, the image sensor provided in the embodiment of the present application may be applied to a smart phone, a camera, a tablet computer, and other mobile terminals or other terminal devices having an imaging function.
Fig. 1 shows a schematic structural diagram of an image sensor. As shown in fig. 1, the image sensor 100 includes: a pixel unit array (pixel array) 110, a row selection circuit 120, a column selection circuit 130, a control circuit 140, an analog-to-digital converter (ADC) circuit 150, a front-end signal processing circuit 160, and a back-end signal processing circuit 170.
Specifically, as shown in fig. 1, a plurality of square pixel units in the pixel unit array 110 are arranged in M rows × N columns, where M and N are positive integers. Generally, the row direction of the M rows and the column direction of the N columns are perpendicular to each other in the plane of the pixel unit array 110. In some cases, for convenience of description, two mutually perpendicular directions in one plane, such as the row direction and the column direction in the present application, may be referred to as the horizontal direction and the vertical direction.
In the pixel cell array 110 shown in fig. 1, any one side of each square pixel cell is parallel or perpendicular to the row direction or the column direction.
Optionally, the pixel unit may include a photodiode, a field effect switching transistor, and other devices for receiving an optical signal and converting the optical signal into a corresponding electrical signal.
Alternatively, if the image sensor needs to capture a color image, a color filter array (CFA) may be disposed above the pixel unit array 110, with one color filter unit disposed above each pixel unit. For ease of description, a pixel unit above which a color filter unit is disposed is also referred to as a color pixel unit; for example, the pixel unit below a red filter unit is referred to as a red pixel unit (denoted by R in fig. 1), the pixel unit below a green filter unit is referred to as a green pixel unit (denoted by G in fig. 1), and the pixel unit below a blue filter unit is referred to as a blue pixel unit (denoted by B in fig. 1).
Currently, most CFAs of image sensors use the Bayer format based on the three primary colors RGB. For example, as shown in fig. 1, a Bayer-format CFA is disposed above the pixel unit array 110, and the pixel unit array 110 uses 2 × 2 pixel units as a basic cell, each basic cell including 1 R pixel unit, 1 B pixel unit, and 2 G pixel units, the 2 G pixel units being disposed adjacent to each other at a common vertex.
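For reference, the Bayer basic cell just described can be written as a 2 × 2 tile, and tiling it produces the full CFA. The phase chosen below (which corner holds R) is one common convention and varies between sensors:

```python
import numpy as np

# The 2x2 Bayer basic cell: 1 R, 1 B, and 2 G pixel units, with the two
# G units diagonally adjacent, sharing a common vertex.
bayer_cell = np.array([["R", "G"],
                       ["G", "B"]])

# Tiling the basic cell yields the full CFA; here a 4x4 excerpt:
cfa = np.tile(bayer_cell, (2, 2))
print(cfa[0])  # ['R' 'G' 'R' 'G']
```

In any tiled region, half of the units are G and one quarter each are R and B, which is the sampling ratio the demosaicing step later has to compensate for.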
The row selection circuit 120 is connected to each row of pixel units in the pixel unit array 110 through M row control lines and may be used to turn each pixel unit in each row on and off. For example, the row selection circuit 120 is connected through a row control line to the gate of the field-effect switching transistor of each pixel unit in the first row of the pixel unit array 110, and controls the operating state of the photodiode by turning the transistor on or off. All M row control lines are parallel to the horizontal direction.
The column selection circuit 130 is connected to each column of pixel units in the pixel unit array 110 through N column control lines and may be configured to select the signal value output of each pixel unit in each column. For example, the column selection circuit 130 is connected through a column control line to the source of the field-effect switching transistor of each pixel unit in the first column of the pixel unit array 110, and controls the output of the electrical signal converted by the photodiode. All N column control lines are parallel to the vertical direction.
The control circuit 140 is connected to the row selection circuit 120 and the column selection circuit 130, and is configured to provide timing for the row selection circuit 120 and the column selection circuit 130, control the row selection circuit 120 and the column selection circuit 130 to select a pixel unit in the pixel unit array 110, and output a pixel value of the pixel unit.
Optionally, after the row selection circuit 120, the column selection circuit 130, and the control circuit 140 cooperate to output the pixel values generated by the pixel unit array 110, these analog pixel values are transmitted to the ADC circuit 150 for analog-to-digital conversion into digital pixel values, forming a digital image so that the subsequent signal processing circuit 160 can conveniently perform image processing and output an optimized color image.
Alternatively, the signal processing circuit 160 may include, but is not limited to, an image signal processor (ISP) for performing linearization, dead-pixel removal, noise removal, color correction, demosaicing, automatic exposure control (AEC), automatic gain control (AGC), automatic white balance (AWB), and the like on the digital image.
In the image sensor 100 employing the Bayer-format CFA described above, a red pixel unit can receive only the red light signal, a green pixel unit only the green light signal, and a blue pixel unit only the blue light signal; the intensity of the light signal received by each pixel unit is therefore small, resulting in a low image SNR and thus degraded image quality.
In addition, in an image sensor with a Bayer CFA, the high-frequency luminance information and the chrominance information in an image are likely to overlap, so color aliasing occurs and color moiré is likely to be produced.
To address the above problems, the present application provides an image sensor in which white filter units are added to the CFA, so that part of the pixel units in the pixel unit array receive color light signals while the rest receive white light signals, increasing the intensity of the light signals received by the latter. On this basis, the pixel values of the pixel units in the array are processed so that, while image color information is preserved, image quality parameters such as SNR and resolution are improved and an optimized color image is obtained.
Fig. 2 is a schematic top view of an image sensor 200 according to an embodiment of the present disclosure, and fig. 3 is a schematic cross-sectional view of the image sensor 200 along the direction A-A′.
As shown in fig. 2 and 3, the image sensor 200 includes:
a filter unit array 210 including a plurality of filter unit groups 211, each of the plurality of filter unit groups 211 including 4 × 4 filter units;
in the 4 × 4 filter units, each row and each column includes 2 white filter units and 2 color filter units, and at least one diagonal includes 4 white filter units or 4 color filter units of the same color; and
a pixel unit array 220, located below the filter unit array 210, including a plurality of pixel units, the plurality of pixel units in the pixel unit array 220 corresponding one-to-one to the plurality of filter units in the filter unit array 210.
In one possible implementation, as shown in fig. 3, a plurality of filter units in the filter unit array 210 may be disposed on an upper surface of a plurality of pixel units in the pixel unit array 220; in another possible implementation, the plurality of filter units in the filter unit array 210 may be disposed above the plurality of pixel units in the pixel unit array 220 in a floating manner.
Further, as shown in fig. 3, as an example, each filter unit in the filter unit array 210 is disposed directly above its corresponding pixel unit in the pixel unit array 220; in other words, the center of each filter unit coincides with the center of its corresponding pixel unit in the vertical direction. Alternatively, each filter unit in the filter unit array 210 may be disposed obliquely above its corresponding pixel unit, in which case each pixel unit in the pixel unit array 220 may receive an optical signal from an oblique direction; the embodiments of the present application do not limit the specific position of the filter unit array 210.
The pixel unit corresponding to a color filter unit in the pixel unit array 220 is configured to receive the color light signal passing through that color filter unit and to output a color pixel value correspondingly; the pixel unit corresponding to a white filter unit is configured to receive the white light signal passing through that white filter unit and to output a white pixel value correspondingly. The color light signals and the white light signals are used together to generate a target image of a photographic subject. For example, the pixel unit corresponding to the red filter unit receives the red light signal, and the pixel value it outputs may be referred to as a red pixel value; the pixel unit corresponding to the white filter unit receives the white light signal, and the pixel value it outputs may be referred to as a white pixel value.
Further, fig. 4 and 5 show schematic cross-sectional views of two other image sensors 200 along the direction A-A′.
As shown in fig. 4 and 5, the image sensor 200 includes, in addition to the filter cell array 210 and the pixel cell array 220 described above:
the microlens array 230, which includes a plurality of microlenses, is disposed above the filter unit array 210, and is used for converging the optical signal returned by the photographic subject to the filter unit array 210 and reducing crosstalk between optical signals of adjacent pixel units.
As shown in fig. 4, the plurality of microlenses in the microlens array 230 correspond one-to-one to the plurality of filter units in the filter unit array 210 and the plurality of pixel units in the pixel unit array 220.
In some embodiments, the pixel structure in the image sensor may be referred to as an on-chip lens (OCL) pixel structure.
As shown in fig. 5, some microlenses in the microlens array 230 (first microlenses) each correspond to four white filter units in the filter unit array 210 and four white pixel units in the pixel unit array 220, while the others (second microlenses) each correspond to one color filter unit in the filter unit array 210 and one color pixel unit in the pixel unit array 220.
Alternatively, the radius of the curved surface of the first microlens may be 2 times the radius of the curved surface of the second microlens.
In some embodiments, the pixel structure in the image sensor may be referred to as a 2 × 2 OCL pixel structure.
Further, as shown in fig. 4 and 5, the image sensor 200 may further include a dielectric layer 240 between the filter unit array 210 and the pixel unit array 220 for connecting the pixel unit array 220 and the filter unit array 210.
In addition, the filter unit array 210 may further include a dielectric 215 and a reflective grid 216 located at the periphery; the reflective grid reflects optical signals incident at large angles, preventing them from being lost.
The pixel unit array 220 may include a semiconductor substrate 221 and a photosensitive element 222, wherein the photosensitive element 222 is located in the semiconductor substrate 221, and the photosensitive element 222 includes, but is not limited to, a Photodiode (PD). Optionally, the pixel unit array 220 may further include an isolation region 223 between two photosensitive elements 222 to prevent electrical signal interference between two adjacent photosensitive elements.
It is understood that the image sensor 200 may include other stacked structures besides the basic structures shown in fig. 4 and 5, such as at least one metal interconnection layer to electrically connect a plurality of pixel units in a pixel unit array, and the like, and the structure of the image sensor is not limited thereto in the embodiments of the present application.
Alternatively, the top view shown in fig. 2 is also a schematic arrangement diagram of the filter unit array 210 according to the embodiment of the present application.
As shown in fig. 2, in the embodiment of the present application, each filter unit in one filter unit group 211 is a quadrilateral filter unit; for example, the filter units may be square, with 16 square filter units forming a square filter unit group. In the filter unit group 211, the number of color filter units (hatched blocks in the figure) equals the number of white filter units (blank blocks in the figure): 8 of each. In the plane of the image sensor, one filter unit group includes both white filter units and color filter units along the horizontal (row) and vertical (column) directions, 4 color filter units of the same color along the +45° direction (first diagonal direction), and color filter units of other colors along the −45° direction (second diagonal direction).
It should be noted that, in the present application, the white filter unit refers to a filter or a filter material for transmitting white light, and in some embodiments, the white filter unit may also be a transparent material or an air gap for transmitting all optical signals including white light in the environment. In particular, the white light may be a mixture of colored lights. For example, light of three primary colors in the spectrum: blue, red and green, mixed in a certain proportion to obtain white light, or the mixture of all visible light in the spectrum is also white light.
Correspondingly, the color filter unit refers to a filter or a filter material for transmitting color light. Specifically, the colored light may be a light signal in any wavelength range in the visible light spectrum, for example, the red filter unit may be configured to transmit red light, which may be a light signal in the wavelength range of 620nm to 750nm in the visible light spectrum. Similarly, the color filter units of other colors are also used to transmit the light signals of the corresponding colors.
With the scheme of the embodiments of the present application, because in every filter unit group each row and each column includes 2 white filter units and 2 color filter units, and at least one diagonal includes 4 white filter units or 4 color filter units of the same color, subsequent image-processing algorithms are facilitated, image color recovery is aided, the loss of image line/detail information is reduced, and color moiré is avoided.
Alternatively, in some embodiments, the color filter units may include color filter units of three colors (a first color filter unit, a second color filter unit, and a third color filter unit). For example, they may be filter units of the three primary colors, i.e., red, green, and blue (RGB); filter units of the three complementary colors, i.e., cyan, magenta, and yellow (CMY); filter units of one primary color and two complementary colors; or filter units of one complementary color and two primary colors.
When the filter unit group includes color filter units of three colors, the color optical signals passing through them may jointly cover the visible light band; the specific colors of the color filter units are not limited in the embodiments of the present application.
For convenience of description, the color filter units in the filter unit group 211 are illustrated below as comprising color filter units of three colors, i.e., red, green, and blue; when the filter unit group comprises color filter units of three other colors, the following technical solutions apply by analogy.
Fig. 6 is a schematic arrangement diagram of filter cells in a filter cell group 211.
As shown in fig. 6, the filter unit group 211 includes white filter units 201, red filter units 202, green filter units 203, and blue filter units 204. Optionally, the ratio of the numbers of white filter units 201, green filter units 203, red filter units 202, and blue filter units 204 is 4:2:1:1.
In the filter unit group 211, the color filter units are arranged on the diagonals and the white filter units are arranged off the diagonals.
As shown in fig. 6, the 8 white filter units 201 are respectively located in the 1st row, 2nd column, the 1st row, 3rd column, the 2nd row, 1st column, the 2nd row, 4th column, the 3rd row, 1st column, the 3rd row, 4th column, the 4th row, 2nd column and the 4th row, 3rd column, and may be denoted as W12, W13, W21, W24, W31, W34, W42, and W43.
Optionally, in the filtering unit group 211, 4 filtering units on one diagonal are green filtering units, and 4 filtering units on the other diagonal include 2 red filtering units and 2 blue filtering units, where 2 red filtering units are adjacent to each other at a common vertex angle, and 2 blue filtering units are adjacent to each other at a common vertex angle.
As an example, as shown in fig. 6, the 2 red filter units 202 are respectively located in the 1st row, 4th column and the 2nd row, 3rd column, and may be denoted as R14 and R23.
The 4 green filter units 203 are respectively located in the 1st row, 1st column, the 2nd row, 2nd column, the 3rd row, 3rd column and the 4th row, 4th column, and may be denoted as G11, G22, G33, and G44.
The 2 blue filter units 204 are respectively located in the 3rd row, 2nd column and the 4th row, 1st column, and may be denoted as B32 and B41.
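As an illustration only (not part of the utility model), the fig. 6 arrangement can be written out as a small grid and checked against the row/column/diagonal constraints described above; the `check_group` helper and the letter codes are naming choices of this sketch:

```python
# Illustrative sketch of the fig. 6 filter unit group ('W' = white,
# 'R' = red, 'G' = green, 'B' = blue); rows/columns are 1-indexed in
# the patent text but 0-indexed here.
FIG6 = [
    ["G", "W", "W", "R"],
    ["W", "G", "R", "W"],
    ["W", "B", "G", "W"],
    ["B", "W", "W", "G"],
]

def check_group(g):
    """Verify: every row and column has 2 white and 2 color units; the
    +45 deg diagonal holds 4 same-color (green) units; the other diagonal
    holds 2 red and 2 blue units."""
    for i in range(4):
        row = g[i]
        col = [g[r][i] for r in range(4)]
        assert row.count("W") == 2 and col.count("W") == 2
    diag = [g[i][i] for i in range(4)]
    assert len(set(diag)) == 1  # 4 green units on one diagonal
    anti = [g[i][3 - i] for i in range(4)]
    assert anti.count("R") == 2 and anti.count("B") == 2
    return True

print(check_group(FIG6))  # True
```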
Alternatively, with the positions of the white filter units 201 in fig. 6 unchanged, the positions of the color filter units of different colors in fig. 6 may be changed according to the colors required of the color filter units on the diagonals.
For example, based on the filter unit group in fig. 6, positions of the red filter unit and the blue filter unit are changed while positions of the white filter unit and the green filter unit are kept unchanged.
For another example, based on the filter unit group in fig. 6, only the white filter unit position is kept unchanged, and the positions of the green filter unit, the red filter unit, and the blue filter unit therein are changed.
Optionally, in some embodiments, the center of the filter unit group includes a target filter unit set in a Bayer pattern.
Under the above condition, fig. 7 is a schematic arrangement diagram of filter units in several changed filter unit groups obtained from the filter unit group in fig. 6 while keeping the positions of the white filter units and the green filter units unchanged. Fig. 8 is a schematic arrangement diagram of filter units in several changed filter unit groups obtained while keeping only the positions of the white filter units unchanged.
By adopting the method of the embodiment of the application, a target filtering unit set in a Bayer format can be directly formed in one filtering unit group, and the accuracy of color recovery can be greatly improved.
Further, as shown in fig. 6, fig. 7 (a), and fig. 8 (a) and (b), since the filter units of the same color are all arranged at common vertex angles, the difficulty of the color estimation part of the subsequent image algorithm can be reduced, improving the image processing efficiency of the image sensor.
It will be appreciated that, for each of the above embodiments, performing a geometric transformation such as rotation on the filter unit group is also within the scope of the present application.
For example, when the filter unit group shown in fig. 6 is rotated by 90 ° counterclockwise or 90 ° clockwise, the filter unit group shown in (a) or (b) of fig. 8 may be formed.
In addition to fig. 6, a filter unit group formed by rotating any one of the filter unit groups in fig. 7 to 8 also belongs to the protection scope of the present application; these are not enumerated one by one.
The filter unit groups of the above embodiments are obtained by transformation based on fig. 6. In addition to the above structures, the filter unit group of the present application may also be the filter unit group 211 shown by the dashed boxes in fig. 9 to 11.
It is understood that the filter unit group 211 in fig. 9 to 11 and the filter unit group 211 in fig. 2 may form a filter unit array having the same central area, and the difference is only in the arrangement form of the filter units of one or two circles at the outermost periphery of the filter unit array 210, and therefore, the filter unit array formed by the filter unit group 211 in fig. 2, 9 to 11 may be equivalent to the same filter unit array.
Therefore, in the present application, any 4 × 4 filtering units in the filtering unit array 210 in fig. 2 may be divided into one filtering unit group, and the filtering unit group in any division case is within the protection scope of the present application.
Similarly, the filter unit array formed by any one of the filter unit groups in the present application is also within the protection scope of the present application.
Referring to fig. 9, in the filter unit group 211 shown there, white filter units are disposed on the two diagonals, color filter units are disposed off the diagonals, and color filter units of the same color are disposed at common vertex angles.
Optionally, in the filter unit group in the embodiment of the present application, the color filter units on each row and each column are different in color.
As shown in fig. 9, the 8 white filter units 201 are respectively located in the 1st row, 1st column, the 1st row, 4th column, the 2nd row, 2nd column, the 2nd row, 3rd column, the 3rd row, 2nd column, the 3rd row, 3rd column, the 4th row, 1st column and the 4th row, 4th column, and may be denoted as W11, W14, W22, W23, W32, W33, W41, and W44.
Optionally, in the filter unit group in the embodiment of the present application, the color filter units of the same color are adjacently disposed at a common vertex angle.
As an example, as shown in fig. 9, the 2 red filter units 202 are respectively located in the 3rd row, 4th column and the 4th row, 3rd column, and may be denoted as R34 and R43.
The 4 green filter units 203 are respectively located in the 1st row, 3rd column, the 2nd row, 4th column, the 3rd row, 1st column and the 4th row, 2nd column, and may be denoted as G13, G24, G31, and G42.
The 2 blue filter units 204 are respectively located in the 1st row, 2nd column and the 2nd row, 1st column, and may be denoted as B12 and B21.
Similar to the above embodiment, with the positions of the white filter units in fig. 9 unchanged, the positions of the color filter units of different colors can be changed to obtain a plurality of filter unit groups.
For example, as shown in fig. 12 (a) to (c), in the several filter unit groups, filter units of the same color are arranged at common corners.
As shown in fig. 12 (d) to (g), in those filter unit groups only the green filter units are arranged at a common vertex angle.
It should be noted here that when the color filter units of the same color are arranged at a common vertex angle, that is, in the filter unit groups shown in fig. 9 and fig. 12 (a) to (c), the color pixel values corresponding to the two same-color filter units at the common vertex can be used directly in an interpolation calculation to obtain the color pixel value at the position of the white filter unit sharing that vertex, which reduces the complexity of the subsequent image processing algorithm. The specific image processing procedure is described in detail below.
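As a minimal sketch of the co-vertex interpolation mentioned here (the function name and sample values are illustrative choices, not from the patent):

```python
def covertex_interpolate(c1, c2):
    """Estimate the color value at a white filter position from the two
    same-color pixel values diagonally adjacent to it (shared vertex),
    using a plain average, the simplest calculation the text permits."""
    return (c1 + c2) / 2.0

# e.g. two same-color (say, blue) pixel values flanking a white position
print(covertex_interpolate(60.0, 70.0))  # 65.0
```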
Referring to fig. 10, the filter unit group 211 shown there includes 2 sets of 2 × 2 white filter units and 2 sets of 2 × 2 color filter units; the white sets are disposed on one diagonal of the filter unit group, and the color sets on the other diagonal.
Optionally, in this embodiment of the application, each of the 2 sets of 2 × 2 color filter units includes 2 green filter units arranged at a common vertex angle, with the other 2 positions occupied by a red filter unit and a blue filter unit.
Similar to the above embodiment, with the positions of the white filter units in fig. 10 unchanged, the positions of the color filter units of different colors can be changed to obtain a plurality of filter unit groups.
For example, as shown in fig. 13 (a) to (g), in a set of 2 × 2 color filter units in the several filter unit groups, the relative positional relationship of the green filter units is the same, and the relative positional relationship of the red filter units and the blue filter units is different.
It should be noted that when, in the 2 × 2 color filter unit sets, the relative positions of the green filter units are the same while the relative positions of the red and blue filter units differ, that is, in the filter unit groups shown in fig. 10 and fig. 13 (a) to (g), the color pixel values corresponding to the two same-color filter units at a common vertex angle can be used directly in an interpolation calculation to obtain the color pixel value at the position of the white filter unit sharing that vertex, thereby reducing the complexity of the subsequent image processing algorithm.
Referring to fig. 11, the filter unit group 211 shown there is obtained by dividing the filter unit array 210 in another manner. Similarly, with the positions of the white filter units unchanged, the positions of the color filter units in the group can be changed in the manner above to obtain a plurality of changed filter unit groups.
For example, as shown in fig. 14 (a) to (h), in the several filter unit groups, three green filter units are disposed at a common vertex, another green filter unit is located at a corner of the filter unit group, and in the several filter unit groups, a pair of red filter units or a pair of blue filter units are disposed at a common vertex.
It is understood that the filter unit group structure obtained by performing geometric transformation such as rotation or symmetry on the various filter unit groups in fig. 10 to 14 is also within the protection scope of the present application.
The basic structure of the image sensor 200 and the arrangement of the various filter unit groups 211 therein in the present application are described above with reference to fig. 2 to 14, and the image processing method for the image sensor 200 in the present application is described below with reference to fig. 15 and 16.
Fig. 15 shows a schematic flow chart of an image processing method. Fig. 16 shows schematic images produced by the image processing method of fig. 15.
As shown in fig. 15, the image processing method 10 includes:
s110: an image formed by the pixel unit array is sub-sampled, and a first sampling diagram comprising color pixel values and a second sampling diagram comprising white pixel values are obtained.
As an example, the 1# diagram in fig. 16 is an image produced by 4 × 4 pixel units in a pixel unit array, the 4 × 4 pixel units being those corresponding to one filter unit group in the above embodiments.
In diagram # 1 in fig. 16, a white pixel value 101 is a pixel value generated after a white pixel unit receives a white light signal, and a white filter unit 201 is correspondingly arranged above the white pixel value, and a red pixel value 102 is a pixel value generated after a red pixel unit receives a red light signal, and a red filter unit 202 is correspondingly arranged above the red pixel value; similarly, the green pixel value 103 is a pixel value generated by the green pixel unit receiving the green light signal, above which the green filter unit 203 is correspondingly disposed, and the blue pixel value 104 is a pixel value generated by the blue pixel unit receiving the blue light signal, above which the blue filter unit 204 is correspondingly disposed.
The 2# diagram in fig. 16 is the first sampling diagram obtained by sub-sampling the red, green, and blue pixel values in the 1# diagram, and the 3# diagram in fig. 16 is the second sampling diagram obtained by sub-sampling the white pixel values in the 1# diagram. In the first and second sampling diagrams, the relative positional relationship of the pixel values is consistent with that in the original 1# diagram.
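Under the fig. 6 layout, step S110 can be sketched as masking the raw mosaic into two sparse maps; numpy, the `CFA` grid, and the stand-in pixel values are assumptions of this illustration, not part of the patent:

```python
import numpy as np

# Color-filter layout of one 4 x 4 group, following the fig. 6 arrangement
# ('W' white, 'R' red, 'G' green, 'B' blue).
CFA = np.array([["G", "W", "W", "R"],
                ["W", "G", "R", "W"],
                ["W", "B", "G", "W"],
                ["B", "W", "W", "G"]])

raw = np.arange(16, dtype=float).reshape(4, 4)  # stand-in pixel values

white_mask = (CFA == "W")
first_map = np.where(~white_mask, raw, np.nan)   # color pixel values only
second_map = np.where(white_mask, raw, np.nan)   # white pixel values only

# Positions keep their original row/column relationship, as the text notes.
print(int(np.count_nonzero(~np.isnan(first_map))))   # 8
print(int(np.count_nonzero(~np.isnan(second_map))))  # 8
```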
S120: and performing interpolation processing on the first sampling image and the second sampling image to obtain a first image and a second image.
As an example, the 4# diagram in fig. 16 is the first image obtained by interpolating the first sampling diagram (the 2# diagram). Specifically, the pixel value in the i-th row and j-th column of the 4# diagram can be represented as Xij, where X represents the color of the pixel value; for example, the green pixel value at the top left, in row 1, column 1, can be represented as G11. G12 and G21 can be obtained by interpolating G11 and G22; similarly, R13 and R24 can be obtained by interpolating R14 and R23, B31 and B42 by interpolating B32 and B41, and G34 and G43 by interpolating G33 and G44. The first image is obtained by interpolating the pixel values in the first sampling diagram in this manner.
It can be seen that in this first image, the 8 pixel values in the dashed box in the figure, i.e., G11, R14, G22, R23, B32, G33, B41, and G44, are pixel values from the original image and retain its color information, which facilitates color restoration in subsequent image processing. Moreover, the interpolation process is simple, which simplifies the image processing algorithm and improves the image processing efficiency.
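The diagonal-pair averaging described above can be sketched as follows; the array values, the `interpolate_pairs` helper, and the plain-average kernel are illustrative assumptions (real implementations may use more elaborate interpolation):

```python
import numpy as np

# Sparse first sampling map for a fig. 6 group: original color pixel
# values (stand-in numbers) at their positions, NaN elsewhere.
n = np.nan
first_map = np.array([
    [100.0, n,     n,     40.0],   # G11, -,   -,   R14
    [n,     110.0, 50.0,  n   ],   # -,   G22, R23, -
    [n,     60.0,  120.0, n   ],   # -,   B32, G33, -
    [70.0,  n,     n,     130.0],  # B41, -,   -,   G44
])

def interpolate_pairs(m, pairs):
    """For each ((r1, c1), (r2, c2), targets) entry, write the average of
    the two known same-color values into the listed target positions."""
    out = m.copy()
    for (r1, c1), (r2, c2), targets in pairs:
        avg = (m[r1, c1] + m[r2, c2]) / 2.0
        for (tr, tc) in targets:
            out[tr, tc] = avg
    return out

PAIRS = [
    ((0, 0), (1, 1), [(0, 1), (1, 0)]),  # G12, G21 from G11, G22
    ((0, 3), (1, 2), [(0, 2), (1, 3)]),  # R13, R24 from R14, R23
    ((2, 1), (3, 0), [(2, 0), (3, 1)]),  # B31, B42 from B32, B41
    ((2, 2), (3, 3), [(2, 3), (3, 2)]),  # G34, G43 from G33, G44
]
first_image = interpolate_pairs(first_map, PAIRS)
print(first_image[0, 1], first_image[1, 0])  # 105.0 105.0
```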
In addition, in the embodiment of the present application, the green pixel values all lie on one diagonal of the first sampling diagram, and the white pixel values lie on the diagonals of the second sampling diagram; this reduces the loss of image texture/image detail information when recovering a high-resolution image and performing color recovery.
The 5# diagram in fig. 16 is the second image obtained by interpolating the second sampling diagram (the 3# diagram); the interpolation may use any existing interpolation algorithm, which is not limited in this embodiment of the present application. It will be appreciated that the second sampling diagram and the second image are not used to characterize color information in the image but to enhance its brightness. In the embodiment of the application, the white pixel values are uniformly distributed in the image and have a higher spatial sampling rate, which is beneficial to improving the resolution and quality of the image.
S130: and performing mosaic rearrangement processing on the first image to obtain a third image.
As an example, the 4# diagram in fig. 16 is subjected to re-mosaic (remosaic) processing to obtain the third image shown in the 6# diagram, which is a Bayer-format data image. Any existing rearrangement method may be used for the re-mosaic processing; this is not specifically limited in the embodiment of the present application.
Because the third image after the mosaic rearrangement processing is in a Bayer format which is commonly used in the field of image processing at present, the third image can be suitable for more types of Image Signal Processors (ISPs), so that the image sensor can be adapted to more ISPs in the application and is suitable for more application scenes.
S140: and fusing the third image and the second image to obtain an optimized color image.
Optionally, the resolution of the third image is the same as the resolution of the second image.
As an example, the 5# and 6# diagrams in fig. 16 are fused to obtain an optimized color image (the 7# diagram). The fused color image better maintains brightness information while preserving image color information, so image quality under low-light conditions can be effectively improved. Meanwhile, in this color image the luminance and chrominance information are not easily overlapped in the frequency domain, so moire fringes are not easily generated.
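One plausible way to realize the fusion of S140, offered only as a sketch and not as the patent's prescribed algorithm, is to treat the interpolated white image as a luminance plane and rescale the color image by it (the `fuse` helper and sample values are assumptions of this illustration):

```python
import numpy as np

def fuse(rgb, white):
    """Fuse a Bayer-derived RGB image (H x W x 3) with a full-resolution
    white/luminance image (H x W): keep the RGB ratios (chrominance) and
    take the brightness from the white channel."""
    luma = rgb.mean(axis=-1, keepdims=True)  # rough luminance of the RGB image
    gain = np.divide(white[..., None], luma,
                     out=np.ones_like(luma), where=luma > 0)
    return np.clip(rgb * gain, 0.0, 255.0)

rgb = np.full((2, 2, 3), 50.0)    # dim but correctly colored image
white = np.full((2, 2), 100.0)    # brighter white-pixel image
out = fuse(rgb, white)
print(out[0, 0].tolist())  # [100.0, 100.0, 100.0]
```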
It is understood that the above-mentioned image processing method 10 can be implemented by a processor or a processing circuit, in other words, optionally, in the above-mentioned image sensor, a processing unit for implementing the above-mentioned image processing method 10 can also be included.
In addition to the image sensor 200 provided in the embodiments of the application, the present application also provides an electronic device, which may include the image sensor 200 in any of the embodiments described above.
The electronic device may be any electronic device having an image capturing function, for example, the electronic device may specifically be a mobile terminal such as a mobile phone and a computer, a shooting device such as a camera and a video camera, an Automatic Teller Machine (ATM), and the like.
Alternatively, the processing unit for executing the image processing method 10 may be located not in the image sensor 200 but in the electronic device that contains it. For example, if the electronic device is a mobile phone, the processing unit may be the image signal processing unit in the phone's processor, or a separate image signal processing chip in the phone; the embodiment of the present application does not limit the specific hardware form of the processing unit.
It should be understood that the processor of the embodiments of the present application may be an integrated circuit chip having signal processing capabilities. In implementation, the steps of the above method embodiments may be performed by integrated logic circuits of hardware in a processor or by instructions in the form of software. The processor may be a general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or discrete hardware components, and may implement or perform the methods, steps, and logic blocks disclosed in the embodiments of the present application. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor. The steps of the methods disclosed in connection with the embodiments of the present application may be implemented directly by a hardware decoding processor, or by a combination of hardware and software modules in a decoding processor. The software module may be located in a storage medium well known in the art, such as RAM, flash memory, ROM, PROM or EEPROM, or a register. The storage medium is located in a memory, and the processor reads the information in the memory and completes the steps of the above methods in combination with its hardware.
It will be appreciated that the image sensor of embodiments of the application may also include memory, which may be volatile memory or non-volatile memory, or may include both volatile and non-volatile memory. It should be noted that the memory of the systems and methods described herein is intended to comprise, without being limited to, these and any other suitable types of memory.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described systems, apparatuses and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the several embodiments provided in the present application, it should be understood that the disclosed system, apparatus and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the units is only one logical division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solutions of the present application may be embodied in the form of a software product, which is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the methods described in the embodiments of the present application. And the aforementioned storage medium includes: various media capable of storing program codes, such as a usb disk, a removable hard disk, a read-only memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
The above description is only for the specific embodiments of the present application, but the scope of the present application is not limited thereto, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the present application, and shall be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (30)

1. An image sensor, comprising:
a filter unit array including a plurality of filter unit groups, each of the plurality of filter unit groups including 4 × 4 filter units; in the 4 × 4 filter units, each row and each column includes 2 white filter units and 2 color filter units, and at least one diagonal includes 4 white filter units or 4 color filter units of the same color;
a pixel unit array comprising a plurality of pixel units, the pixel unit array being located below the filter unit array, and the plurality of pixel units in the pixel unit array corresponding one-to-one to the plurality of filter units in the filter unit array.
2. The image sensor according to claim 1, wherein the filter unit group includes first, second, and third color filter units of different colors, and a number of the first color filter units is equal to a sum of numbers of the second and third color filter units.
3. The image sensor according to claim 2, wherein in the filter unit group, the number of the second color filter units is equal to the number of the third color filter units.
4. The image sensor according to claim 3, wherein in the filter unit group, the color filter units are arranged at diagonal positions and the white filter units are arranged at off-diagonal positions.
5. The image sensor according to claim 4, wherein in the set of filter units, 4 filter units on one diagonal are the first color filter units, and 4 filter units on the other diagonal include 2 second color filter units and 2 third color filter units.
6. The image sensor as claimed in claim 5, wherein on the other diagonal line, 2 of the second color filter units are disposed adjacent to each other at a common vertex, and 2 of the third color filter units are disposed adjacent to each other at a common vertex.
7. The image sensor according to claim 6, wherein in the filter unit group, the white filter units are respectively located in a first row and a second column, a first row and a third column, a second row and a first column, a second row and a fourth column, a third row and a first column, a third row and a fourth column, a fourth row and a second column, and a fourth row and a third column;
the second color filter units are respectively positioned in a fourth column in the first row and a third column in the second row;
the first color filter units are respectively positioned in a first row and a first column, a second row and a second column, a third row and a third column, and a fourth row and a fourth column;
the third color filter units are respectively positioned in the third row, the second column and the fourth row, the first column.
8. The image sensor according to claim 5, wherein 2 of the second color filter units are spaced apart from 2 of the third color filter units on the other diagonal line.
9. The image sensor of claim 8, wherein in the set of filter units, the white filter units are respectively located in a first row and a second column, a first row and a third column, a second row and a first column, a second row and a fourth column, a third row and a first column, a third row and a fourth column, a fourth row and a second column, and a fourth row and a third column;
the second color filter units are respectively positioned in a fourth column of the first row and a second column of the third row;
the first color filter units are respectively positioned in a first row and a first column, a second row and a second column, a third row and a third column, and a fourth row and a fourth column;
the third color filter units are respectively positioned in the third column of the second row and the first column of the fourth row.
10. The image sensor according to claim 3, wherein in the filter unit group, the white filter units are arranged at diagonal positions and the color filter units are arranged at off-diagonal positions.
11. The image sensor of claim 10, wherein the color filter cells in each row and each column are different in color in the set of filter cells.
12. The image sensor of claim 11, wherein in the filter unit group, the color filter units of the same color are adjacently disposed at a common vertex angle.
13. The image sensor of claim 12, wherein in the set of filter units, the white filter units are respectively located in a first row and a first column, a first row and a fourth column, a second row and a second column, a second row and a third column, a third row and a second column, a third row and a third column, a fourth row and a first column, and a fourth row and a fourth column;
the second color filter units are respectively positioned in a third column in the first row and a fourth column in the second row;
the first color filter units are respectively positioned in a first row, a second column, a second row, a first column, a third row, a fourth column and a fourth row, a third column;
the third color filter units are respectively positioned in a third row, a first column and a fourth row, a second column.
14. The image sensor according to claim 3, wherein the filter unit group includes 2 × 2 white filter unit sets and 2 × 2 color filter unit sets, the 2 × 2 white filter unit sets are disposed on one diagonal of the filter unit group, and the 2 × 2 color filter unit sets are disposed on the other diagonal of the filter unit group.
15. The image sensor of claim 14, wherein each of the 2 × 2 color filter unit sets comprises 2 first color filter units arranged at a common vertex angle, and 1 second color filter unit and 1 third color filter unit arranged at the other common vertex angle.
16. The image sensor according to claim 15, wherein the relative positional relationship of the first color filter units is the same in the 2 sets of 2 x 2 color filter units.
17. The image sensor of claim 16, wherein in the 2 x 2 color filter sets, the relative positional relationship of the second color filter is different, and the relative positional relationship of the third color filter is different.
18. The image sensor of claim 17, wherein in the set of filter units, the white filter units are respectively located in a first row and a third column, a first row and a fourth column, a second row and a third column, a second row and a fourth column, a third row and a first column, a third row and a second column, a fourth row and a first column, and a fourth row and a second column;
the second color filter units are respectively positioned in the second column of the first row and the third column of the fourth row;
the first color filter units are respectively positioned in a first row and a first column, a second row and a first column, a third row and a third column, and a fourth row and a fourth column;
the third color filter units are respectively positioned in the second row, first column and the third row, fourth column.
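One way to read the arrangement recited in claims 14–18 is as the following 4 × 4 mosaic (a sketch, written 0-indexed where the claims are 1-indexed; the G/R/B letter assignment follows claim 21 and is otherwise illustrative):

```python
# Sketch of the 4x4 filter-unit group of claims 14-18. W = white filter
# unit; G, R and B stand in for the first, second and third color filter
# units under the red/green/blue assignment of claim 21.
PATTERN = [
    ["G", "R", "W", "W"],
    ["B", "G", "W", "W"],
    ["W", "W", "G", "B"],
    ["W", "W", "R", "G"],
]

# Claim 14: the two 2x2 white sets occupy one diagonal of the group
# (top-right and bottom-left), the two 2x2 color sets the other.
white = {(r, c) for r in range(4) for c in range(4) if PATTERN[r][c] == "W"}
assert white == {(0, 2), (0, 3), (1, 2), (1, 3),
                 (2, 0), (2, 1), (3, 0), (3, 1)}

# Claim 15: in each 2x2 color set the two G units share only a vertex.
assert PATTERN[0][0] == PATTERN[1][1] == "G"   # top-left color set
assert PATTERN[2][2] == PATTERN[3][3] == "G"   # bottom-right color set
```

Note that the R and B positions differ between the two color sets, consistent with claim 17, while the G diagonal is the same in both, consistent with claim 16.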
19. The image sensor according to any one of claims 2 to 18, wherein the first color filter unit, the second color filter unit, and the third color filter unit are configured to respectively pass optical signals of three colors whose wavelength bands together cover the visible light band.
20. The image sensor of claim 19, wherein the first color filter unit, the second color filter unit, and the third color filter unit are filter units of three colors selected from red, green, blue, cyan, magenta, and yellow.
21. The image sensor of claim 20, wherein the first color filter unit is a green filter unit, the second color filter unit and the third color filter unit are a red filter unit and a blue filter unit, respectively.
22. The image sensor of any one of claims 1 to 18, further comprising:
a microlens array comprising a plurality of microlenses, the microlens array being located above the filter unit array and configured to converge optical signals returned from the photographic subject onto the filter unit array.
23. The image sensor of claim 22, wherein the plurality of microlenses in the microlens array correspond one-to-one to the plurality of filter units in the filter unit array.
24. The image sensor of claim 22, wherein the microlens array includes at least one first microlens and at least one second microlens,
the first microlens corresponding to one white filter unit in the filter unit array,
the second microlens corresponding to four color filter units in the filter unit array.
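The two microlens-to-filter mappings of claims 22–24 can be sketched as follows; the function name and coordinate scheme are illustrative, not from the patent:

```python
# Sketch of the microlens footprints of claim 24: a "first" microlens
# covers exactly one white filter unit, a "second" microlens covers a
# 2x2 group of four color filter units. Coordinates are illustrative.
def microlens_footprint(kind, row, col):
    """Return the filter-unit positions covered by a microlens anchored
    at (row, col)."""
    if kind == "first":
        return [(row, col)]
    if kind == "second":
        return [(row, col), (row, col + 1),
                (row + 1, col), (row + 1, col + 1)]
    raise ValueError(f"unknown microlens kind: {kind}")

assert len(microlens_footprint("first", 0, 2)) == 1
assert len(microlens_footprint("second", 0, 0)) == 4
```

Under claim 23, by contrast, the mapping is simply one microlens per filter unit for the entire array.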
25. The image sensor according to any one of claims 1 to 18, wherein pixel values of color pixel units in the pixel unit array are used to generate first image data of the photographic subject, pixel values of white pixel units in the pixel unit array are used to generate second image data of the photographic subject, and the first image data and the second image data are used to synthesize a target image of the photographic subject;
the white pixel units are the pixel units corresponding to the white filter units, and the color pixel units are the pixel units corresponding to the color filter units.
26. The image sensor according to claim 25, wherein pixel values of the color pixel units in the pixel unit array are used to generate an intermediate image through interpolation processing, the intermediate image being used to generate the first image data in Bayer format through demosaic processing.
27. The image sensor according to claim 26, wherein among every 2 × 2 pixel values of the intermediate image, 2 pixel values are original pixel values of color pixel units and the other 2 are pixel values obtained by the interpolation processing.
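Claims 26–27 leave the interpolation kernel open. A minimal sketch, assuming a 4-neighbour average over a periodically tiled arrangement in which color and white pixels alternate within every 2 × 2 block (so that, per claim 27, 2 of every 2 × 2 values in the intermediate image are original and 2 are interpolated):

```python
# Illustrative tile: "C" marks a color pixel, "W" a white pixel. Color and
# white alternate within every 2x2 block, matching claim 27's split of
# 2 original + 2 interpolated values per 2x2 region.
PATTERN = [
    ["C", "W", "C", "W"],
    ["W", "C", "W", "C"],
    ["C", "W", "C", "W"],
    ["W", "C", "W", "C"],
]

def build_intermediate(raw):
    """Claim 26 sketch: keep color-pixel values and fill white positions
    with the mean of the four adjacent color pixels. The averaging kernel
    is an assumption (the claims do not fix one), and the tile is treated
    as periodic, so indices wrap."""
    n = len(PATTERN)
    out = [row[:] for row in raw]
    for r in range(n):
        for c in range(n):
            if PATTERN[r][c] == "W":
                nbrs = [raw[rr % n][cc % n]
                        for rr, cc in ((r - 1, c), (r + 1, c),
                                       (r, c - 1), (r, c + 1))
                        if PATTERN[rr % n][cc % n] == "C"]
                out[r][c] = sum(nbrs) / len(nbrs)
    return out

raw = [[4 * r + c for c in range(4)] for r in range(4)]
inter = build_intermediate(raw)
assert inter[0][0] == 0      # original color value is kept
assert inter[0][1] == 5.0    # (13 + 5 + 0 + 2) / 4, neighbours wrap
```

The demosaic step that then turns this intermediate image into Bayer-format first image data is standard and not sketched here.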
28. The image sensor of claim 25, wherein the first image data and the second image data are of the same resolution.
29. The image sensor of any one of claims 1 to 18, wherein the image sensor is a Complementary Metal Oxide Semiconductor (CMOS) image sensor or a Charge Coupled Device (CCD) image sensor.
30. An electronic device, comprising:
the image sensor of any one of claims 1 to 29.
CN202021297709.XU 2020-05-15 2020-07-03 Image sensor and electronic device Active CN212435794U (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN2020104106392 2020-05-15
CN202010410639 2020-05-15

Publications (1)

Publication Number Publication Date
CN212435794U true CN212435794U (en) 2021-01-29

Family

ID=72202804

Family Applications (11)

Application Number Title Priority Date Filing Date
CN202021303789.5U Active CN212752379U (en) 2020-05-15 2020-07-03 Image sensor and electronic device
CN202010635332.2A Pending CN111756972A (en) 2020-05-15 2020-07-03 Image sensor and electronic device
CN202010636571.XA Pending CN111756973A (en) 2020-05-15 2020-07-03 Image sensor and electronic device
CN202021297708.5U Active CN212435793U (en) 2020-05-15 2020-07-03 Image sensor and electronic device
CN202021297709.XU Active CN212435794U (en) 2020-05-15 2020-07-03 Image sensor and electronic device
CN202010637147.7A Pending CN111756974A (en) 2020-05-15 2020-07-03 Image sensor and electronic device
CN202010708333.5A Active CN111614886B (en) 2020-05-15 2020-07-22 Image sensor and electronic device
CN202010724146.6A Pending CN111654615A (en) 2020-05-15 2020-07-24 Image sensor and electronic device
CN202021510460.6U Active CN212752389U (en) 2020-05-15 2020-07-24 Image sensor and electronic device
CN202010724148.5A Pending CN111629140A (en) 2020-05-15 2020-07-24 Image sensor and electronic device
CN202021508422.7U Active CN212785522U (en) 2020-05-15 2020-07-24 Image sensor and electronic device

Family Applications Before (4)

Application Number Title Priority Date Filing Date
CN202021303789.5U Active CN212752379U (en) 2020-05-15 2020-07-03 Image sensor and electronic device
CN202010635332.2A Pending CN111756972A (en) 2020-05-15 2020-07-03 Image sensor and electronic device
CN202010636571.XA Pending CN111756973A (en) 2020-05-15 2020-07-03 Image sensor and electronic device
CN202021297708.5U Active CN212435793U (en) 2020-05-15 2020-07-03 Image sensor and electronic device

Family Applications After (6)

Application Number Title Priority Date Filing Date
CN202010637147.7A Pending CN111756974A (en) 2020-05-15 2020-07-03 Image sensor and electronic device
CN202010708333.5A Active CN111614886B (en) 2020-05-15 2020-07-22 Image sensor and electronic device
CN202010724146.6A Pending CN111654615A (en) 2020-05-15 2020-07-24 Image sensor and electronic device
CN202021510460.6U Active CN212752389U (en) 2020-05-15 2020-07-24 Image sensor and electronic device
CN202010724148.5A Pending CN111629140A (en) 2020-05-15 2020-07-24 Image sensor and electronic device
CN202021508422.7U Active CN212785522U (en) 2020-05-15 2020-07-24 Image sensor and electronic device

Country Status (2)

Country Link
CN (11) CN212752379U (en)
WO (1) WO2021227250A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114157795A (en) * 2021-12-14 2022-03-08 Oppo广东移动通信有限公司 Image sensor, camera module, electronic equipment, image generation method and device

Families Citing this family (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114845015A (en) * 2020-10-15 2022-08-02 Oppo广东移动通信有限公司 Image sensor, control method, imaging apparatus, terminal, and readable storage medium
CN112312097B (en) * 2020-10-29 2023-01-24 维沃移动通信有限公司 Sensor with a sensor element
CN112822466A (en) * 2020-12-28 2021-05-18 维沃移动通信有限公司 Image sensor, camera module and electronic equipment
CN113037980A (en) * 2021-03-23 2021-06-25 北京灵汐科技有限公司 Pixel sensing array and vision sensor
CN115225832A (en) * 2021-04-21 2022-10-21 海信集团控股股份有限公司 Image acquisition equipment, image encryption processing method, equipment and medium
CN113540138B (en) * 2021-06-03 2024-03-12 奥比中光科技集团股份有限公司 Multispectral image sensor and imaging module thereof
CN113676651B (en) * 2021-08-25 2023-05-26 维沃移动通信有限公司 Image sensor, control method, control device, electronic apparatus, and storage medium
CN113676652B (en) * 2021-08-25 2023-05-26 维沃移动通信有限公司 Image sensor, control method, control device, electronic apparatus, and storage medium
CN113852797A (en) * 2021-09-24 2021-12-28 昆山丘钛微电子科技股份有限公司 Color filter array, image sensor and camera module
CN114125318A (en) * 2021-11-12 2022-03-01 Oppo广东移动通信有限公司 Image sensor, camera module, electronic equipment, image generation method and device
CN114125240A (en) * 2021-11-30 2022-03-01 维沃移动通信有限公司 Image sensor, camera module, electronic equipment and shooting method
CN114363486A (en) * 2021-12-14 2022-04-15 Oppo广东移动通信有限公司 Image sensor, camera module, electronic equipment, image generation method and device
CN114823985B (en) * 2022-05-31 2024-04-09 深圳市聚飞光电股份有限公司 Photoelectric sensor and packaging method thereof
CN115696078B (en) * 2022-08-01 2023-09-01 荣耀终端有限公司 Color filter array, image sensor, camera module and electronic equipment

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9479745B2 (en) * 2014-09-19 2016-10-25 Omnivision Technologies, Inc. Color filter array with reference pixel to reduce spectral crosstalk
TWI552594B (en) * 2014-10-27 2016-10-01 聯詠科技股份有限公司 Color filter array for image sensing device and manufacturing method thereof
CN105282529B (en) * 2015-10-22 2018-01-16 浙江宇视科技有限公司 A kind of digital wide dynamic approach and device based on RAW spaces
CN105516700B (en) * 2015-12-18 2018-01-19 广东欧珀移动通信有限公司 Imaging method, imaging device and the electronic installation of imaging sensor
CN105578078B (en) * 2015-12-18 2018-01-19 广东欧珀移动通信有限公司 Imaging sensor, imaging device, mobile terminal and imaging method
CN105516697B (en) * 2015-12-18 2018-04-17 广东欧珀移动通信有限公司 Imaging sensor, imaging device, mobile terminal and imaging method
CN105578071B (en) * 2015-12-18 2018-03-20 广东欧珀移动通信有限公司 Imaging method, imaging device and the electronic installation of imaging sensor
CN105430359B (en) * 2015-12-18 2018-07-10 广东欧珀移动通信有限公司 Imaging method, imaging sensor, imaging device and electronic device
WO2017101864A1 (en) * 2015-12-18 2017-06-22 广东欧珀移动通信有限公司 Image sensor, control method, and electronic device
CN107105140B (en) * 2017-04-28 2020-01-24 Oppo广东移动通信有限公司 Dual-core focusing image sensor, focusing control method thereof and imaging device


Also Published As

Publication number Publication date
CN111756973A (en) 2020-10-09
CN212435793U (en) 2021-01-29
CN111614886A (en) 2020-09-01
CN111629140A (en) 2020-09-04
CN212752389U (en) 2021-03-19
WO2021227250A1 (en) 2021-11-18
CN111654615A (en) 2020-09-11
CN111614886B (en) 2021-10-19
CN111756972A (en) 2020-10-09
CN212785522U (en) 2021-03-23
CN212752379U (en) 2021-03-19
CN111756974A (en) 2020-10-09

Similar Documents

Publication Publication Date Title
CN212435794U (en) Image sensor and electronic device
US10032810B2 (en) Image sensor with dual layer photodiode structure
EP2022258B1 (en) Image sensor with improved light sensitivity
US8456553B2 (en) Color imaging element
CN111314592B (en) Image processing method, camera assembly and mobile terminal
CN111246064B (en) Image processing method, camera assembly and mobile terminal
CN110649057B (en) Image sensor, camera assembly and mobile terminal
CN111757006A (en) Image acquisition method, camera assembly and mobile terminal
TW201031188A (en) Image capture device
CN210143059U (en) Image sensor integrated circuit, image sensor, and imaging system
US9185375B2 (en) Color imaging element and imaging device
US20200267291A1 (en) Reduced optical crosstalk plenoptic imaging device, corresponding method, computer program product, computer-readable carrier medium and apparatus
EP2800376B1 (en) Imaging device, method for controlling imaging device, and control program
US8711257B2 (en) Color imaging device
EP4246959A1 (en) Image sensor and imaging apparatus
CN114666469B (en) Image processing device, method and lens module with image processing device
JP2016034055A (en) Image processing apparatus, image processing method, and imaging apparatus
US20140320710A1 (en) Imaging device, method for controlling imaging device, and storage medium storing a control program
CN114080795A (en) Image sensor and electronic device
CN114008781A (en) Image sensor, camera assembly and mobile terminal
JP2009009971A (en) Solid-state imaging apparatus
US20140307136A1 (en) Imaging device, method for controlling imaging device, and storage medium storing a control program
JP2005210359A (en) Two-ccd type color solid-state imaging apparatus and digital camera
JP2008252020A (en) Solid-state imaging element

Legal Events

Date Code Title Description
GR01 Patent grant