CN115988299A - Image sensor sharing microlens to partially shield phase focusing and control method thereof - Google Patents

Image sensor sharing microlens to partially shield phase focusing and control method thereof

Info

Publication number
CN115988299A
CN115988299A (application CN202111188578.0A)
Authority
CN
China
Prior art keywords
focusing
pixel
pixel units
unit
pixels
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111188578.0A
Other languages
Chinese (zh)
Inventor
俞恩杰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
SmartSens Technology Shanghai Co Ltd
Original Assignee
SmartSens Technology Shanghai Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by SmartSens Technology Shanghai Co Ltd filed Critical SmartSens Technology Shanghai Co Ltd
Priority to CN202111188578.0A
Publication of CN115988299A
Legal status: Pending

Landscapes

  • Color Television Image Signal Generators (AREA)
  • Transforming Light Signals Into Electric Signals (AREA)

Abstract

The invention provides an image sensor that performs partially shielded phase focusing with a shared microlens, and a control method thereof. The four pixels of each pixel unit share one color filter and one microlens and are used in a four-in-one (binned) mode. At least one pair of pixel units is configured as focusing pixel units, and the remaining units are non-focusing (normal) pixel units. During circuit readout, a non-focusing pixel unit directly reads out its luminance information in four-in-one mode, while a focusing pixel unit, being half shielded, reads out only half of its information; the information of all focusing pixel units across the pixel array is combined to obtain the image phase information and thereby achieve automatic focusing. Phase information and luminance information are thus obtained with a single circuit readout, so the frame rate is not halved, which effectively improves the video frame rate of the image sensor when phase-detection autofocus is used during video capture.

Description

Image sensor with shared microlens for partially shielded phase focusing and control method thereof
Technical Field
The invention relates to the technical field of image sensors, and in particular to an image sensor that performs partially shielded phase focusing with a shared microlens, and a control method thereof.
Background
The image capturing device generally includes an image sensor and a lens. The lens focuses light onto an image sensor to form an image, and the image sensor converts light signals into electrical signals. The electrical signals are output from the image capture device to other components of the electronic system. The image acquisition device and other components of the electronic system constitute an imaging system. Image sensors have become popular and are found in a variety of electronic systems, such as cell phones, digital cameras, medical devices, and computers.
A typical image sensor includes a plurality of light-sensitive picture elements ("pixels") arranged in a two-dimensional array. Forming a color filter array (CFA) over the pixels enables the image sensor to produce a color image. Typically, each pixel is covered by a single-color filter, which in turn is covered by a microlens that focuses light onto a photodiode. The pixel array is built from repeating pixel modules; in the well-known Bayer-pattern CFA, each module is a 2 × 2 array of four pixels covered by one red, one blue, and two green filters. The technology for fabricating image sensors, particularly complementary metal-oxide-semiconductor (CMOS) image sensors, continues to advance.
At present, a shared microlens is used to provide a full-pixel phase-detection autofocus function for an image sensor. However, because the phase focusing information and the pixel luminance information (i.e., the image information) must be read out separately, the readout mode is not flexible, and conventional layout designs make it difficult to flexibly acquire the required information. In addition, in practical use the video frame rate is halved, and for a high-resolution image sensor the video frame rate is a critical and demanding specification.
Disclosure of Invention
In view of the above-mentioned shortcomings of the prior art, an object of the present invention is to provide an image sensor that performs partially shielded phase focusing with a shared microlens, and a control method thereof, so as to solve the problems of the prior art in which an image sensor using a shared microlens to provide full-pixel phase-detection autofocus has an inflexible readout mode and a reduced frame rate when phase-detection autofocus is used.
To achieve the above and other related objects, the present invention provides an image sensor for partially shielded phase focusing using a shared microlens, the image sensor comprising:
a pixel array composed of a plurality of pixel unit groups, each pixel unit group comprising four pixel units arranged in a 2 × 2 pattern, each pixel unit comprising a plurality of pixels arranged in an n × n pattern with n ≥ 2, wherein all the pixels of each pixel unit correspond to filters of the same color and share one microlens;
the pixel units comprise focusing pixel units and non-focusing pixel units; a focusing pixel unit is formed by setting part of its pixels as shielded pixels; the focusing pixel units are arranged in pairs and are placed in pixel units of the same color; and the focusing pixel units are used to acquire phase focusing information.
Optionally, the shielded pixel is formed by forming a light-shielding material layer over the photoelectric conversion element in the shielded pixel.
Optionally, the pixels in the focusing pixel unit are divided into an upper part and a lower part, and the upper-side pixels and the lower-side pixels are respectively set as the shielded pixels to form focusing pixel units arranged in pairs; or the pixels in the focusing pixel unit are divided into a left part and a right part, and the left-side pixels and the right-side pixels are respectively set as the shielded pixels to form focusing pixel units arranged in pairs; or the pixels in the focusing pixel unit are divided into two parts on either side of a diagonal, and the pixels on the two sides of the diagonal are respectively set as the shielded pixels to form focusing pixel units arranged in pairs; or the pixels in the focusing pixel unit are divided into two parts on either side of a dividing line passing through the center of the unit, and the pixels on the two sides of the dividing line are respectively set as the shielded pixels to form focusing pixel units arranged in pairs.
Optionally, the four pixel units in each pixel unit group constitute a Bayer base array, and the focusing pixel units arranged in pairs are respectively placed in different Bayer base arrays.
Further, the Bayer base array includes a Bayer RGB array comprising a blue pixel unit, a first green pixel unit, a second green pixel unit, and a red pixel unit; the pair of focusing pixel units is formed by replacing the blue filters of the blue pixel units in two Bayer RGB arrays with auxiliary filters and then setting part of the pixels in those units as the shielded pixels.
Further, the auxiliary filter includes any one of a green filter, a cyan filter, and a white filter.
Further, the pixel array comprises a plurality of repeating units each consisting of sixteen Bayer base arrays arranged in a 4 × 4 pattern, and each repeating unit comprises a pair of focusing pixel units.
Further, the two focusing pixel units in each repeating unit are in the same column and spaced three rows apart within the repeating unit, or in the same row and spaced three columns apart within the repeating unit, or spaced three rows and three columns apart within the repeating unit.
Optionally, the pixel array comprises a plurality of repeating units each consisting of sixteen Bayer base arrays arranged in a 4 × 4 pattern, and each repeating unit comprises two pairs of focusing pixel units.
Further, every two of the four focusing pixel units in each repeating unit are in the same column and spaced three rows apart within the repeating unit, with three columns between the two columns in which the four focusing pixel units are located; or the four focusing pixel units in each repeating unit are spaced apart along the row direction, with every two of them in the same row and three rows between the two rows in which the four focusing pixel units are located.
Optionally, each pixel unit corresponds to the same readout circuit, so as to read out all the pixel information corresponding to the pixel unit.
The present invention also provides a control method for the above image sensor, comprising: acquiring phase focusing information based on the focusing pixel units and acquiring image information based on the non-focusing pixel units; or acquiring phase focusing information based on both the focusing pixel units and the non-focusing pixel units.
Optionally, when the pixel array includes a plurality of the repeating units and each repeating unit includes a pair of the focusing pixel units, or when each repeating unit includes two pairs of focusing pixel units arranged so that every two of the four focusing pixel units are in the same row with three rows between the two rows in which the four units are located, or every two of them are in the same column with three columns between the two columns in which they are located, the control method includes:
defining the repeating unit as being composed of four repeating subunits arranged in an array, each repeating subunit consisting of Bayer base arrays arranged in a 2 × 2 pattern, and defining the position of the pair of focusing pixel units corresponding to each repeating subunit as a reference position; in each repeating subunit, the pixel units at corresponding positions of the different Bayer base arrays are read out in a merged manner, while for the reference position the information of the pixel unit at that position is read out directly, so that the phase focusing information and the image information are obtained at the same time.
As described above, in the image sensor with a shared microlens and partially shielded phase autofocus, the four pixels of each pixel unit share one filter and one microlens and can be used in a four-in-one mode. At least one pair of pixel units is configured as focusing pixel units, and the rest are non-focusing (normal) pixel units, so that during circuit readout the non-focusing pixel units directly read out luminance information in four-in-one mode, while the focusing pixel units, being half shielded, read out only half of their information; the information of the focusing pixel units across the whole pixel array is combined to obtain the image phase information, thereby achieving autofocus.
Drawings
Fig. 1 is a schematic diagram illustrating the focusing effect of the full-pixel phase-detection autofocus mode in a conventional image sensor, in which the 2 × 2 pixels of a focusing pixel unit are divided into a left portion and a right portion for focusing.
Fig. 2 is a schematic diagram illustrating a readout method of a pixel array in a conventional image sensor, wherein a four-in-one readout method is used.
FIG. 3 is a schematic diagram showing a readout method of a pixel array in a common microlens partially-shielded phase-focusing image sensor according to the present invention, wherein a four-in-one readout method is used.
FIG. 3a shows three formation modes of the focusing pixel unit in the common microlens partially-shielded phase focusing image sensor of the present invention.
Fig. 4 is a schematic cross-sectional structure diagram of a pixel array in a common microlens partially-shielded phase-focusing image sensor according to the present invention, wherein fig. 4 is a schematic cross-sectional structure diagram taken along direction AA in fig. 3.
Fig. 5 to 9, 13, 14 and 17 are schematic diagrams illustrating the arrangement of focusing pixel units in the repeating units of the pixel array of the image sensor for partially shielding phase focusing by the common microlens according to the present invention.
FIG. 10 is a schematic diagram of a pixel-unit merging layout of the pixel array of the shared-microlens partially-shielded phase-focusing image sensor according to the present invention.
Fig. 11 is a diagram showing a merging result obtained in the merging manner of fig. 10.
Fig. 12 is a schematic diagram showing another combination result obtained in the combination manner of fig. 10.
Fig. 15 is a merged layout of fig. 14.
FIG. 16 is a graph showing the combined results of FIG. 15.
Fig. 18 is a diagram of a merged layout of fig. 17.
FIG. 19 is a graph showing the combined results of FIG. 18.
Description of the element reference numerals
10, 20: pixel array
101: pixel unit group
102: Bayer RGB array
103: blue pixel unit
104: first green pixel unit
105: second green pixel unit
106: red pixel unit
11, 21: pixel unit
111: focusing pixel unit
112: non-focusing pixel unit
12, 22: pixel
121: filter
122: microlens
123: photoelectric conversion element
13: shielded pixel
131: light-shielding material layer
14: repeating unit
Detailed Description
The following embodiments of the present invention are provided by way of specific examples, and other advantages and effects of the present invention will be readily apparent to those skilled in the art from the disclosure herein. The invention is capable of other and different embodiments and of being practiced or of being carried out in various ways, and its several details are capable of modification in various respects, all without departing from the spirit and scope of the present invention.
Please refer to figs. 1 to 19. It should be noted that the drawings provided for this embodiment only illustrate the basic idea of the invention in a schematic way; they show only the components related to the invention rather than the number, shape, and size of the components in an actual implementation. The type, quantity, and proportion of each component in an actual implementation may be changed as needed, and the component layout may be more complicated.
As described in the background, a shared microlens is currently used to provide a full-pixel phase-detection autofocus function for an image sensor. As shown in fig. 1, the four 2 × 2 pixels 22 of each pixel unit 21 in the pixel array 20 of such a shared-microlens full-pixel focusing image sensor share a color filter and a microlens. If the signals of the two pixels 22 on the left side L and the signals of the two pixels 22 on the right side R are read out separately, phase information can be acquired, from which the lens displacement required for sharp focus is calculated. However, in video mode, as shown in fig. 2, the four pixels 22 are generally binned into one pixel; if the phase-detection autofocus mode is not used, they are read out only once, whereas using phase-detection autofocus requires reading out the phase focusing information and the image information separately, as shown in fig. 1. This readout mode is not flexible, and the two readouts result in a lower video frame rate.
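The left/right readout principle described above can be summarized in a short sketch. The code below is illustrative only and is not taken from the patent: it assumes that the left-pair and right-pair sums of a row of focusing pixel units have already been collected into two 1-D arrays, and it estimates their relative shift with a simple normalized cross-correlation search; the defocus, and hence the required lens displacement, is proportional to this shift.

```python
import numpy as np

def phase_disparity(left_signal, right_signal, max_shift=8):
    """Estimate the shift (in pixel-unit steps) between the left-half and
    right-half images produced by shared-microlens focusing pixels.
    left_signal / right_signal: 1-D arrays in which each entry is the sum
    of the two left (or two right) pixels of one pixel unit along a row.
    Returns the shift that maximises the normalised correlation; driving
    the lens until this shift approaches zero yields a sharp focus."""
    best_shift, best_score = 0, -np.inf
    n = len(left_signal)
    for s in range(-max_shift, max_shift + 1):
        l = left_signal[max(0, s):n + min(0, s)]
        r = right_signal[max(0, -s):n + min(0, -s)]
        l = l - l.mean()
        r = r - r.mean()
        score = float(np.dot(l, r)) / (np.linalg.norm(l) * np.linalg.norm(r) + 1e-12)
        if score > best_score:
            best_shift, best_score = s, score
    return best_shift

# Example: a defocused edge appears offset by two units between the halves.
left = np.array([0, 0, 1, 5, 9, 9, 9, 9, 9, 9, 9, 9], dtype=float)
right = np.array([0, 0, 0, 0, 1, 5, 9, 9, 9, 9, 9, 9], dtype=float)
print(phase_disparity(left, right))   # prints -2 (right half lags the left by two units)
```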
In view of the above problems, the present embodiment provides an image sensor with a shared microlens and partially shielded phase focusing, which improves the flexibility of the pixel-unit readout mode by sharing a microlens and partially shielding the pixels in the focusing pixel units; in addition, the phase focusing information and the image information can be obtained simultaneously with a single readout, which effectively improves the frame rate of the image sensor in video mode. As shown in figs. 3 and 4, the image sensor includes:
a pixel array 10 composed of a plurality of pixel unit groups 101, each pixel unit group 101 including four pixel units 11 arranged in a 2 × 2 pattern, each pixel unit 11 including a plurality of pixels 12 arranged in an n × n pattern, with all the pixels 12 of each pixel unit 11 corresponding to filters 121 of the same color and sharing one microlens 122 (as shown in fig. 4); all the pixels 12 of a pixel unit 11 may share a single filter of the same color;
the pixel units 11 include focusing pixel units 111 and non-focusing pixel units 112; a focusing pixel unit 111 is formed by setting part of its pixels 12 as shielded pixels 13; the focusing pixel units 111 are arranged in pairs and placed in pixel units 11 of the same color; and the focusing pixel units 111 are used to acquire phase focusing information.
It should be noted that, for ease of understanding, fig. 3 shows only a pixel array 10 formed by four pixel unit groups 101 arranged in a 2 × 2 pattern; those skilled in the art will appreciate that the size of the pixel array 10 may be set according to the actual situation, so the pixel array of the present application is not limited to the size shown in fig. 3. Also for ease of understanding, each pixel unit shown in fig. 3 consists of four pixels 12 arranged in a 2 × 2 pattern, and only one focusing pixel unit 111 is shown, formed by setting the two pixels 12 on the right side as shielded pixels 13. In practice a pixel unit may contain more pixels, for example 3 × 3, 4 × 4, 5 × 5, ..., n × n, and the focusing pixel units 111 occur in pairs, one acquiring the pixel signals on one side and the other acquiring the pixel signals on the other side.
The four pixels of each pixel unit share one filter and one microlens and are used in a four-in-one mode. At least one pair of pixel units is configured as focusing pixel units, and the rest are non-focusing (normal) pixel units. During circuit readout, the non-focusing pixel units directly read out luminance information in four-in-one mode, while the focusing pixel units, being half shielded, read out only half of their information; the information of the focusing pixel units across the whole pixel array is combined to obtain the image phase information, thereby achieving autofocus. In other words, phase information and luminance information are obtained simultaneously with a single circuit readout, so the frame rate is not halved, and the video frame rate of the image sensor when using phase-detection autofocus during video is effectively improved.
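For concreteness, the following sketch shows how a single pass over a 2 × 2-binned array can yield both a luminance image and a set of phase samples when some units are half shielded. It is illustrative only; the array names, the focus-map encoding, and the simple 2× gain compensation for shielded units are assumptions, not details from the patent.

```python
import numpy as np

def one_pass_readout(pixels, focus_map):
    """Single read of a (2H, 2W) pixel array built from 2x2 pixel units.

    pixels    : (2H, 2W) raw pixel values (each 2x2 block is one pixel unit).
    focus_map : (H, W) array, 0 = normal unit, +1 = unit whose left half is
                shielded (right pixels respond), -1 = right half shielded.
    Returns (luma, phase_samples): the binned luminance image plus a list of
    (row, col, side, value) tuples that are later combined in pairs across
    the array to form the phase information.
    """
    H, W = focus_map.shape
    luma = np.zeros((H, W), dtype=np.float32)
    phase_samples = []
    for r in range(H):
        for c in range(W):
            unit = pixels[2*r:2*r+2, 2*c:2*c+2]
            if focus_map[r, c] == 0:
                luma[r, c] = unit.sum()            # four-in-one binning
            elif focus_map[r, c] > 0:              # left half shielded
                val = unit[:, 1].sum()             # only the right column responds
                luma[r, c] = 2 * val               # crude gain compensation (assumption)
                phase_samples.append((r, c, 'R', val))
            else:                                  # right half shielded
                val = unit[:, 0].sum()
                luma[r, c] = 2 * val
                phase_samples.append((r, c, 'L', val))
    return luma, phase_samples
```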
As shown in figs. 3 and 4, the shielded pixel 13 may be formed by forming a light-shielding material layer 131 over the photoelectric conversion element 123 in the shielded pixel 13; the photoelectric conversion element 123 is generally a photodiode, and the light-shielding material layer 131 is generally metal tungsten. The light-shielding material layer 131 blocks light from reaching the underlying photoelectric conversion element 123, so that light only enters the photoelectric conversion elements 123 not covered by the light-shielding material layer 131, and the pixel unit as a whole reads out only the information of the two unshielded pixels. This method does not require changing the structure of the photoelectric conversion element 123, reduces process complexity, and is easy to implement.
As an example, as shown in fig. 3, the 2 × 2 pixels 12 in the focusing pixel unit 111 may be divided into a left part and a right part, with the two left pixels 12 and the two right pixels 12 respectively set as the shielded pixels 13 to form focusing pixel units 111 arranged in pairs. Alternatively, as shown in part a of fig. 3a, the 2 × 2 pixels 12 are divided into an upper part and a lower part, with the two upper pixels 12 and the two lower pixels 12 respectively set as the shielded pixels 13 to form focusing pixel units 111 arranged in pairs. Alternatively, as shown in part b of fig. 3a, the 2 × 2 pixels 12 are divided into two parts on either side of a diagonal, with one complete pixel 12 and two half pixels 12 on each side of the diagonal respectively set as the shielded pixels 13 to form focusing pixel units 111 arranged in pairs. Alternatively, as shown in part c of fig. 3a, the 2 × 2 pixels 12 are divided into two parts on either side of a dividing line passing through the center of the unit, with the pixels 12 on the two sides of the dividing line respectively set as the shielded pixels 13 to form focusing pixel units 111 arranged in pairs.
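The four division modes can be pictured as complementary shielding masks over one pixel unit. The sketch below models each unit on a finer sub-grid so that the diagonal and center-line splits can also be drawn; the mask shapes, the helper name, and the sub-grid resolution are illustrative assumptions rather than the patent's layout data.

```python
import numpy as np

def shield_masks(mode, sub=4):
    """Return the pair of complementary shielding masks for one focusing
    pixel unit, modelled on a (2*sub) x (2*sub) grid per 2x2 unit so that
    diagonal and arbitrary split lines can be represented (a simplification
    of the modes sketched in Fig. 3 / Fig. 3a).  True = area covered by the
    light-shielding layer; the paired units use opposite halves."""
    n = 2 * sub
    yy, xx = np.mgrid[0:n, 0:n]
    if mode == "left_right":
        a = xx < n // 2
    elif mode == "up_down":
        a = yy < n // 2
    elif mode == "diagonal":
        a = xx + yy < n                      # one side of the main diagonal
    else:                                     # generic split line through the centre
        a = (xx - n / 2 + 0.5) + 0.3 * (yy - n / 2 + 0.5) < 0
    return a, ~a
```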
As an example, the four pixel units 11 in each pixel unit group 101 constitute a Bayer base array, and the focusing pixel units 111 arranged in pairs are respectively placed in different Bayer base arrays. The Bayer base array may use an RGGB arrangement, i.e., a Bayer RGB array, or some filters may be replaced with other colors, for example RYYB; some pixels may also have no filter and transmit the entire spectrum, or may use an infrared-enhancing or infrared-cut filter.
As shown in figs. 3 and 5, the four pixel units 11 in each pixel unit group 101 form a Bayer RGB array 102, which in turn includes a blue pixel unit 103, a first green pixel unit 104, a second green pixel unit 105, and a red pixel unit 106, and the focusing pixel units 111 arranged in pairs are respectively placed in different Bayer RGB arrays 102. As a preferred example, as shown in fig. 5, the paired focusing pixel units 111 are formed by replacing the blue filters of the blue pixel units 103 in two Bayer RGB arrays 102 with auxiliary filters and then setting two of their pixels as the shielded pixels. That is, the focusing pixel unit 111 is placed on the blue channel of the Bayer RGB array, and that blue channel is converted into an auxiliary channel. The auxiliary filter may be a green, cyan, or white filter; in this embodiment a green filter is preferred, since it effectively reduces the impact on image quality and improves the sensitivity of the focusing pixels.
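As a quick illustration of this color-plane change, the sketch below builds the color label of every pixel unit in an 8 × 8 repeating unit and swaps the blue label for the auxiliary (green) label at the focusing positions; the function name and the 1-indexed position convention are assumptions made for illustration.

```python
import numpy as np

def bayer_unit_colors(focus_positions, n_rows=8, n_cols=8, aux="G"):
    """Colour label of every pixel unit in one repeating unit (sketch).
    The base pattern is the Bayer RGB arrangement of pixel units
    (B, Gr / Gb, R per 2x2 group); at each focusing position, which the
    patent places on a blue unit, the blue filter is replaced by the
    auxiliary filter (green by default).  Positions are 1-indexed (row, col)."""
    base = np.array([["B", "Gr"], ["Gb", "R"]])
    colors = np.tile(base, (n_rows // 2, n_cols // 2)).astype(object)
    for r, c in focus_positions:
        assert colors[r - 1, c - 1] == "B", "focusing units sit on blue positions"
        colors[r - 1, c - 1] = aux
    return colors

# e.g. bayer_unit_colors([(3, 3), (7, 7)]) roughly corresponds to the fig. 5 placement.
```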
As shown in figs. 5 to 9, 13, 14, and 17, as an example, the pixel array 10 may be divided into a plurality of repeating units 14. All the repeating units 14 have the same structure and are tiled in an n × m pattern to form the pixel array 10, where n and m may be equal or different and are both greater than 1. In the repeating units 14 shown in figs. 5 to 9, 13, 14, and 17, each repeating unit 14 contains sixty-four pixel units 11 arranged in an 8 × 8 pattern. The number of pairs of focusing pixel units 111 in each repeating unit 14 can be set according to actual needs.
The following examples are described for the case in which the pixel unit group composed of four pixel units 11 arranged in a 2 × 2 pattern is the Bayer RGB array 102, and the focusing pixel unit 111 is placed on the blue channel of the Bayer RGB array 102 while that blue channel is converted into a green channel.
As shown in figs. 5 to 7, as an example, each repeating unit 14 used to divide the pixel array 10 contains sixty-four pixel units 11 arranged in an 8 × 8 pattern, which is equivalent to the repeating unit 14 containing sixteen Bayer RGB arrays 102 arranged in a 4 × 4 pattern. The pixel units 11 include focusing pixel units 111 and non-focusing pixel units 112, and each repeating unit 14 is provided with one pair of focusing pixel units 111.
Preferably, the two focusing pixel units 111 in each repeating unit 14 are in the same column and spaced three rows apart within the repeating unit 14 (as shown in fig. 6), or in the same row and spaced three columns apart (as shown in fig. 7), or spaced three rows and three columns apart (as shown in fig. 5). This improves the distribution uniformity of the focusing pixel units 111 across the whole pixel array 10 and thereby improves the focusing accuracy.
In this embodiment, as shown in fig. 6, when the two focusing pixel units 111 in each repeating unit 14 are in the same column and spaced three rows apart, one focusing pixel unit 111 (L in the figure) is preferably in the third row R3 and third column L3 of the repeating unit 14, and the other (R in the figure) is in the seventh row R7 and third column L3; the positions of L and R may of course be exchanged. As shown in fig. 7, when the two focusing pixel units 111 are in the same row and spaced three columns apart, one focusing pixel unit 111 (L) is in the third row R3 and third column L3, and the other (R) is in the third row R3 and seventh column L7; again, the positions of L and R may be exchanged. As shown in fig. 5, when the two focusing pixel units 111 are spaced three rows and three columns apart, one focusing pixel unit 111 (L) is in the third row R3 and third column L3, and the other (R) is in the seventh row R7 and seventh column L7; the positions of L and R may likewise be exchanged.
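These three placements can be captured in a small table and tiled over the sensor. The following sketch only records where the paired focusing units sit inside each 8 × 8 repeating unit; the dictionary keys and the 'L'/'R' side labels follow the figure annotations, while the function and variable names are illustrative assumptions.

```python
import numpy as np

# Positions of the paired focusing pixel units inside one 8x8 repeating unit,
# 1-indexed (row, column) as in figs. 5-7; 'L' and 'R' denote the two
# complementary shielding directions (L and R may be exchanged).
LAYOUTS = {
    "fig5": {(3, 3): "L", (7, 7): "R"},   # three rows and three columns apart
    "fig6": {(3, 3): "L", (7, 3): "R"},   # same column, three rows apart
    "fig7": {(3, 3): "L", (3, 7): "R"},   # same row, three columns apart
}

def build_focus_map(n_rep_rows, n_rep_cols, layout="fig5"):
    """Tile the chosen repeating unit to build a focus map for the whole
    pixel array: '' = normal unit, 'L'/'R' = focusing unit (sketch only)."""
    fmap = np.full((8 * n_rep_rows, 8 * n_rep_cols), "", dtype=object)
    for br in range(n_rep_rows):
        for bc in range(n_rep_cols):
            for (r, c), side in LAYOUTS[layout].items():
                fmap[8 * br + r - 1, 8 * bc + c - 1] = side
    return fmap
```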
As shown in figs. 8, 9, 13, 14, and 17, as another example, each repeating unit 14 used to divide the pixel array 10 contains sixty-four pixel units 11 arranged in an 8 × 8 pattern, which is equivalent to the repeating unit 14 containing sixteen Bayer RGB arrays 102 arranged in a 4 × 4 pattern. The pixel units 11 include focusing pixel units 111 and non-focusing pixel units 112, and two pairs of focusing pixel units 111 are provided in each repeating unit 14.
Preferably, every two of the four focusing pixel units 111 in each repeating unit 14 are in the same column and spaced three rows apart within the repeating unit 14, with three columns between the two columns in which the four focusing pixel units 111 are located (as shown in figs. 8, 9, 13, and 14); or the four focusing pixel units 111 are spaced apart along the row direction, with every two of them in the same row and three rows between the two rows in which the four focusing pixel units 111 are located (as shown in fig. 17).
As pixel sizes continue to shrink, the total amount of light absorbed within a pixel decreases and some advanced features become harder to support. The output resolution of video is generally lower than that of a still image, and one way to increase the amount of light collected for each output point is to sum the signals from adjacent or nearby pixels that share the same color filter, a technique known as pixel binning, which improves sensitivity when capturing images under low illumination.
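A minimal sketch of generic 2 × 2 binning is given below; it is not the patent's specific readout circuit, it simply sums each 2 × 2 same-color block into one output value.

```python
import numpy as np

def bin_2x2(unit_values):
    """Four-in-one binning: sum the 2x2 same-colour pixels of every pixel
    unit into a single output value.

    unit_values: (2H, 2W) array whose 2x2 blocks share one colour filter.
    Returns an (H, W) binned image with roughly 4x the collected signal."""
    H2, W2 = unit_values.shape
    return unit_values.reshape(H2 // 2, 2, W2 // 2, 2).sum(axis=(1, 3))
```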
A typical pixel control method for obtaining pixel information is as follows: first, the pixel values output by the pixel array 10 are read; as described above, the pixel array 10 includes a plurality of repeating units 14, each repeating unit 14 includes sixteen Bayer RGB arrays 102 arranged in a 4 × 4 pattern, and each Bayer RGB array 102 in turn includes a blue pixel unit 103, a first green pixel unit 104, a second green pixel unit 105, and a red pixel unit 106 (as shown in fig. 5); a pixel binning instruction is then received; and finally a new pixel array value is output according to the newly formed repeating unit.
In one embodiment, the pixel array includes a plurality of the repeating units and each repeating unit includes a pair of focusing pixel units; or each repeating unit includes two pairs of focusing pixel units arranged so that every two of the four focusing pixel units are in the same row with three rows between the two rows in which the four units are located, or every two of them are in the same column with three columns between the two columns in which they are located. In these cases the control method includes the following steps:
defining the repeating unit as being composed of four repeating subunits arranged in an array, each repeating subunit consisting of Bayer base arrays arranged in a 2 × 2 pattern, and defining the position of the pair of focusing pixel units corresponding to each repeating subunit as a reference position; in each repeating subunit, the pixel units at corresponding positions of the different Bayer base arrays are read out in a merged manner, while for the reference position the information of the pixel unit at that position is read out directly, so that the phase focusing information and the image information are obtained at the same time.
Merging in this way preserves both the focusing information and the image information. Another pixel-binning method for obtaining pixel information is as follows: each repeating unit 14 is divided into four identical repeating subunits A, B, C, and D, i.e., each repeating subunit consists of four Bayer RGB arrays 102 in a 2 × 2 pattern, and the position of the pair of focusing pixel units 111 corresponding to each repeating subunit is defined as a reference position. In each repeating subunit, the pixel units at corresponding positions of the different Bayer arrays are read out in a merged manner, while for the reference position the information of the pixel unit at that position is read out directly, so that the phase focusing information and the image information are obtained at the same time.
For example, as shown in fig. 7, the focusing pixel unit 111 at position L3R3 in repeating subunit A and the focusing pixel unit 111 at position L7R3 in repeating subunit B of the repeating unit 14 are selected as a pair of focusing pixel units, and the corresponding positions in each repeating subunit are the reference positions (L3R3, L7R3, L3R7, and L7R7). During merging, the merged readout of repeating subunit A is: directly read out pixel unit L3R3 at the reference position (discarding the information of pixel units L1R1, L3R1, and L1R3); merge and read out pixel units L2R1, L4R1, L2R3, and L4R3; merge and read out pixel units L1R2, L3R2, L1R4, and L3R4; and merge and read out pixel units L2R2, L4R2, L2R4, and L4R4. Similarly, the merged readout of repeating subunit B is: directly read out pixel unit L7R3 at the reference position (discarding the information of pixel units L5R1, L7R1, and L5R3); merge and read out pixel units L6R1, L8R1, L6R3, and L8R3; merge and read out pixel units L5R2, L7R2, L5R4, and L7R4; and merge and read out pixel units L6R2, L8R2, L6R4, and L8R4. The merged readout of repeating subunit C is: directly read out pixel unit L3R7 (discarding the information of pixel units L1R5, L3R5, and L1R7); merge and read out pixel units L2R5, L4R5, L2R7, and L4R7; merge and read out pixel units L1R6, L3R6, L1R8, and L3R8; and merge and read out pixel units L2R6, L4R6, L2R8, and L4R8. The merged readout of repeating subunit D is: directly read out pixel unit L7R7 (discarding the information of pixel units L5R5, L7R5, and L5R7); merge and read out pixel units L6R5, L8R5, L6R7, and L8R7; merge and read out pixel units L5R6, L7R6, L5R8, and L7R8; and merge and read out pixel units L6R6, L8R6, L6R8, and L8R8.
In another example, as shown in fig. 17, the focusing pixel unit 111 at position L3R3 in repeating subunit A and the focusing pixel unit 111 at position L7R3 in repeating subunit B of the repeating unit 14 are selected as a pair of focusing pixel units, and the corresponding positions in each repeating subunit are the reference positions (L3R3, L7R3, L3R7, and L7R7). During merging, the merged readout of repeating subunit A is: directly read out pixel unit L3R3 at the reference position (discarding the information of pixel units L1R1, L3R1, and L1R3); merge and read out pixel units L2R1, L4R1, L2R3, and L4R3; merge and read out pixel units L1R2, L3R2, L1R4, and L3R4; and merge and read out pixel units L2R2, L4R2, L2R4, and L4R4. Similarly, the merged readout of repeating subunit B is: directly read out pixel unit L7R3 at the reference position (discarding the information of pixel units L5R1, L7R1, and L5R3); merge and read out pixel units L6R1, L8R1, L6R3, and L8R3; merge and read out pixel units L5R2, L7R2, L5R4, and L7R4; and merge and read out pixel units L6R2, L8R2, L6R4, and L8R4. The merged readout of repeating subunit C is: directly read out pixel unit L3R7 (discarding the information of pixel units L1R5, L3R5, and L1R7); merge and read out pixel units L2R5, L4R5, L2R7, and L4R7; merge and read out pixel units L1R6, L3R6, L1R8, and L3R8; and merge and read out pixel units L2R6, L4R6, L2R8, and L4R8. The merged readout of repeating subunit D is: directly read out pixel unit L7R7 (discarding the information of pixel units L5R5, L7R5, and L5R7); merge and read out pixel units L6R5, L8R5, L6R7, and L8R7; merge and read out pixel units L5R6, L7R6, L5R8, and L7R8; and merge and read out pixel units L6R6, L8R6, L6R8, and L8R8.
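The per-subunit rule used in both examples (merge the four same-color units at each Bayer position, except at the reference position, which is read out directly) can be written compactly. The sketch below is an illustration under the assumption that each pixel unit has already been binned to a single value; indexing is 0-based, so the patent's L3R3 corresponds to (2, 2), and the function name is an assumption.

```python
import numpy as np

def merged_readout_subunit(units, ref_pos):
    """Combine one 4x4-pixel-unit repeating subunit into a 2x2 Bayer output.

    units   : (4, 4) array of per-unit values (each already binned from one
              2x2 pixel unit), indexed (row, column).
    ref_pos : (row, col) of the focusing pixel unit inside the subunit,
              0-indexed, e.g. (2, 2) for the patent's L3R3 position.
    At each of the four Bayer positions the four same-colour units are
    summed, except at the Bayer position holding the focusing unit, where
    that unit is read out directly and its three same-colour neighbours are
    discarded, so phase and image information come from a single pass."""
    out = np.zeros((2, 2), dtype=np.float32)
    ref_parity = (ref_pos[0] % 2, ref_pos[1] % 2)
    for pr in range(2):            # Bayer position parity along rows
        for pc in range(2):        # Bayer position parity along columns
            if (pr, pc) == ref_parity:
                out[pr, pc] = units[ref_pos]             # direct read, keeps phase info
            else:
                out[pr, pc] = units[pr::2, pc::2].sum()  # four-in-one merge
    return out
```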
The merging mode can be flexibly selected based on different layout modes of the pixel arrays in the image sensor.
For example, when the pixel array 10 is formed using the repeating units shown in figs. 8, 9, and 13 of this embodiment, one pixel-binning method for obtaining pixel information is as follows, taking the arrangement of fig. 9 as an example. As shown in fig. 10, each repeating unit 14 is divided into four identical repeating subunits A, B, C, and D, i.e., each repeating subunit consists of four Bayer RGB arrays 102 in a 2 × 2 pattern; then, within each repeating subunit, the pixel units of the corresponding colors in the Bayer RGB arrays 102 are merged into one pixel unit and arranged in the form of a Bayer RGB array. For example, the three blue pixel units 103 in repeating subunit A of fig. 10 are merged into one blue pixel unit in block A' of fig. 11, the four first green pixel units 104 in repeating subunit A are merged into one green pixel unit at the corresponding position in block A', the four second green pixel units 105 are merged into one green pixel unit at the corresponding position in block A', and the four red pixel units 106 are merged into one red pixel unit in block A', so that block A' of fig. 11 becomes one new Bayer RGB array. Likewise, repeating subunit B of fig. 10 is merged into the new Bayer RGB array of block B' of fig. 11, repeating subunit C into block C', and repeating subunit D into block D'. Finally, a merged image is output according to the new pixel array values composed of the newly formed repeating units of fig. 11.
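The fig. 10 to fig. 11 merge can likewise be sketched as a per-subunit sum over color labels. In the sketch below, whose names and labels are illustrative assumptions, a focusing unit whose blue filter was replaced by the auxiliary filter is labeled 'F' and excluded, which is why only three blue units contribute to the merged blue value.

```python
import numpy as np

def merge_subunit_to_bayer(colors, values):
    """Merge one 4x4 repeating subunit into a new 2x2 Bayer array (sketch):
    all pixel units carrying the same colour label are summed into a single
    unit of the output.  Units labelled 'F' (focusing units with the
    auxiliary filter) are excluded from the colour sums.

    colors : (4, 4) array of labels 'B', 'Gr', 'Gb', 'R' or 'F'.
    values : (4, 4) array of per-unit binned values."""
    bayer_pos = {"B": (0, 0), "Gr": (0, 1), "Gb": (1, 0), "R": (1, 1)}
    out = np.zeros((2, 2), dtype=np.float32)
    for label, (r, c) in bayer_pos.items():
        out[r, c] = values[colors == label].sum()
    return out
```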
another method of obtaining pixel information for pixel binning is: still taking the arrangement of fig. 9 as an example for explanation, as shown in fig. 10, each of the repeating units 14 is divided into A, B, C, D with four identical repeating sub-units, that is, each repeating sub-unit is divided into four bayer RGB arrays 102 in a 2 × 2 format, the bayer RGB arrays 102 where the focusing pixel units in each repeating sub-unit are located are extracted and then array-combined, and the obtained array is as shown in fig. 12, and the pixel combination mode is adopted to facilitate reading and combining.
When the pixel array 10 is formed using the repeating unit shown in fig. 14 of this embodiment, one pixel-binning method for obtaining pixel information is as follows. As shown in fig. 10, each repeating unit 14 is divided into four identical repeating subunits A, B, C, and D, i.e., each repeating subunit consists of four Bayer RGB arrays 102 in a 2 × 2 pattern; as shown in fig. 15, a pair of the focusing pixel units in the repeating unit 14 is selected, the two Bayer RGB arrays 102 of the two repeating subunits containing these two focusing pixel units are extracted, and the two Bayer RGB arrays 102 at the corresponding positions in the other two repeating subunits are also extracted and combined into an array. For example, as shown in fig. 15, the Bayer RGB arrays 102 containing the focusing pixel units in repeating subunits A and C of fig. 14 are extracted, and the Bayer RGB arrays 102 in repeating subunits B and D at the same positions as those extracted from subunits A and C are extracted and combined into an array (as shown in fig. 16). Merging in this way preserves both the focusing information and the image information.
When the pixel array 10 is formed using the repeating units shown in figs. 5 to 7 and 17 of this embodiment, one pixel-binning method for obtaining pixel information is as follows. As shown in fig. 10, each repeating unit 14 is divided into four identical repeating subunits A, B, C, and D, i.e., each repeating subunit consists of four Bayer RGB arrays 102 in a 2 × 2 pattern; as shown in fig. 18, a pair of the focusing pixel units in the repeating unit 14 is selected, the two Bayer RGB arrays 102 of the two repeating subunits containing these two focusing pixel units are extracted, and the two Bayer RGB arrays 102 at the corresponding positions in the other two repeating subunits are also extracted and combined into an array. For example, as shown in fig. 18, the Bayer RGB arrays 102 containing the focusing pixel units in repeating subunits A and B of fig. 17 are extracted, and the Bayer RGB arrays 102 in repeating subunits C and D at the same positions as those extracted from subunits A and B are extracted and combined into an array (as shown in fig. 19).
In summary, the present invention provides an image sensor with a shared microlens and partially shielded phase autofocus. The four pixels of each pixel unit share one filter and one microlens and are used in a four-in-one mode; at least one pair of pixel units is configured as focusing pixel units and the rest are non-focusing (normal) pixel units. During circuit readout, the non-focusing pixel units directly read out luminance information in four-in-one mode, while the focusing pixel units, being half shielded, read out only half of their information; the information of the focusing pixel units across the whole pixel array is combined to obtain the image phase information, thereby achieving autofocus. Phase information and luminance information are thus obtained with a single circuit readout, so the frame rate is not halved, and the video frame rate of the image sensor when phase-detection autofocus is used during video is effectively improved. The invention therefore effectively overcomes various shortcomings of the prior art and has high industrial value.
The foregoing embodiments merely illustrate the principles and effects of the present invention and are not intended to limit it. Anyone skilled in the art may modify or change the above embodiments without departing from the spirit and scope of the invention. Accordingly, all equivalent modifications or changes made by those of ordinary skill in the art without departing from the spirit and technical ideas disclosed by the invention shall be covered by the claims of the invention.

Claims (13)

1. An image sensor with shared microlens partially blocking phase focusing, the image sensor comprising:
the pixel array is composed of a plurality of pixel unit groups, each pixel unit group comprises four pixel units arranged in a 2 x 2 mode, each pixel unit comprises a plurality of pixels arranged in an n x n mode, n is larger than or equal to 2, and all the pixels of each pixel unit correspond to optical filters with the same color and share one microlens;
the pixel units comprise focusing pixel units and non-focusing pixel units; a focusing pixel unit is formed by setting part of its pixels as shielded pixels; the focusing pixel units are arranged in pairs and are arranged in pixel units of the same color; and the focusing pixel units are used to acquire phase focusing information.
2. The common microlens partially-occluded phase-focused image sensor of claim 1, wherein: the shielded pixel is formed by forming a light-shielding material layer on a photoelectric conversion element in the shielded pixel.
3. The common microlens partially-occluded phase-focused image sensor of claim 1, wherein: the pixels in the focusing pixel unit are divided into an upper part and a lower part, and the upper-side pixels and the lower-side pixels are respectively set as the shielded pixels to form focusing pixel units arranged in pairs; or the pixels in the focusing pixel unit are divided into a left part and a right part, and the left-side pixels and the right-side pixels are respectively set as the shielded pixels to form focusing pixel units arranged in pairs; or the pixels in the focusing pixel unit are divided into two parts on either side of a diagonal, and the pixels on the two sides of the diagonal are respectively set as the shielded pixels to form focusing pixel units arranged in pairs; or the pixels in the focusing pixel unit are divided into two parts on either side of a dividing line passing through the center of the unit, and the pixels on the two sides of the dividing line are respectively set as the shielded pixels to form focusing pixel units arranged in pairs.
4. The common microlens partially-occluded phase-focused image sensor of claim 1, wherein: the four pixel units in each pixel unit group form a Bayer base array, and the focusing pixel units arranged in pairs are respectively arranged in different Bayer base arrays.
5. The common microlens partially-occluded phase-focused image sensor as in claim 4, wherein: the Bayer base array comprises a Bayer RGB array, the Bayer RGB array comprises a blue pixel unit, a first green pixel unit, a second green pixel unit and a red pixel unit, the focusing pixel units arranged in pairs are formed by respectively replacing blue filters in two blue pixel units in the two Bayer RGB arrays with auxiliary filters and setting part of pixels in the two Bayer RGB arrays as the shielding pixels.
6. The common microlens partially-occluded phase-focused image sensor as claimed in claim 5, wherein: the auxiliary filter includes any one of a green filter, a cyan filter, and a white filter.
7. The common microlens partially-occluded phase-focused image sensor as in claim 4, wherein: the pixel array includes a plurality of repeating units of the bayer-based array arranged in a 4 × 4 pattern; each repeating unit comprises a pair of focusing pixel units.
8. The common microlens partially-occluded phase-focused image sensor as claimed in claim 7, wherein: the two focusing pixel units in each repeating unit are in the same column and spaced three rows apart within the repeating unit, or in the same row and spaced three columns apart within the repeating unit, or spaced three rows and three columns apart within the repeating unit.
9. The common microlens partially-occluded phase-focused image sensor as in claim 4, wherein: the pixel array includes a plurality of repeating units of the bayer-based array arranged in a 4 × 4 pattern; each repeating unit comprises two pairs of focusing pixel units.
10. The common microlens partially-occluded phase-focused image sensor as in claim 9, wherein: every two of the four focusing pixel units in each repeating unit are in the same column and spaced three rows apart within the repeating unit, with three columns between the two columns in which the four focusing pixel units are located; or the four focusing pixel units in each repeating unit are spaced apart along the row direction, with every two of them in the same row and three rows between the two rows in which the four focusing pixel units are located.
11. The common microlens partially-occluded phase-focused image sensor according to any of claims 1-10, wherein: each pixel unit corresponds to the same reading circuit, and reading of all the pixel information corresponding to the pixel units is achieved.
12. A method of controlling an image sensor according to any one of claims 1 to 11, characterized by: acquiring phase focusing information based on the focusing pixel unit and acquiring image information based on the non-focusing pixel unit; or acquiring phase focusing information based on the focusing pixel unit and the non-focusing pixel unit.
13. The method of controlling an image sensor according to claim 12, wherein: when the pixel array comprises a plurality of the repeating units, and each repeating unit comprises a pair of the focusing pixel units, or each repeating unit comprises two pairs of focusing pixel units arranged so that every two of the four focusing pixel units are in the same row with three rows between the two rows in which the four units are located, or every two of them are in the same column with three columns between the two columns in which they are located, the control method comprises:
defining the repeating unit as being composed of four repeating subunits arranged in an array, each repeating subunit consisting of Bayer base arrays arranged in a 2 × 2 pattern, and defining the position of the pair of focusing pixel units corresponding to each repeating subunit as a reference position; in each repeating subunit, the pixel units at corresponding positions of the different Bayer base arrays are read out in a merged manner, while for the reference position the information of the pixel unit at that position is read out directly, so as to obtain the phase focusing information and the image information at the same time.
CN202111188578.0A 2021-10-12 2021-10-12 Image sensor sharing microlens to partially shield phase focusing and control method thereof Pending CN115988299A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111188578.0A CN115988299A (en) 2021-10-12 2021-10-12 Image sensor sharing microlens to partially shield phase focusing and control method thereof

Publications (1)

Publication Number Publication Date
CN115988299A true CN115988299A (en) 2023-04-18

Family

ID=85968597

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination