CN110352489B - Autofocus system for CMOS imaging sensor - Google Patents

Autofocus system for CMOS imaging sensor

Info

Publication number
CN110352489B
CN110352489B (application CN201780087438.XA)
Authority
CN
China
Prior art keywords
autofocus
photodiode
array
pixel sensors
pixel
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201780087438.XA
Other languages
Chinese (zh)
Other versions
CN110352489A (en)
Inventor
B·穆
A·M·麦格纳尼
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
BAE Systems Imaging Solutions Inc
Original Assignee
BAE Systems Imaging Solutions Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by BAE Systems Imaging Solutions Inc filed Critical BAE Systems Imaging Solutions Inc
Publication of CN110352489A publication Critical patent/CN110352489A/en
Application granted granted Critical
Publication of CN110352489B publication Critical patent/CN110352489B/en

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/70SSIS architectures; Circuits associated therewith
    • H04N25/702SSIS architectures characterised by non-identical, non-equidistant or non-planar pixel layout
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B7/00Mountings, adjusting means, or light-tight connections, for optical elements
    • G02B7/28Systems for automatic generation of focusing signals
    • G02B7/36Systems for automatic generation of focusing signals using image sharpness techniques, e.g. image processing techniques for generating autofocus signals
    • G02B7/38Systems for automatic generation of focusing signals using image sharpness techniques, e.g. image processing techniques for generating autofocus signals measured at different points on the optical axis, e.g. focussing on two or more planes and comparing image data
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/67Focus control based on electronic image sensor signals
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/67Focus control based on electronic image sensor signals
    • H04N23/672Focus control based on electronic image sensor signals based on the phase difference signals
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/10Circuitry of solid-state image sensors [SSIS]; Control thereof for transforming different wavelengths into image signals
    • H04N25/11Arrangement of colour filter arrays [CFA]; Filter mosaics
    • H04N25/13Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements
    • H04N25/134Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements based on three different wavelength filter elements
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/70SSIS architectures; Circuits associated therewith
    • H04N25/703SSIS architectures incorporating pixels for producing signals other than image signals
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/70SSIS architectures; Circuits associated therewith
    • H04N25/703SSIS architectures incorporating pixels for producing signals other than image signals
    • H04N25/704Pixels specially adapted for focusing, e.g. phase difference pixel sets
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/70SSIS architectures; Circuits associated therewith
    • H04N25/76Addressed sensors, e.g. MOS or CMOS sensors
    • H04N25/77Pixel circuitry, e.g. memories, A/D converters, pixel amplifiers, shared circuits or shared components

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Spectroscopy & Molecular Physics (AREA)
  • Focusing (AREA)
  • Transforming Light Signals Into Electric Signals (AREA)
  • Automatic Focus Adjustment (AREA)
  • Solid State Image Pick-Up Elements (AREA)
  • Color Television Image Signal Generators (AREA)
  • Studio Devices (AREA)

Abstract

The invention includes an imaging device comprising a two-dimensional array of pixel sensors. Each pixel sensor includes a primary photodiode, an autofocus photodiode, and a microlens that focuses light onto both photodiodes. The imaging array includes first and second autofocus arrays of pixel sensors: in the first autofocus array, each autofocus photodiode is positioned so that it preferentially receives light from one half of the microlens in its pixel sensor; in the second autofocus array, each autofocus photodiode is positioned so that it preferentially receives light from the other half of the microlens in its pixel sensor. The autofocus photodiode may be the parasitic photodiode associated with the floating diffusion node in each pixel sensor, or a separate conventional photodiode.

Description

Autofocus system for CMOS imaging sensor
Technical Field
The invention relates to an autofocus system for a CMOS imaging sensor.
Background
Autofocus systems are widely used in still and motion picture cameras. Such systems reduce the expertise required of the user. In addition, in a motion picture camera, if the distance between the camera and the subject of interest changes rapidly, the time needed to refocus manually as the scene evolves becomes prohibitive.
In one prior art system, a computer controlling the lens searches for the focal position that maximizes the high spatial frequency content of the image. Because of out-of-focus blur, the spatial spectrum of an image of a scene containing sharp edges and other high-spatial-frequency features has less power in the high-frequency portion of the spectrum than the same image when in focus. These approaches therefore iteratively search for the focal distance that produces the image with the highest ratio of high spatial frequency energy to average spatial frequency energy. The time needed to perform this search poses a problem when the algorithm is applied to rapidly changing scenes captured by a motion picture camera.
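For illustration, a minimal Python sketch of such a contrast-based search follows. The focus metric (ratio of high-frequency to total spectral energy) matches the description above; the camera interface (`capture_at`, `set_focus`) and the frequency cutoff are hypothetical stand-ins, not taken from the patent.

```python
import numpy as np

def high_freq_ratio(img):
    """Focus metric: fraction of spectral power at high spatial frequencies."""
    spectrum = np.abs(np.fft.fftshift(np.fft.fft2(img))) ** 2
    h, w = spectrum.shape
    yy, xx = np.ogrid[:h, :w]
    r = np.hypot(yy - h / 2, xx - w / 2)
    cutoff = min(h, w) / 8          # illustrative boundary for "high" frequencies
    return spectrum[r > cutoff].sum() / spectrum.sum()

def contrast_autofocus(camera, positions):
    """Iterative search: capture at each candidate lens position, keep the sharpest."""
    best = max(positions, key=lambda p: high_freq_ratio(camera.capture_at(p)))
    camera.set_focus(best)
    return best
```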
A second class of prior art autofocus systems avoids this search time by measuring the phase difference between pixels that view the image through different portions of the camera lens. These schemes use either a dedicated sensing array separate from the imaging array that produces the picture, or particular pixel sensors within the imaging array, to sense the phase difference. In the latter case, the special autofocus pixels replace regular image-recording pixels; the image recorded by the array therefore includes "holes" at the locations corresponding to the autofocus pixels. These holes are filled by interpolating the surrounding pixels.
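As a rough illustration of the hole-filling step, the sketch below replaces each autofocus-pixel location with the average of its unmasked neighbors. Real cameras use more elaborate, color-aware interpolation; this is a minimal stand-in under that simplifying assumption.

```python
import numpy as np

def fill_af_holes(image, af_mask):
    """Fill autofocus-pixel 'holes' with the mean of their valid 4-neighbors."""
    out = image.astype(float).copy()
    h, w = image.shape
    for y, x in zip(*np.nonzero(af_mask)):
        neigh = [image[ny, nx]
                 for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1))
                 if 0 <= ny < h and 0 <= nx < w and not af_mask[ny, nx]]
        if neigh:
            out[y, x] = np.mean(neigh)
    return out
```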
Disclosure of Invention
The invention includes an imaging device comprising a two-dimensional array of pixel sensors. Each pixel sensor includes a primary photodiode, an autofocus photodiode, and a microlens that focuses light onto the primary photodiode and the autofocus photodiode. The array includes first and second autofocus arrays of pixel sensors: the pixel sensors in the first autofocus array have their autofocus photodiodes positioned such that each preferentially receives light from one half of the microlens in its pixel sensor, and the pixel sensors in the second autofocus array have their autofocus photodiodes positioned such that each preferentially receives light from the other half of the microlens in its pixel sensor.
In one aspect of the invention, the autofocus photodiode is a pinned photodiode, and the primary photodiode is also a pinned photodiode, the primary photodiode having an area larger than that of the autofocus photodiode.
In another aspect of the invention, the autofocus photodiode includes a parasitic photodiode associated with the floating diffusion node in each pixel sensor.
In another aspect of the invention, the pixel sensors in the first autofocus array have autofocus photodiodes positioned such that each autofocus photodiode receives more than 80% of its light from one half of the microlens in its pixel sensor, and the pixel sensors in the second autofocus array have autofocus photodiodes positioned such that each preferentially receives light from the other half of the microlens in its pixel sensor.
In another aspect of the invention, the pixel sensors in the first autofocus array have autofocus photodiodes positioned such that each autofocus photodiode receives more than 90% of its light from one half of the microlens in its pixel sensor, and the pixel sensors in the second autofocus array have autofocus photodiodes positioned such that each preferentially receives light from the other half of the microlens in its pixel sensor.
In another aspect of the invention, the apparatus includes a camera lens that images a scene to be photographed onto the two-dimensional array of pixel sensors, and an actuator that moves the camera lens relative to the two-dimensional imaging array in response to an autofocus signal from a controller. The controller is configured to: expose the pixel sensors to light from the scene during an autofocus period; obtain a signal from each pixel sensor in the first and second arrays indicative of the amount of light received during the autofocus period; and generate the autofocus signal such that a predetermined portion of the scene will be focused onto a predetermined area of the two-dimensional array of pixel sensors.
In another aspect of the invention, generating the autofocus signal includes calculating a cross-correlation function between the signal from the autofocus photodiodes in the first array and the signal from the autofocus photodiodes in the second array.
In another aspect of the invention, the main photodiodes of the pixel sensors in the two-dimensional array of pixel sensors are organized as a uniform array with equal spacing in each of the two dimensions, and wherein the autofocus photodiodes form a non-uniform array. In another aspect of the invention, the first array of autofocus pixel sensors is a mirror image of the second array of autofocus pixel sensors.
In another aspect of the invention, the controller generates a first image of the scene using a main photodiode in an imaging array comprising first and second arrays of pixel sensors.
In another aspect of the invention, the first autofocus array of pixel sensors includes a first linear array of pixel sensors and the second autofocus array of pixel sensors includes a second linear array of pixel sensors configured as a mirror image of the first linear array of pixel sensors.
In another aspect of the invention, the pixel sensors include a plurality of color filters of different colors, one of which is disposed below the microlens in each pixel sensor. The first autofocus array is characterized by a first number of color filters of each color included in the first autofocus array, and the second autofocus array by a second number of color filters of each color included in the second autofocus array, the first and second numbers being substantially equal.
In another aspect of the invention, the controller outputs a first image and a light intensity measurement determined from an autofocus photodiode in each pixel sensor.
Drawings
FIG. 1 illustrates a two-dimensional imaging array according to one embodiment of the present invention.
Fig. 2 is a schematic diagram of a typical prior art pixel sensor in a column of pixel sensors in an imaging array.
Fig. 3 shows a pixel sensor in which a parasitic photodiode is used for image measurement.
Fig. 4A-4C illustrate the manner in which the distance from the camera lens to the imaging array may be detected.
Fig. 5 is a top view of a portion of an embodiment of an imaging array that utilizes the pixels shown in fig. 3, as taught in U.S. patent application 14/591,873 filed January 7, 2015.
Fig. 6 is a cross-sectional view of pixel sensors 66 and 67 through line 6-6 shown in fig. 5.
FIG. 7 is a top view of a portion of an imaging array according to one embodiment of the invention.
Fig. 8 is a cross-sectional view through line 8-8 shown in fig. 7.
FIG. 9 shows an imaging array having multiple autofocus regions.
FIG. 10 is a schematic diagram of a pixel sensor having two photodiodes that may be used in a two photodiode autofocus embodiment.
FIG. 11 is a top view of a portion of an imaging array utilizing the pixel sensor design shown in FIG. 10 according to one embodiment of the present invention.
Fig. 12 is a cross-sectional view through line 12-12 shown in fig. 11.
Fig. 13-15 illustrate additional layouts of imaging arrays according to other embodiments of the present invention.
Detailed Description
The present invention is based on two observations. First, each pixel sensor in the imaging array includes a floating diffusion node that can be used for autofocus measurements without losing any pixels from the imaging array. Second, by changing the position of the floating diffusion node, the asymmetry required for phase-based autofocus measurements can be provided without masking light from the autofocus pixels.
To simplify the following discussion, a pixel sensor is defined as a circuit that converts light incident thereon into an electrical signal whose magnitude is determined by the amount of light incident on the circuit over a period of time (referred to as exposure). The pixel sensor has a gate that couples the electrical signal to a readout line in response to a signal on a row select line.
A rectangular imaging array is defined as a plurality of pixel sensors organized into a plurality of rows and columns of pixel sensors. The rectangular array comprises a plurality of readout lines and a plurality of row select lines, each pixel sensor being connected to one row select line and one readout line, the electrical signal generated by the pixel being connected to the readout line associated with the pixel in response to a signal on the row select line associated with the pixel sensor.
The manner in which the present invention provides its advantages may be more readily understood with reference to FIG. 1, which illustrates a two-dimensional imaging array in accordance with one embodiment of the present invention. The rectangular imaging array 80 includes pixel sensors 81. Each pixel sensor has a main photodiode 86 and a parasitic photodiode 91. The manner in which the pixel sensor operates will be discussed in more detail below. The reset and amplification circuits in each pixel are shown at 87. The pixel sensors are arranged in a plurality of rows and columns. Exemplary rows are shown at 94 and 95. Each pixel sensor in a column is connected to a readout line 83, which readout line 83 is shared by all pixel sensors in the column. A calibration source 96 is optionally included on each readout line. Each pixel sensor in a row is connected to a row select line 82, and the row select line 82 determines whether the pixel sensors in that row are connected to a corresponding readout line.
The operation of the rectangular imaging array 80 is controlled by a controller 92, which receives the addresses of the pixels to be read out. The controller 92 generates row select addresses that are used by the row decoder 85 to enable readout of the pixel sensors in the corresponding row of the rectangular imaging array 80. The column amplifiers are included in an array of column amplifiers 84 that implements the readout algorithm discussed in more detail below. All pixel sensors in a given row are read out in parallel; hence, there is one column amplifier and analog-to-digital converter (ADC) circuit per readout line 83. The column processing circuitry is discussed in more detail below.
When the rectangular imaging array 80 is reset and then exposed during an imaging exposure, each photodiode accumulates charge that is dependent on the photo-exposure and photo-conversion efficiency of that photodiode. When a row of pixel sensors associated with the photodiode is read out, the charge is converted to a voltage by a reset and amplification circuit 87 in the pixel sensor. This voltage is coupled to a corresponding readout line 83 and processed by the amplification and ADC circuitry associated with that readout line to produce a digital value representing the amount of light incident on the pixel sensor during the imaging exposure.
Fig. 2 is a schematic diagram of a typical prior art pixel sensor in a column of pixel sensors in an imaging array. The pixel sensor 21 comprises a photodiode 22 which measures the light intensity at the corresponding pixel in the image. First, the photodiode 22 is reset by placing the gate 25 in a conductive state and connecting the floating diffusion node 23 to the reset voltage Vr. The gate 25 is then turned off and the photodiode 22 is allowed to accumulate photoelectrons. For purposes of this discussion, a floating diffusion node is defined as an electrical node that is not connected to a power rail or driven by another circuit. The potential on gate 27 sets the maximum amount of charge that can be accumulated on photodiode 22. If more charge is accumulated than allowed by the potential on gate 27, the excess charge is shunted to ground through gate 27.
After the photodiode 22 has been exposed to light, the charge accumulated in photodiode 22 is typically measured by recording the change in voltage on the floating diffusion node 23 as the accumulated charge is transferred to it. Floating diffusion node 23 is characterized by a capacitance, represented by capacitor 23'. In practice, before the floating diffusion node 23 is connected to photodiode 22, capacitor 23' is charged to the voltage Vr and isolated by pulsing the reset line connected to gate 24. When gate 25 is turned on, the charge accumulated on photodiode 22 is transferred to floating diffusion node 23. The voltage on floating diffusion node 23 is sufficient to remove all of this charge from the photodiode, and the transfer reduces the voltage on floating diffusion node 23 by an amount that depends on the amount of charge transferred and on the capacitance of capacitor 23'. Thus, by measuring the voltage change on floating diffusion node 23, the amount of charge accumulated during the exposure can be determined. When the pixel sensor in question is connected to readout line 31 in response to a signal on line 28, the voltage on floating diffusion node 23 is measured by column amplifier 32.
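The relationship described here is simply Q = C·ΔV. A short worked sketch follows; the 2 fF capacitance and the voltage values are illustrative, not taken from the patent.

```python
Q_E = 1.602e-19   # electron charge in coulombs
C_FD = 2.0e-15    # illustrative floating-diffusion capacitance (2 fF)

def electrons_transferred(v_reset, v_after):
    """Electrons inferred from the voltage drop on the floating diffusion node."""
    return C_FD * (v_reset - v_after) / Q_E

# A 0.25 V drop on a 2 fF node corresponds to roughly 3100 electrons.
print(electrons_transferred(2.80, 2.55))
```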
The present invention is based on the observation that a pixel of the type described above can be modified so that the floating diffusion node includes a second, parasitic photodiode with significant photodetection efficiency. Ordinarily, the optical conversion efficiency of this parasitic photodiode is minimized by shielding the floating diffusion node from light. However, by adjusting the spacing of the components near the floating diffusion node, the light conversion efficiency of the parasitic photodiode can be increased, as taught in co-pending U.S. patent application 14/591,873 filed on January 7, 2015.
To distinguish the parasitic photodiode from photodiode 22, photodiode 22 and photodiodes serving an analogous function will be referred to as "conventional photodiodes". Referring now to fig. 3, a pixel sensor is shown in which a parasitic photodiode is used for image measurement. To simplify the following discussion, those elements of pixel sensor 41 that serve functions similar to those discussed above with reference to fig. 2 have been given the same reference numerals and will not be discussed further unless needed to illustrate the new manner in which they are used. Typically, the detection efficiency of the parasitic photodiode 42 is significantly less than that of photodiode 22. The manner of adjusting the ratio of the detection efficiencies of the two photodiodes is discussed in more detail in co-pending U.S. patent application 14/591,873. In one exemplary embodiment, the ratio of the conversion efficiency of the main photodiode to that of the parasitic photodiode is 30:1. Other embodiments in which the ratio is 20:1 or 15:1 are useful.
The photocharge accumulated on the parasitic photodiode during an exposure can be determined separately from the photocharge accumulated on the main photodiode. The process is most easily understood starting from the reset of the pixel sensor after the last image readout operation is completed. Initially, the main photodiode 22 is reset to Vr and gate 25 is closed; this also resets the floating diffusion node 43 to Vr. If a correlated double sampling measurement is to be made, the voltage is measured at the start of the exposure by connecting the floating diffusion node 43 to the column amplifier 170; otherwise, the previous measurement of the reset voltage is used. During the image exposure, the parasitic photodiode 42 generates photoelectrons that are stored on the floating diffusion node 43 and reduce its potential. At the end of the exposure, the voltage on the floating diffusion node 43 is measured by connecting the output of source follower 26 to the column amplifier 170, and the amount of charge produced by the parasitic photodiode 42 is determined to provide a first pixel intensity value. Next, the floating diffusion node 43 is again reset to Vr and its potential is measured by connecting the output of source follower 26 to the column amplifier 170. Gate 25 is then placed in a conducting state and the photoelectrons accumulated by the main photodiode are transferred to the floating diffusion node 43. The voltage on the floating diffusion node 43 is then measured again and used by the column amplifier 170 to compute a second pixel intensity value.
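The readout order just described can be summarized in the following sketch. The `pixel` and `amp` methods are hypothetical stand-ins for the control-line operations named in the text (gate 25, the floating diffusion reset, and the source-follower/column-amplifier path), not an actual driver API.

```python
def read_dual_photodiode(pixel, amp):
    """Two intensity values per pixel: parasitic first, then main (with CDS)."""
    v0 = amp.sample(pixel)              # reset level of FD node 43 at exposure start
    pixel.expose()                      # parasitic PD 42 discharges the FD node
    v1 = amp.sample(pixel)              # FD level at end of exposure
    parasitic = v0 - v1                 # first pixel intensity value

    pixel.reset_fd()                    # FD node 43 back to Vr
    v2 = amp.sample(pixel)              # fresh reset reference
    pixel.transfer_main()               # gate 25 on: main PD 22 charge -> FD node
    v3 = amp.sample(pixel)
    main = v2 - v3                      # second pixel intensity value
    return parasitic, main
```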
The basic principle of a phase detection autofocus system can be more easily understood with reference to fig. 4A-4C, which illustrate the manner in which the distance from the camera lens to the imaging array can be detected. Referring to fig. 4A, consider a point 221 in a scene that is to be captured by the imaging array of the camera through the lens 201. For the purposes of this example, it will be assumed that lens 201 is obscured by mask 204. All light is blocked by the mask 204 except for light passing through the two edge windows shown at 205 and 206. Light from windows 205 and 206 is imaged onto two linear arrays of pixel sensors shown at 202 and 203. For purposes of this discussion, it is assumed that pixel sensors in array 202 can only "see" light from window 205 and pixel sensors in array 203 can only "see" light from window 206. In fig. 4A, light from window 205 is detected at pixel sensor 207 in array 202, and light from window 206 is detected at pixel sensor 208.
The distance from lens 201 to the plane of arrays 202 and 203 is denoted by D. Which pixel sensors in the two arrays receive the light depends on D. In the example shown in fig. 4A, lens 201 images the plane in the scene containing point 221 to a point below the array plane; hence, the image of that plane is out of focus. If the lens is moved toward arrays 202 and 203, the pixel sensors that detect the light move toward the middle of the arrays. When lens 201 focuses the light onto the plane of arrays 202 and 203, the pixel sensors receiving the light are those in the middle of each array, closest to the optical axis 215 of lens 201. Fig. 4B shows the lens at the correct distance; the pixel sensors receiving light are shown at 209 and 210. Refer now to fig. 4C. In this case, lens 201 is too close to the plane of arrays 202 and 203, and the pixel sensors receiving light are again separated along the length of the arrays, as shown at 211 and 212.
It follows that if the identity of the pixel sensors receiving light from the two windows in the lens can be determined, the lens movement required to properly focus point 221 onto the imaging array can also be determined. Once the receiving pixel sensors are known, the distance the lens must move to reach correct focus can be read from a lookup table; there is no need to iterate over lens distances. Hence, this type of autofocus scheme can perform the focus adjustment in a much shorter time than schemes that optimize the high-frequency spatial content of the image.
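A minimal sketch of the lookup step follows, assuming a calibration table of detected pixel separations versus lens travel; all table values below are invented for illustration.

```python
import bisect

SEPARATIONS = [-8.0, -4.0, 0.0, 4.0, 8.0]      # detected separation, pixels
LENS_MOVES  = [0.40, 0.20, 0.0, -0.20, -0.40]  # required lens travel, mm

def lens_move_for(separation):
    """Linearly interpolate the calibration table; no iterative focusing needed."""
    i = min(max(bisect.bisect_left(SEPARATIONS, separation), 1),
            len(SEPARATIONS) - 1)
    x0, x1 = SEPARATIONS[i - 1], SEPARATIONS[i]
    y0, y1 = LENS_MOVES[i - 1], LENS_MOVES[i]
    return y0 + (separation - x0) * (y1 - y0) / (x1 - x0)
```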
Adapting this autofocus scheme to an imaging array in which the autofocus pixel sensors lie within the array used to form the image of the scene presents two problems. First, the imaging lens is not masked. This problem can be overcome by using pixel sensors that measure only the light transmitted by one half of the camera lens. If the autofocus pixel sensors were separate from the pixel sensors that actually record the image, sensors satisfying this constraint could be obtained by masking part of the microlens located over each autofocus pixel sensor; however, this approach effectively removes those pixel sensors from the imaging array. The manner in which the present invention achieves the same result without sacrificing pixel sensors within the imaging array is discussed in more detail below.
Second, the light projected onto the autofocus linear arrays does not come from a single point, but from a line in the scene. Hence, merely identifying the pixel sensor that receives the most light in each array does not provide the information needed to determine the appropriate D. This problem can be overcome by computing an image correlation value that can be mapped to the distance between the lens and the imaging array.
The manner in which the present invention overcomes the first problem can be more readily understood with reference to fig. 5, which is a top view of a portion of an embodiment of an imaging array 60 utilizing the pixels shown in fig. 3, as taught in the aforementioned U.S. patent application. Various gate and control lines are omitted to simplify the drawing. The pixel sensors are arranged in a rectangular array; the elements of a typical pixel sensor are labeled 61. Specifically, pixel sensor 61 has a main photodiode 62 and a parasitic photodiode 63. Both photodiodes receive light from a microlens 64 overlying the silicon surface in which the photodiodes are constructed. The pixel sensors are typically arranged in groups of four, such as group 65. In an array for a color camera, each pixel sensor is covered by a color filter: typically one pixel sensor is covered by a red filter, indicated by "R"; one by a blue filter, indicated by "B"; and two by green filters, indicated by "G". Color processing is not relevant to the present discussion and is not discussed here.
The present invention is based on the observation that the parasitic photodiodes associated with the floating diffusion nodes can be used to form the linear imaging arrays required for an autofocus system without changing the main photodiodes; the pixel loss associated with the prior art is therefore avoided.
Referring now to fig. 6, fig. 6 is a cross-sectional view of pixel sensors 66 and 67 through line 6-6 shown in fig. 5. Again, various gate and wiring structures for connecting the gates and photodiodes to the bit lines are omitted to simplify the drawing. The main photodiodes are shown at 75 and 73, respectively. The corresponding floating diffusion nodes with their parasitic photodiodes are shown at 74 and 76. The wiring layers on the substrate in which the photodiodes are constructed include patterned metal layers 68 and 69 that form apertures limiting the light from microlenses 64 and 72 that can reach the photodiodes. Color filters 70 and 71 are deposited over the wiring layers and under the microlenses. It should be noted that in this configuration both parasitic photodiodes preferentially receive light from the same half of their respective microlenses, i.e., halves 64A and 72A. The parasitic photodiodes in this arrangement are therefore not suitable for use in autofocus pixel sensors.
Referring now to FIG. 7, FIG. 7 is a top view of a portion of an imaging array according to one embodiment of the invention. Imaging array 130 differs from imaging array 60 shown in fig. 5 in that every third row of pixel sensors is a mirror image of the corresponding row in imaging array 60. This results in two arrays of floating diffusion nodes, shown at 131 and 132. As a result, the floating diffusion nodes in one of the rows, e.g., row 133, preferentially receive light from one side of the microlens in the pixel sensor in which they are located, while the floating diffusion nodes in another of the rows, e.g., row 134, preferentially receive light from the other side of the microlens.
Referring now to fig. 8, fig. 8 is a cross-sectional view through line 8-8 shown in fig. 7. The floating diffusion node 141 in pixel sensor 166, which is part of row 133, receives light from one half of microlens 140 as shown, and substantially less light from the other half of microlens 140. Conversely, the floating diffusion node 142 in pixel sensor 167 preferentially receives light from one half of microlens 143, as shown at 142A. Hence, the floating diffusion nodes in the two rows of pixel sensors can be used as an autofocus sensing array.
To simplify the following discussion, a pixel sensor whose floating diffusion node is used for autofocus purposes will be referred to as an autofocus pixel sensor. Autofocus pixel sensors located in rows analogous to row 133 will be referred to as top autofocus pixel sensors, and those located in rows analogous to row 134 as bottom autofocus pixel sensors. "Top" and "bottom" are merely labels and are not intended to indicate position relative to the earth. Typically, the area of the imaging array that images the particular region of the field of view that is to be kept in focus will contain a two-dimensional array of autofocus pixel sensors available for making autofocus measurements; in the following discussion, this region will be referred to as an autofocus region. Any particular autofocus pixel sensor can be identified by a pair of indices (I, J) indicating its position in the two-dimensional imaging array. The signal from the floating diffusion node in a bottom autofocus pixel sensor will be denoted B(I, J), and that from the floating diffusion node in a top autofocus pixel sensor T(I, J). Since each top autofocus pixel sensor has a corresponding bottom autofocus pixel sensor, the indices are chosen such that B(I, J) is the autofocus pixel sensor corresponding to T(I, J). The autofocus region signals then correspond to the set of available T(I, J) and B(I, J) signals.
It should be noted that using floating diffusion nodes that are part of an imaging array generating an image of a scene requires the floating diffusion nodes to operate under color filters. Any distortion introduced by the color filters can be eliminated by using multiple pairs of autofocus pixel sensors. Referring again to fig. 7, the top autofocus pixel sensors in array 131 are covered by red or green filters, but not blue filters; similarly, the bottom autofocus pixel sensors are covered by blue and green filters, but not red filters. However, when autofocus measurements are made using both arrays 131 and 132, all of the color combinations are represented. In one aspect of the invention, the set of top autofocus pixel sensors used for an autofocus measurement includes substantially equal numbers of pixel sensors having red, blue, and green filters; similarly, the set of bottom autofocus pixel sensors used for the measurement includes substantially equal numbers of pixel sensors having red, blue, and green filters. For the purposes of this discussion, the numbers of color filters of each color are defined to be substantially equal if the autofocus adjustment obtained from the cross-correlation measurement discussed below does not change as a result of any inequality in those numbers.
As described above, the camera lens is not masked, and hence each autofocus pixel sensor receives light from a plurality of different points in the scene. Some form of cross-correlation function must therefore be used to determine the top and bottom pixel positions from which the lens position correction is to be determined:
$$p(u,v)=\frac{\sum_{x,y}\left[T(x,y)-T_A(x,y)\right]\left[B(x+u,\,y+v)-B_A(x+u,\,y+v)\right]}{\sqrt{\sum_{x,y}\left[T(x,y)-T_A(x,y)\right]^{2}\,\sum_{x,y}\left[B(x+u,\,y+v)-B_A(x+u,\,y+v)\right]^{2}}}$$
Here, TA(x, y) and BA(x, y) are the average values of T(x, y) and B(x, y), respectively, over the autofocus pixel sensors. The sums run over the set of autofocus pixel sensors used to focus the selected area of the image. The (u, v) value that maximizes p(u, v) is then used to look up the camera lens movement required to focus the imaged scene area onto the autofocus pixel sensors. In the case of a simple lens, the distance the lens must move is determined. Alternatively, the focal length of a more complex imaging lens may be changed to bring the image into focus; in that case, the change in focal length is determined. In one aspect of the invention, the controller stores a focus table that maps the determined (u, v) value to the camera lens movement or focal length change required to bring the scene into focus.
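A sketch of this computation in Python follows. T and B are 2-D arrays of top and bottom autofocus signals for one autofocus region; the wrap-around shift via `np.roll` and the search range are simplifications made for this example.

```python
import numpy as np

def best_shift(T, B, max_shift=8):
    """Return the (u, v) maximizing the mean-subtracted, normalized p(u, v)."""
    Tz = T - T.mean()
    best_p, best_uv = -np.inf, (0, 0)
    for u in range(-max_shift, max_shift + 1):
        for v in range(-max_shift, max_shift + 1):
            Bs = np.roll(np.roll(B, u, axis=1), v, axis=0)
            Bz = Bs - Bs.mean()
            denom = np.sqrt((Tz ** 2).sum() * (Bz ** 2).sum())
            p = (Tz * Bz).sum() / denom if denom > 0 else 0.0
            if p > best_p:
                best_p, best_uv = p, (u, v)
    return best_uv   # index into the stored focus table
```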
Typically, moving the lens brings a particular region of the image into focus, usually the region near the center of the image. In the present invention, autofocus pixel sensors are available over substantially the entire imaging array, so there are multiple regions for which autofocus data can be provided. In the present discussion, an area having enough autofocus pixel sensors to perform a focus adjustment will be referred to as an autofocus region. Referring now to FIG. 9, an imaging array having multiple autofocus regions is shown. The imaging array 200 is organized as a rectangular array with rows of autofocus pixel sensors; two of every three rows contain autofocus pixel sensors, as shown at 202-205. An autofocus region may be as small as a portion of two autofocus pixel sensor rows, as shown at 206 and 208, or may include portions of four or more autofocus pixel sensor rows, as shown at 209.
In practice, the autofocus controller 210 is programmed to use one of the autofocus regions to set the focusing characteristics of lens 212. The autofocus controller 210 may be implemented as part of the overall camera controller, or as a separate controller in communication with the main camera controller shown at 92 in fig. 1. The controller 210 sends a signal to actuator 211 to move lens 212 so that the chosen autofocus region is in focus. As noted above, the most commonly used autofocus region is the one near the center of the imaging array. However, the correlation function used to set the lens focus can be computed for a large number of autofocus regions in the imaging array and transmitted with the image captured after the autofocus control has brought the region of interest into focus. This additional information can be used to provide a measure of the distance between each corresponding region of the scene and the region on which the camera is focused.
In one aspect of the invention, a motion image sequence is acquired by taking an autofocus measurement before each frame of the sequence. The time available for performing the autofocus adjustment is therefore limited. The time required to perform the adjustment will be referred to as the autofocus period; it includes the time required to expose the autofocus pixel sensors, the time required to read out those sensors and perform the associated computations, and the time required to move the lens. Typically, some area of the imaging array (e.g., the central area) is to be kept in focus by the autofocus system, and it is advantageous to reduce the autofocus exposure time. The required exposure time depends on the number of autofocus pixel sensors in the region of interest used in the focus calculation and on the light level in the scene. If the illumination level is low or the autofocus exposure time is too short, the autofocus pixel sensor outputs will be noisy. The focus calculation depends on a correlation measure, such as the p(u, v) calculation discussed above, and as more pixels are added to the calculation, the effect of noise is reduced. Since more than half of the pixel sensors in the array are autofocus pixel sensors, the present invention can shorten the autofocus exposure period and use the outputs of more autofocus pixel sensors to compensate for the increased noise. This is a significant advantage over systems having a small number of dedicated autofocus pixel sensors embedded in the imaging array in place of image-recording pixel sensors. In one exemplary embodiment, the number of autofocus pixel sensors used to determine the focus adjustment is greater than 1000; in another exemplary embodiment, it is less than or equal to 1000.
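A back-of-the-envelope way to see this tradeoff, under the usual assumption that the noise in different autofocus pixels is independent: in a correlation sum over N pixel pairs, the signal term grows like N while the accumulated noise grows like √N, so

$$\mathrm{SNR} \;\propto\; \frac{N}{\sqrt{N}} \;=\; \sqrt{N}.$$

Halving the autofocus exposure (roughly halving the per-pixel signal-to-noise ratio) can therefore be offset by summing over about four times as many autofocus pixel sensors.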
In one aspect of the invention, the region at the center of the imaging array is used to set the camera lens distance from the imaging array. It should be noted, however, that a "focus map" of the entire scene projected onto the imaging array can be computed by repeating the distance calculation over small segments of the imaging array at locations across the entire array. Such a map is useful in constructing a three-dimensional image of a scene. Accordingly, in one aspect of the present invention, the signals from the autofocus pixel sensors used to set the lens distance before capturing an image are output as a separate image for later post-processing.
The above-mentioned U.S. patent application describes a method of extending the dynamic range of a pixel sensor by using the floating diffusion node to provide a second measurement of the light received during an imaging exposure. The floating diffusion node in that approach has a photoconversion efficiency that is typically 1/30 that of the main photodiode, and thus provides a measure of the received light when the pixel is subjected to light intensities that saturate the main photodiode. The floating diffusion node of the present invention can likewise be used to extend the dynamic range of the pixel sensor.
It should be noted that the main photodiodes and the microlenses in the embodiments described above form a regular array with equal spacing in the column and row directions. The floating diffusion nodes, however, are not evenly distributed across the imaging array, so some post-imaging processing may be required. For example, the image seen by the floating diffusion nodes may be resampled onto a uniform grid. The values of this resampled floating diffusion node image can then be combined with the corresponding values in the image generated by the main photodiodes to provide an extended light intensity measurement. To perform this post-processing, the image seen by the floating diffusion nodes must be output and saved along with the image seen by the main photodiodes.
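A sketch of this post-processing step follows, assuming the floating diffusion samples come with their (row, column) coordinates and using the illustrative 30:1 conversion-efficiency ratio mentioned above; the saturation threshold and the SciPy-based resampling are choices made for this example, not prescribed by the patent.

```python
import numpy as np
from scipy.interpolate import griddata

def extended_intensity(main_img, fd_values, fd_coords, ratio=30.0,
                       full_scale=1.0, sat_frac=0.95):
    """Resample the irregular FD image to the main grid, then substitute
    scaled FD values wherever the main photodiode is near saturation."""
    h, w = main_img.shape
    gy, gx = np.mgrid[0:h, 0:w]
    fd_img = griddata(fd_coords, fd_values, (gy, gx),
                      method='linear', fill_value=0.0)
    out = main_img.astype(float).copy()
    saturated = main_img >= sat_frac * full_scale
    out[saturated] = fd_img[saturated] * ratio
    return out
```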
In the embodiments described above, the floating diffusion node in each autofocus pixel sensor is positioned such that it receives light from only one side of the microlens. However, embodiments in which the floating diffusion node merely preferentially receives light from one side of the microlens can also be constructed. For example, the floating diffusion node may be positioned such that 80% of the light it receives comes from one side of the microlens and 20% from the other side. In another exemplary embodiment, 90% of the light comes from one side of the microlens and 10% from the other. Using additional autofocus pixel sensors in the autofocus cross-correlation computation can compensate for this reduced light separation.
Although the autofocus system of the present invention tolerates noise in the autofocus pixel sensors, the floating diffusion node in each autofocus pixel sensor must have sufficient light conversion efficiency to measure the light level in the autofocus region of the imaging sensor. The light conversion efficiency of the floating diffusion node is therefore preferably adjusted to be somewhat higher than the 1/30 of the main photodiode efficiency described above. The mechanism for adjusting the optical conversion efficiency of the floating diffusion node is discussed in the above-referenced U.S. patent application, which is incorporated herein by reference in its entirety. However, increasing the light conversion efficiency of the floating diffusion node reduces the improvement in dynamic range achievable by using the floating diffusion node as a second photodiode during the image exposure. In one embodiment, the floating diffusion node light conversion efficiency is set to be greater than 1/10 of the main photodiode light conversion efficiency; in another embodiment, it is set to be greater than 1/30 of the main photodiode light conversion efficiency.
The embodiments described above relate to rows and columns of pixel sensors; however, it should be understood that in other embodiments, the rows and columns may be interchanged. In addition, the autofocus pixel sensor may be organized such that the columns of floating diffusion nodes form two linear arrays for autofocus purposes.
To simplify the following discussion, the photodiode used in the autofocus adjustment will be referred to as the autofocus photodiode. In the embodiments described above, the parasitic photodiode associated with the floating diffusion node is the autofocus photodiode. These embodiments do not increase the area of the pixel sensor and therefore provide significant advantages. However, the parasitic photodiode is not a pinned photodiode and therefore exhibits increased noise relative to the main photodiode. These noise problems can be reduced by using a separate small pinned photodiode in place of the parasitic photodiode of the floating diffusion node. In such embodiments, the optical conversion efficiency of the floating diffusion node would be intentionally reduced, as in conventional imaging arrays.
Referring now to fig. 10, fig. 10 is a schematic diagram of a pixel sensor having two photodiodes that may be used in such a two-photodiode autofocus embodiment. The pixel sensor 300 includes a main photodiode 322 and an auxiliary photodiode 301. The area of auxiliary photodiode 301 is chosen to be much smaller than that of photodiode 322; for example, in one embodiment it is less than 0.1 times the area of main photodiode 322. Both photodiodes can be connected to the floating diffusion node 343 through control gates 302 and 304, respectively. Since auxiliary photodiode 301 has a much smaller area than main photodiode 322, no anti-blooming gate is required. Both photodiodes can be read out in a manner similar to that discussed above for the parasitic photodiode embodiment. During non-autofocus operation, the photocharge accumulated on auxiliary photodiode 301 can be used to extend the dynamic range of pixel sensor 300 in a manner similar to that described above. For the purposes of this discussion, the important aspect of pixel sensor 300 is the relative placement of the main photodiode 322 and the auxiliary photodiode 301 within the pixel sensor.
Referring now to FIG. 11, FIG. 11 is a top view of a portion of an imaging array utilizing the pixel sensor design shown in FIG. 10 according to one embodiment of the present invention. Imaging array 400 differs from imaging array 60 shown in fig. 5 in that every third row of pixel sensors is a mirror image of the corresponding row in imaging array 60. This results in two arrays of auxiliary photodiodes, shown at 431 and 432. As a result, the auxiliary photodiodes in one of the rows (e.g., row 433) preferentially receive light from one side of the microlens in the pixel sensor in which they are located, while the auxiliary photodiodes in another of the rows (e.g., row 434) preferentially receive light from the other side of the microlens.
Referring now to fig. 12, fig. 12 is a cross-sectional view through line 12-12 shown in fig. 11. The auxiliary photodiode 471 in pixel sensor 466, which is part of row 433, receives light from one half of microlens 440 and substantially less light from the other half of microlens 440. Conversely, the auxiliary photodiode 472 in pixel sensor 467 preferentially receives light from the half of microlens 443 shown at 442A. Hence, the auxiliary photodiodes in the two rows of pixel sensors can be used as an autofocus sensing array. While the auxiliary photodiodes are placed asymmetrically, the main photodiodes 422 and 423 form a regular rectangular array.
The manner in which the auxiliary photodiodes are used during autofocus is similar to that described above for the parasitic photodiodes. To simplify the following discussion, a pixel sensor whose auxiliary photodiode is used for autofocus purposes will again be referred to as an autofocus pixel sensor. Autofocus pixel sensors located in rows analogous to row 433 will be referred to as top autofocus pixel sensors, and those located in rows analogous to row 434 as bottom autofocus pixel sensors. "Top" and "bottom" are merely labels and are not intended to indicate position relative to the earth. Typically, the area of the imaging array that images the particular region of the field of view that is to be kept in focus will contain a two-dimensional array of autofocus pixel sensors that can be used to make autofocus measurements; this region will again be referred to as an autofocus region. Any particular autofocus pixel sensor can be identified by a pair of indices (I, J) indicating its position in the two-dimensional imaging array. The signal from the auxiliary photodiode in a bottom autofocus pixel sensor will be denoted B(I, J), and the signal from the auxiliary photodiode in a top autofocus pixel sensor T(I, J). Since each top autofocus pixel sensor has a corresponding bottom autofocus pixel sensor, the indices are chosen such that B(I, J) is the autofocus pixel sensor corresponding to T(I, J). The autofocus region signals then correspond to the set of available T(I, J) and B(I, J) signals. The autofocus adjustment is then performed as described above for the parasitic photodiode embodiments.
Layouts of the autofocus photodiodes (whether parasitic photodiodes of the floating diffusion nodes or separate photodiodes) other than those discussed above are also possible.
The above-described embodiments of the present invention have been provided to illustrate various aspects of the invention. However, it is to be understood that different aspects of the invention shown in different specific embodiments can be combined to provide other embodiments of the invention. In addition, various modifications to the invention will become apparent from the foregoing description and accompanying drawings. Accordingly, the present invention is to be limited solely by the scope of the appended claims. Figs. 13-15 show three other possible embodiments. In principle, any arrangement in which the autofocus photodiodes form two linear arrays, each preferentially receiving light from one side of the microlenses, may be used.

Claims (14)

1. An apparatus comprising a two-dimensional array of pixel sensors, each pixel sensor comprising:
a main photodiode;
an auto-focus photodiode; and
a microlens that focuses light onto the main photodiode and the autofocus photodiode,
the two-dimensional array of pixel sensors includes first and second autofocus arrays of pixel sensors, the pixel sensors in the first autofocus array of pixel sensors having the autofocus photodiodes positioned such that each autofocus photodiode preferentially receives light from one half of the microlenses in the pixel sensor in the first autofocus array, and the pixel sensors in the second autofocus array of pixel sensors having the autofocus photodiodes positioned such that each autofocus photodiode preferentially receives light from the other half of the microlenses in the pixel sensor in the second autofocus array.
2. The apparatus of claim 1, wherein the autofocus photodiode comprises a pinned photodiode.
3. The apparatus of claim 2, the primary photodiode characterized by a primary photodiode region, the pinned photodiode characterized by a pinned photodiode region, the primary photodiode region larger than the pinned photodiode region.
4. The apparatus of claim 1, wherein the autofocus photodiode comprises a parasitic photodiode associated with a floating diffusion node in each of the pixel sensors.
5. The apparatus of claim 1, wherein the pixel sensors in the first autofocus array of pixel sensors have autofocus photodiodes positioned such that each autofocus photodiode receives more than 80% of the light from one half of the microlenses in the pixel sensor, and wherein the pixel sensors in the second autofocus array of pixel sensors have autofocus photodiodes each positioned such that each autofocus photodiode preferentially receives light from the other half of the microlenses in the pixel sensor.
6. The apparatus of claim 1, wherein the pixel sensors in the first autofocus array of pixel sensors have autofocus photodiodes positioned such that each autofocus photodiode receives more than 90% of the light from one half of the microlenses in the pixel sensor, and wherein the pixel sensors in the second autofocus array of pixel sensors have autofocus photodiodes each positioned such that each autofocus photodiode preferentially receives light from the other half of the microlenses in the pixel sensor.
7. The apparatus of claim 1, further comprising
A camera lens that images a scene to be photographed onto the two-dimensional array of pixel sensors;
an actuator to move the camera lens relative to the two-dimensional imaging array in response to an autofocus signal from a controller, the controller configured to
Exposing the pixel sensor to light from a scene to be photographed for an auto-focus time period;
obtaining a signal from each pixel sensor in the first and second arrays of pixel sensors, the signal being indicative of an amount of light received during an auto-focus period; and
generating the autofocus signal such that a predetermined portion of the scene will be focused on a predetermined area of the two-dimensional array of pixel sensors.
8. The apparatus of claim 7, wherein generating the autofocus signal comprises calculating a cross-correlation function between a signal from the autofocus photodiode in the first autofocus array for a pixel sensor and a signal from the autofocus photodiode in the second autofocus array for a pixel sensor.
9. The apparatus of claim 7, wherein the primary photodiodes of the pixel sensors in the two-dimensional array of pixel sensors are organized as a uniform array having equal spacing in each dimension in the two-dimensional array, and wherein the autofocus photodiodes form a non-uniform array.
10. The apparatus of claim 9, wherein the first autofocus array of pixel sensors is a mirror image of the second autofocus array of pixel sensors.
11. The apparatus of claim 7, wherein the controller generates a first image of the scene using the primary photodiode in the two-dimensional array of pixel sensors including the first and second autofocus arrays of pixel sensors.
12. The apparatus of claim 1, wherein the first autofocus array of pixel sensors comprises a first linear array of pixel sensors, wherein the second autofocus array of pixel sensors comprises a second linear array of pixel sensors configured as a mirror image of the first linear array of pixel sensors.
13. The apparatus of claim 1, wherein the pixel sensor includes a plurality of color filters of different colors, one color filter of the plurality of color filters disposed under each microlens of the two-dimensional array of pixel sensors, the first autofocus array is characterized by having a first number of color filters of each color included in the first autofocus array of pixel sensors, and the second autofocus array is characterized by having a second number of color filters of each color included in the second autofocus array of pixel sensors, the first and second numbers being equal.
14. The apparatus of claim 11, wherein the controller outputs the first image and light intensity measurements determined from the autofocus photodiode in each of the pixel sensors.
CN201780087438.XA 2017-02-28 2017-02-28 Autofocus system for CMOS imaging sensor Active CN110352489B (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/US2017/020026 WO2018160172A1 (en) 2017-02-28 2017-02-28 AUTOFOCUS SYSTEM FOR CMOS IMAGING SENSORS

Publications (2)

Publication Number Publication Date
CN110352489A CN110352489A (en) 2019-10-18
CN110352489B true CN110352489B (en) 2020-10-16

Family

ID=63246628

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201780087438.XA Active CN110352489B (en) 2017-02-28 2017-02-28 Autofocus system for CMOS imaging sensor

Country Status (6)

Country Link
US (1) US10237501B2 (en)
EP (1) EP3590134A4 (en)
JP (1) JP6780128B2 (en)
CN (1) CN110352489B (en)
CA (1) CA3054777C (en)
WO (1) WO2018160172A1 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2019050522A (en) * 2017-09-11 2019-03-28 キヤノン株式会社 Imaging device
RU2726219C1 (en) * 2020-01-09 2020-07-10 АКЦИОНЕРНОЕ ОБЩЕСТВО "Научно-исследовательский институт оптико-электронного приборостроения" (АО "НИИ ОЭП") Method for guidance and focusing of radiation on a target and device for its implementation

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20040006813A (en) * 2002-07-15 2004-01-24 주식회사 하이닉스반도체 Image sensor with improved light sensitivity
CN102610622A (en) * 2008-11-07 2012-07-25 索尼株式会社 Solid-state imaging device, method for manufacturing solid-state imaging device, and electronic apparatus
CN102890386A (en) * 2011-07-20 2013-01-23 三星电子株式会社 Imaging device
CN105359505A (en) * 2013-07-05 2016-02-24 索尼公司 Solid-state image pickup device and driving method therefor, and electronic apparatus

Family Cites Families (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4500434B2 (en) * 2000-11-28 2010-07-14 キヤノン株式会社 Imaging apparatus, imaging system, and imaging method
JP5045012B2 (en) * 2006-07-20 2012-10-10 株式会社ニコン Solid-state imaging device and imaging apparatus using the same
JP4935544B2 (en) * 2007-07-06 2012-05-23 株式会社ニコン Imaging device
KR100915758B1 (en) * 2007-11-19 2009-09-04 주식회사 동부하이텍 Method for Manufacturing An Image Sensor
JP5593602B2 (en) * 2008-09-24 2014-09-24 ソニー株式会社 Imaging device and imaging apparatus
JP5537905B2 (en) * 2009-11-10 2014-07-02 富士フイルム株式会社 Imaging device and imaging apparatus
JP5693082B2 (en) * 2010-08-09 2015-04-01 キヤノン株式会社 Imaging device
JP6041495B2 (en) * 2011-03-24 2016-12-07 キヤノン株式会社 Imaging apparatus and defective pixel determination method
JP2013021168A (en) 2011-07-12 2013-01-31 Sony Corp Solid-state imaging device, manufacturing method of solid-state imaging device, and electronic apparatus
JP5914055B2 (en) * 2012-03-06 2016-05-11 キヤノン株式会社 Imaging device
WO2013183382A1 (en) * 2012-06-07 2013-12-12 富士フイルム株式会社 Imaging device and imaging apparatus
JP6149369B2 (en) * 2012-09-27 2017-06-21 株式会社ニコン Image sensor
JP2014165778A (en) * 2013-02-27 2014-09-08 Nikon Corp Solid state image sensor, imaging device and focus detector
US10128296B2 (en) * 2013-07-08 2018-11-13 BAE Systems Imaging Solutions Inc. Imaging array with improved dynamic range utilizing parasitic photodiodes
JP5861011B2 (en) * 2013-08-22 2016-02-16 富士フイルム株式会社 Imaging apparatus and focus control method
KR20160008385A (en) * 2014-07-14 2016-01-22 삼성전자주식회사 Phase Detection Auto Focus Pixel and Image Sensor therewith
JP6588702B2 (en) * 2015-01-05 2019-10-09 キヤノン株式会社 Imaging apparatus, control method therefor, program, and storage medium
US9749556B2 (en) * 2015-03-24 2017-08-29 Semiconductor Components Industries, Llc Imaging systems having image sensor pixel arrays with phase detection capabilities
KR102374112B1 (en) * 2015-07-15 2022-03-14 삼성전자주식회사 An image sensor including an auto focusing pixel, and an image processing system including the same
US10044959B2 (en) 2015-09-24 2018-08-07 Qualcomm Incorporated Mask-less phase detection autofocus
JP6873729B2 (en) * 2017-02-14 2021-05-19 キヤノン株式会社 Focus detector and image pickup device

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20040006813A (en) * 2002-07-15 2004-01-24 주식회사 하이닉스반도체 Image sensor with improved light sensitivity
CN102610622A (en) * 2008-11-07 2012-07-25 索尼株式会社 Solid-state imaging device, method for manufacturing solid-state imaging device, and electronic apparatus
CN102890386A (en) * 2011-07-20 2013-01-23 三星电子株式会社 Imaging device
CN105359505A (en) * 2013-07-05 2016-02-24 索尼公司 Solid-state image pickup device and driving method therefor, and electronic apparatus

Also Published As

Publication number Publication date
US10237501B2 (en) 2019-03-19
US20180249106A1 (en) 2018-08-30
JP6780128B2 (en) 2020-11-04
WO2018160172A1 (en) 2018-09-07
CA3054777C (en) 2020-07-07
EP3590134A1 (en) 2020-01-08
CA3054777A1 (en) 2018-09-07
CN110352489A (en) 2019-10-18
WO2018160172A8 (en) 2019-02-14
JP2020510867A (en) 2020-04-09
EP3590134A4 (en) 2020-10-14

Similar Documents

Publication Publication Date Title
US10412349B2 (en) Image sensor including phase detection pixel
TWI500319B (en) Extended depth of field for image sensor
US10397465B2 (en) Extended or full-density phase-detection autofocus control
CN105518862B (en) The driving method and electronic device of solid imaging element, solid imaging element
US9338380B2 (en) Image processing methods for image sensors with phase detection pixels
CN109981939B (en) Imaging system
US20080080028A1 (en) Imaging method, apparatus and system having extended depth of field
CN112736101B (en) Image sensor having shared microlenses between multiple sub-pixels
JP2008263352A (en) Imaging element, focus detecting device, and imaging device
JP5249136B2 (en) Imaging device
US9071748B2 (en) Image sensor with focus-detection pixels, and method for reading focus-information
JP2017163416A (en) Imaging apparatus, its control method, program, and recording medium
US20170257583A1 (en) Image processing device and control method thereof
KR20200075828A (en) Imaging element, image processing apparatus, image processing method, and program
CN110352489B (en) Autofocus system for CMOS imaging sensor
US11451715B2 (en) Imaging apparatus, exposure controlling method, and imaging device
CN109982070A (en) Imaging sensor and its operating method with calibration phase-detection pixel
JP2016184868A (en) Imaging device and driving method of imaging device
JP6545013B2 (en) Image forming method, image forming apparatus, and image forming program
JP6274897B2 (en) Image sensor and image sensor driving method
WO2021192176A1 (en) Image capturing device
US20240205560A1 (en) Sensor including micro lenses of different sizes
JP2017208651A (en) Imaging apparatus
JP6365568B2 (en) Imaging device and imaging apparatus
JP2005300756A (en) Solid imaging apparatus and solid imaging system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant