WO2022031503A1 - Image capture devices having phase detection auto-focus pixels

Image capture devices having phase detection auto-focus pixels

Info

Publication number
WO2022031503A1
Authority
WO
WIPO (PCT)
Prior art keywords
pixels
array
ocls
pixel
photodetectors
Application number
PCT/US2021/043607
Other languages
French (fr)
Inventor
Xiangli Li
Original Assignee
Apple Inc.
Application filed by Apple Inc.
Publication of WO2022031503A1


Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00 Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/70 SSIS architectures; Circuits associated therewith
    • H04N25/703 SSIS architectures incorporating pixels for producing signals other than image signals
    • H04N25/704 Pixels specially adapted for focusing, e.g. phase difference pixel sets
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B7/00 Mountings, adjusting means, or light-tight connections, for optical elements
    • G02B7/28 Systems for automatic generation of focusing signals
    • G02B7/34 Systems for automatic generation of focusing signals using different areas in a pupil plane
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B3/00 Simple or compound lenses
    • G02B3/0006 Arrays
    • G02B3/0037 Arrays characterized by the distribution or form of lenses
    • G02B3/005 Arrays characterized by the distribution or form of lenses arranged along a single direction only, e.g. lenticular sheets
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B3/00 Simple or compound lenses
    • G02B3/0006 Arrays
    • G02B3/0037 Arrays characterized by the distribution or form of lenses
    • G02B3/0056 Arrays characterized by the distribution or form of lenses arranged along two different directions in a plane, e.g. honeycomb arrangement of lenses
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B5/00 Optical elements other than lenses
    • G02B5/20 Filters
    • G02B5/201 Filters in the form of arrays
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00 Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/10 Circuitry of solid-state image sensors [SSIS]; Control thereof for transforming different wavelengths into image signals
    • H04N25/11 Arrangement of colour filter arrays [CFA]; Filter mosaics
    • H04N25/13 Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00 Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/10 Circuitry of solid-state image sensors [SSIS]; Control thereof for transforming different wavelengths into image signals
    • H04N25/11 Arrangement of colour filter arrays [CFA]; Filter mosaics
    • H04N25/13 Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements
    • H04N25/134 Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements based on three different wavelength filter elements
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00 Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/70 SSIS architectures; Circuits associated therewith
    • H04N25/71 Charge-coupled device [CCD] sensors; Charge-transfer registers specially adapted for CCD sensors
    • H04N25/75 Circuitry for providing, modifying or processing image signals from the pixel array
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00 Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/70 SSIS architectures; Circuits associated therewith
    • H04N25/76 Addressed sensors, e.g. MOS or CMOS sensors
    • H04N25/77 Pixel circuitry, e.g. memories, A/D converters, pixel amplifiers, shared circuits or shared components
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00 Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/70 SSIS architectures; Circuits associated therewith
    • H04N25/76 Addressed sensors, e.g. MOS or CMOS sensors
    • H04N25/77 Pixel circuitry, e.g. memories, A/D converters, pixel amplifiers, shared circuits or shared components
    • H04N25/778 Pixel circuitry, e.g. memories, A/D converters, pixel amplifiers, shared circuits or shared components comprising amplifiers shared between a plurality of pixels, i.e. at least one part of the amplifier must be on the sensor array itself

Definitions

  • the described embodiments relate generally to devices having a camera or other image capture device. More particularly, the described embodiments relate to an image capture device having phase detection auto-focus (PDAF) pixels.
  • a camera or other image capture device may include multiple image sensors, with the different image sensors having adjacent or interlaced arrays of pixels.
  • Many cameras and other image capture devices include one or more optical components (e.g., a lens or lens assembly) that are configurable to focus light, received or reflected from an image, onto the surface of an image sensor. Before or while capturing an image, the distance between the optical component(s) and image sensor (or a tilt or other parameters of the optical components or image sensor) may be adjusted to focus an image onto the image sensor.
  • macro (or rough) focusing may be performed for an image sensor prior to capturing an image using the image sensor (e.g., using a macro focus mechanism adjacent the image sensor).
  • Micro (or fine) focusing can then be performed after acquiring one or more images using the image sensor.
  • all focusing may be performed prior to capturing an image (e.g., by adjusting one or more relationships between a lens, lens assembly, or image sensor); or all focusing may be performed after acquiring an image (e.g., by adjusting pixel values using one or more digital image processing algorithms).
  • Many cameras and other image capture devices perform focusing operations frequently, and in some cases before and/or after the capture of each image capture frame.
  • Focusing an image onto an image sensor often entails identifying a perceptible edge between objects, or an edge defined by different colors or brightness (e.g., an edge between dark and light regions), and making adjustments to a lens, lens assembly, image sensor, or pixel value(s) to bring the edge into focus.
  • Embodiments of the systems, devices, methods, and apparatus described in the present disclosure are directed to an image capture device having PDAF pixels.
  • the present disclosure describes an image capture device.
  • the image capture device may include an array of pixels, and an array of 1x2 on-chip lenses (OCLs) disposed over the array of pixels.
  • Each pixel may include a 2x2 array of photodetectors.
  • a respective pair of adjacent 1x2 OCLs may be disposed over a pixel.
  • Each respective pair of adjacent 1x2 OCLs may include a respective first 1x2 OCL disposed over a first photodetector and a second photodetector in the 2x2 array of photodetectors for the pixel, and a respective second 1x2 OCL disposed over a third photodetector and a fourth photodetector in the 2x2 array of photodetectors for the pixel.
  • the present disclosure describes another image capture device.
  • the image capture device may include an array of photodetectors, and an array of oblong OCLs disposed over the array of photodetectors.
  • a different pair of photodetectors in the array of photodetectors may be disposed under each of the oblong OCLs, and every photodetector in the array of photodetectors may be disposed under a respective one of the oblong OCLs in the array of oblong OCLs.
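  • As a rough, hypothetical illustration of this mapping (not from the patent), the following sketch assigns each photodetector in a grid to the single 1x2 OCL covering it, assuming horizontally oriented OCLs that span photodetector column pairs:

```python
def ocl_index(pd_row, pd_col):
    """Index of the 1x2 OCL covering the photodetector at
    (pd_row, pd_col), assuming horizontally oriented OCLs spanning
    columns (2k, 2k+1). Illustrative layout only; the patent also
    allows vertically oriented OCLs."""
    return (pd_row, pd_col // 2)

# Every photodetector falls under exactly one OCL, and each OCL
# covers exactly two adjacent photodetectors:
assert ocl_index(0, 0) == ocl_index(0, 1) == (0, 0)
assert ocl_index(0, 2) == ocl_index(0, 3) == (0, 1)
```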
  • FIGs. 1A and 1B show an example of a device that may include one or more image capture devices;
  • FIG. 2 shows an example embodiment of an image capture device, including an image sensor, a lens or lens assembly, and an auto-focus mechanism;
  • FIG. 3 shows an example of an image that may be captured by an image capture device
  • FIG. 4 shows a plan view of one example of an image sensor
  • FIG. 5 shows an example imaging area (e.g., a plan view) of a pixel in an image capture device
  • FIG. 6 shows an example cross-section of the pixel shown in FIG. 5;
  • FIG. 7 shows a simplified schematic of a pixel usable in an image sensor
  • FIGs. 8A and 8B show an array of pixels
  • FIGs. 9A-9D show an imaging area of an image capture device, in which the pixels of the image capture device are arranged in accordance with a Bayer pattern (i.e., a 2x2 pattern including red pixels and blue pixels along one diagonal, and green pixels along the other diagonal); and
  • FIG. 10 shows a sample electrical block diagram of an electronic device.
  • cross-hatching or shading in the accompanying figures is generally provided to clarify the boundaries between adjacent elements and also to facilitate legibility of the figures. Accordingly, neither the presence nor the absence of cross-hatching or shading conveys or indicates any preference or requirement for particular materials, material properties, element proportions, element dimensions, commonalities of similarly illustrated elements, or any other characteristic, attribute, or property for any element illustrated in the accompanying figures.
  • the present disclosure relates to an image capture device that provides improved PDAF performance.
  • PDAF pixels may have a metal shield configuration.
  • a metal shield pixel may include a microlens that focuses incoming light on a photodiode, which photodiode in turn converts photons into electron-hole pairs.
  • the collected electrons (for electron collection devices) or holes (for hole collection devices) may be converted into an analog voltage through a pixel source follower (SF) transistor amplifier.
  • the analog voltage may then be converted to a digital signal by an analog-to-digital converter (ADC).
  • a metal shield (e.g., a tungsten (W) or copper/aluminum (Cu/Al) metal shield) may cover half of the photodiode (e.g., a left half or a right half).
  • For a left-shielded pixel, light from left-incident angles is blocked by the metal shield, and only light approaching the pixel from right-incident angles is received by the photodiode.
  • a right-shielded pixel functions in the opposite manner.
  • the angular sensitivity of left and right metal shield pixels can be used to generate PDAF information.
  • metal-shielded pixels need to be treated as defective pixels, and their signals need to be corrected before being used to generate an image.
  • metal shield pixels (or pairs of left/right-shielded pixels) may be sparsely distributed over the surface of an image sensor. That is, a relatively small number of an image sensor’s pixels (e.g., 3-4%) may be configured as left- or right- shielded pixels. In one example, for every eight rows and eight columns of pixels (e.g., for every block of 64 pixels in a pixel array), one left-shielded pixel and one right-shielded pixel may be provided.
  • left- and right-shielded pixels will have disparate signals (e.g., signals not matched in magnitude and/or polarity) when an image is not in focus, but will have well-matched signals when an image is in focus.
  • the signals of left- and right- shielded pixels therefore provide PDAF information that can be used by an auto-focus (AF) mechanism to adjust the position of one or more optical components (e.g., a lens) or an image sensor, and thereby adjust the focus of an image on the image sensor, or to digitally adjust or compensate for an out-of-focus condition.
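  • The patent does not specify how such a disparity is computed; purely for illustration, the phase offset between left- and right-pixel line profiles might be estimated with a sum-of-absolute-differences search (a minimal NumPy sketch; the function name and parameters are hypothetical):

```python
import numpy as np

def estimate_phase_disparity(left, right, max_shift=8):
    """Estimate the horizontal offset between line profiles built from
    left- and right-shielded PDAF pixels. A result near zero suggests
    the sampled edge is in focus; the sign indicates the defocus
    direction. Illustrative only."""
    left = np.asarray(left, dtype=float)
    right = np.asarray(right, dtype=float)
    best_shift, best_err = 0, np.inf
    for shift in range(-max_shift, max_shift + 1):
        shifted = np.roll(right, shift)
        # Compare only the interior samples to avoid wrap-around bias.
        err = np.mean(np.abs(left[max_shift:-max_shift] -
                             shifted[max_shift:-max_shift]))
        if err < best_err:
            best_shift, best_err = shift, err
    return best_shift
```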
  • an image may be brought into focus based on PDAF information obtained during a single image capture frame.
  • images may be quickly and continuously focused on an image sensor.
  • left- and right-shielded pixels may be fabricated without metal shields by placing both pixels adjacent one another under a single microlens. Each pixel has its own photodiode, and there may be implant isolation or physical trench isolation between the photodiodes of the two pixels. Because of the nature (e.g., curvature) of the microlens, light from left-incident angles is received mainly by the left-side pixel, and light from right-incident angles is received mainly by the right-side pixel. As a result, left- and right-side pixels placed adjacent one another under a single microlens may function similarly to left and right metal shielded pixels.
  • one blue pixel in every 8x8 block of pixels may be replaced by a green pixel (or may be modified to function as a green pixel), so that two adjacent green pixels may be placed under a single microlens to provide PDAF information.
  • the signals of both pixels need to be corrected before being used to generate an image.
  • each pixel in a pixel array may be divided into left and right sub-pixels, and PDAF information may be obtained from each pixel. Also, because all pixels are implemented in a similar manner, the sub-pixel signals for each pixel may be combined in a similar way, or signal corrections may be made to each pixel in a similar way, to increase the confidence level that pixel signals are being generated or corrected appropriately (especially in low light conditions).
  • the PDAF information provided by such a pixel array bases image focus entirely on vertical edges.
  • PDAF performance may suffer.
  • pixels in a pixel array may be configured to have a 2x2 array of sub-pixels (e.g., photodetectors) disposed under a microlens.
  • the entirety of a pixel array may incorporate such pixels.
  • the pixels can be used, in various embodiments or configurations, to provide PDAF information based on edges having more than one orientation (e.g., vertical and horizontal edges), to improve PDAF performance (especially in low light conditions), to reduce or eliminate the need for signal correction, or to increase the resolution of an image sensor.
  • the shared microlens over a 2x2 array of sub-pixels tends to reduce the signal-to-noise ratio (SNR) of the signals acquired by the sub-pixels.
  • the SNR of some sub-pixels may be reduced even further, and the SNR of each sub-pixel may differ (i.e., each of the four sub-pixels positioned under a shared microlens may have a different SNR).
  • This greatly increases the processing burden (e.g., the lens offset or misalignment correction, re-mosaicing burden, and so on).
  • an image capture device in which a pair of adjacent 1x2 on-chip lenses (OCLs) are disposed over the sub-pixels (or photodetectors) of a pixel having a 2x2 array of sub-pixels (or photodetectors).
  • the simplification in the corrections that need to be made can improve focus and image quality; reduce re-mosaicing challenges; and enable 1x2 OCLs to be disposed over all of the pixels and photodetectors of an image capture device.
  • a 1x2 OCL may also be shaped such that it allows more light into each of the sub-pixels over which it is disposed (e.g., as compared to a microlens disposed over a 2x2 array of subpixels). Allowing more light into a sub-pixel increases its SNR.
  • the 1x2 OCLs can be oriented in different directions, to enable the focus of orthogonal sets of edges to be detected.
  • These and other embodiments are described with reference to FIGs. 1A-10. However, those skilled in the art will readily appreciate that the detailed description given herein with respect to these figures is for explanatory purposes only and should not be construed as limiting.
  • FIGs. 1A and 1B show an example of a device 100 that may include one or more image capture devices.
  • the device's dimensions and form factor, including the ratio of the length of its long sides to the length of its short sides, suggest that the device 100 is a mobile phone (e.g., a smart phone).
  • the device could alternatively be any portable electronic device including, for example, a mobile phone, tablet computer, portable computer, portable music player, electronic watch, health monitor device, portable terminal, vehicle navigation system, robot navigation system, or other portable or mobile device.
  • the device 100 could also be a device that is semi-permanently located (or installed) at a single location.
  • FIG. 1A shows a front isometric view of the device 100
  • FIG. 1B shows a rear isometric view of the device 100
  • the device 100 may include a housing 102 that at least partially surrounds a display 104.
  • the housing 102 may include or support a front cover 106 or a rear cover 108.
  • the front cover 106 may be positioned over the display 104, and may provide a window through which the display 104 may be viewed.
  • the display 104 may be attached to (or abut) the housing 102 and/or the front cover 106.
  • the display 104 may not be included and/or the housing 102 may have an alternative configuration.
  • the display 104 may include one or more light-emitting elements including, for example, light-emitting diodes (LEDs), organic LEDs (OLEDs), a liquid crystal display (LCD), an electroluminescent (EL) display, or other types of display elements.
  • the display 104 may include, or be associated with, one or more touch and/or force sensors that are configured to detect a touch and/or a force applied to a surface of the front cover 106.
  • the various components of the housing 102 may be formed from the same or different materials.
  • the sidewall 118 may be formed using one or more metals (e.g., stainless steel), polymers (e.g., plastics), ceramics, or composites (e.g., carbon fiber).
  • the sidewall 118 may be a multi-segment sidewall including a set of antennas.
  • the antennas may form structural components of the sidewall 118.
  • the antennas may be structurally coupled (to one another or to other components) and electrically isolated (from each other or from other components) by one or more non-conductive segments of the sidewall 118.
  • the front cover 106 may be formed, for example, using one or more of glass, a crystal (e.g., sapphire), or a transparent polymer (e.g., plastic) that enables a user to view the display 104 through the front cover 106.
  • a portion of the front cover 106 (e.g., a perimeter portion of the front cover 106) may be coated with an opaque ink to obscure components included within the housing 102.
  • the rear cover 108 may be formed using the same material(s) that are used to form the sidewall 118 or the front cover 106.
  • the rear cover 108 may be part of a monolithic element that also forms the sidewall 118 (or in cases where the sidewall 118 is a multi-segment sidewall, those portions of the sidewall 118 that are non-conductive).
  • all of the exterior components of the housing 102 may be formed from a transparent material, and components within the device 100 may or may not be obscured by an opaque ink or opaque structure within the housing 102.
  • the front cover 106 may be mounted to the sidewall 118 to cover an opening defined by the sidewall 118 (i.e., an opening into an interior volume in which various electronic components of the device 100, including the display 104, may be positioned).
  • the front cover 106 may be mounted to the sidewall 118 using fasteners, adhesives, seals, gaskets, or other components.
  • a display stack or device stack including the display 104 may be attached (or abutted) to an interior surface of the front cover 106 and extend into the interior volume of the device 100.
  • the stack may include a touch sensor (e.g., a grid of capacitive, resistive, strain-based, ultrasonic, or other type of touch sensing elements), or other layers of optical, mechanical, electrical, or other types of components.
  • the touch sensor (or part of a touch sensor system) may be configured to detect a touch applied to an outer surface of the front cover 106 (e.g., to a display surface of the device 100).
  • a force sensor (or part of a force sensor system) may be positioned within the interior volume below and/or to the side of the display 104 (and in some cases within the device stack).
  • the force sensor (or force sensor system) may be triggered in response to the touch sensor detecting one or more touches on the front cover 106 (or a location or locations of one or more touches on the front cover 106), and may determine an amount of force associated with each touch, or an amount of force associated with the collection of touches as a whole.
  • the touch sensor (or touch sensor system) may be triggered in response to the force sensor detecting one or more forces on the front cover 106.
  • the force sensor may be used as (e.g., as an alternative to) a separate touch sensor.
  • the device 100 may include various other components.
  • the front of the device 100 may include one or more front-facing cameras 110 or other image capture devices (including one or more image sensors), speakers 112, microphones, or other components 114 (e.g., audio, imaging, and/or sensing components) that are configured to transmit or receive signals to/from the device 100.
  • a front-facing camera 110, alone or in combination with other sensors, may be configured to operate as a bio-authentication or facial recognition sensor.
  • the device 100 may also include various input devices, including a mechanical or virtual button 116, which may be accessible from the front surface (or display surface) of the device 100.
  • the front-facing camera 110, one or more other cameras, and/or one or more other optical emitters, optical detectors, or other optical sensors may be positioned under the display 104 instead of adjacent the display 104.
  • the camera(s), optical emitter(s), optical detector(s), or sensor(s) may emit and/or receive light through the display 104.
  • the device 100 may also include buttons or other input devices positioned along the sidewall 118 and/or on a rear surface of the device 100.
  • a volume button or multipurpose button 120 may be positioned along the sidewall 118, and in some cases may extend through an aperture in the sidewall 118.
  • the sidewall 118 may include one or more ports 122 that allow air, but not liquids, to flow into and out of the device 100.
  • one or more sensors may be positioned in or near the port(s) 122.
  • an ambient pressure sensor, ambient temperature sensor, internal/external differential pressure sensor, gas sensor, particulate matter concentration sensor, or air quality sensor may be positioned in or near a port 122.
  • the rear surface of the device 100 may include a rear-facing camera 124 or other image capture device (including one or more image sensors; see FIG. 1B).
  • a flash or light source 126 may also be positioned along the rear surface of the device 100 (e.g., near the rear-facing camera).
  • the rear surface of the device 100 may include multiple rear-facing cameras.
  • FIG. 2 shows an example embodiment of an image capture device (e.g., a camera 200), including an image sensor 202, a lens 204 or lens assembly, and a mechanical auto-focus mechanism 206.
  • the components shown in FIG. 2 may be associated with the first camera 110 or the second camera 124 shown in FIGs. 1A-1B.
  • the image sensor 202 may include a plurality of pixels, such as a plurality of pixels arranged in a two-dimensional array of pixels. Multiple ones (or all) of the pixels may each include a two-dimensional array of photodetectors (e.g., a 2x2 array of photodetectors). The photodetectors that are associated with a pixel may be electrically isolated from each other. As will be described with reference to other figures, different OCLs may be disposed over different pairs of a pixel’s photodetectors.
  • the lens 204 may be adjustable with respect to the image sensor 202, to focus an image of a scene 208 on the image sensor 202.
  • the lens 204 or lens assembly may be moved with respect to the image sensor 202 (e.g., moved to change a distance between the lens 204 or lens assembly and the image sensor 202, moved to change an angle between a plane of a lens 204 or lenses and a plane of the image sensor 202, and so on).
  • the image sensor 202 may be moved with respect to the lens 204 or lens assembly.
  • the auto-focus mechanism 206 may include (or the functions of the auto-focus mechanism 206 may be provided by) a processor in combination with a voice coil, piezoelectric element, or other actuator mechanism that moves the lens 204, lens assembly, or image sensor 202.
  • the auto-focus mechanism 206 may receive signals from the image sensor 202 and, in response to the signals, adjust a focus setting of the camera 200.
  • the signals may include PDAF information.
  • the PDAF information may include horizontal phase detection signals, vertical phase detection signals, and/or other phase detection signals.
  • the auto-focus mechanism 206 may adjust a focus setting of the camera 200 by, for example, adjusting a relationship between the image sensor 202 (or plurality of pixels) and the lens 204 or lens assembly (e.g., by adjusting a physical position of the lens 204, lens assembly, or image sensor 202). Additionally or alternatively, the processor of the auto-focus mechanism 206 may use digital image processing techniques to adjust the values output by the pixels and/or photodetectors of the image sensor 202. The values may be adjusted to digitally improve, or otherwise alter, the focus of an image of the scene 208. In some embodiments, the auto-focus mechanism 206 may be used to provide only mechanical, or only digital, focus adjustments.
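  • As a hedged sketch of how such PDAF information might drive a mechanical adjustment, a proportional update could map the measured disparity to an actuator setting (the function, gain, and normalized lens position are assumptions, not details from the patent):

```python
def autofocus_step(phase_disparity, lens_position, gain=0.05,
                   min_pos=0.0, max_pos=1.0):
    """One iteration of a simple proportional auto-focus loop.
    `phase_disparity` is the PDAF error (e.g., from a routine like
    estimate_phase_disparity above); `lens_position` is a normalized
    actuator setting. The control law is illustrative only."""
    new_pos = lens_position - gain * phase_disparity
    # Clamp the command to the actuator's travel range.
    return max(min_pos, min(max_pos, new_pos))
```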
  • Referring to FIG. 3, there is shown an example of an image 300 that may be captured by an image capture device, such as one of the cameras described with reference to FIGs. 1A-1B or 2.
  • the image 300 may include a number of objects 302, 304 having edges 306, 308 oriented in one or more directions.
  • the edges 306, 308 may include perceptible edges between objects, or edges defined by different colors or brightness levels (e.g., an edge between dark and light regions).
  • the camera may only detect a focus of one set of edges (e.g., only horizontal edges or only vertical edges).
  • the camera may detect a focus of both a first set of edges (e.g., horizontal edges) and a second set of edges (e.g., vertical edges, or edges that are orthogonal to the first set of edges).
  • the focus of the first and/or second sets of edges may be detected in the same or different image capture frames, using the same or different pixels.
  • a focus of edges in the first set of edges may be detected using a first subset of pixels configured to detect a focus of horizontal edges, in a same frame that a focus of edges in the second set of edges is detected by a second subset of pixels configured to detect a focus of vertical edges.
  • a focus of edges may be detected based on a phase difference (e.g., magnitude and polarity of the phase difference) in light captured by different photodetectors in a pair of photodetectors associated with a pixel.
  • a single pixel in a pixel array (and in some cases, some or each of the pixels in the pixel array, or each of the pixels in a subset of pixels in the pixel array) may be configured to produce a signal usable for detecting the focus of a horizontal edge or a vertical edge.
  • all of the pixels in a pixel array (or all of the pixels used to capture a particular image) may be employed in the detection of edge focus information for an image.
  • FIG. 4 shows a plan view of one example of an image sensor 400, such as an image sensor associated with one of the image capture devices or cameras described with reference to FIGs. 1A-1B and 2.
  • the image sensor 400 may include an image processor 402 and an imaging area 404.
  • the imaging area 404 may be implemented as a pixel array that includes a plurality of pixels 406.
  • the pixels 406 may be same colored pixels (e.g., for a monochrome imaging area 404) or differently colored pixels (e.g., for a multi-color imaging area 404).
  • the pixels 406 are arranged in rows and columns. However, other embodiments are not limited to this configuration.
  • the pixels in a pixel array may be arranged in any suitable configuration, such as, for example, a hexagonal configuration.
  • the imaging area 404 may be in communication with a column select circuit 408 through one or more column select lines 410, and with a row select circuit 412 through one or more row select lines 414.
  • the row select circuit 412 may selectively activate a particular pixel 406 or group of pixels, such as all of the pixels 406 in a certain row.
  • the column select circuit 408 may selectively receive the data output from a selected pixel 406 or group of pixels 406 (e.g., all of the pixels in a particular row).
  • the row select circuit 412 and/or column select circuit 408 may be in communication with an image processor 402.
  • the image processor 402 may process data from the pixels 406 and provide that data to another processor (e.g., a system processor) and/or other components of a device (e.g., other components of the electronic device 100). In some embodiments, the image processor 402 may be incorporated into the system.
  • the image processor 402 may also receive focus information (e.g., PDAF information) from some or all of the pixels, and may perform a focusing operation for the image sensor 400. In some examples, the image processor 402 may perform one or more of the operations performed by the auto-focus mechanism described with reference to FIG. 2.
  • FIG. 5 shows an example imaging area (e.g., a plan view) of a pixel 500 in an image capture device, such as a pixel included in an image sensor associated with one of the image capture devices or cameras described with reference to FIGs. 1A-1B and 2, or a pixel included in the image sensor described with reference to FIG. 4.
  • some pixels in an image sensor, each pixel in an image sensor, or each pixel in a subset of pixels in an image sensor may be configured as shown in FIG. 5.
  • the imaging area of the pixel 500 includes a two-dimensional array of photodetectors 502.
  • the imaging area may include a 2x2 array of photodetectors (e.g., a set of photodetectors 502 arranged in two rows and two columns).
  • the array may include a first photodetector 502a and a second photodetector 502b arranged in a first row, and a third photodetector 502c and a fourth photodetector 502d arranged in a second row.
  • the first photodetector 502a and the third photodetector 502c may be arranged in a first column, and the second photodetector 502b and the fourth photodetector 502d may be arranged in a second column.
  • Each photodetector 502 may be electrically isolated from each other photodetector 502 (e.g., by implant isolation or physical trench isolation).
  • a first 1x2 OCL 504a may be disposed over two of the photodetectors in the pixel 500 (e.g., over the first photodetector 502a and the second photodetector 502b).
  • a second 1x2 OCL 504b may be disposed over the remaining two of the photodetectors in the pixel 500 (e.g., over the third photodetector 502c and the fourth photodetector 502d).
  • An optional single-piece or multi-piece filter element may be disposed over the array of photodetectors 502 (e.g., over the first photodetector 502a, the second photodetector 502b, the third photodetector 502c, and the fourth photodetector 502d).
  • the filter element may be applied to an interior or exterior of each 1x2 OCL 504a, 504b.
  • each OCL 504a, 504b may be tinted to provide the filter element.
  • each of the photodetectors may be separately encapsulated under the OCLs 504a, 504b, and the filter element may be applied to or in the encapsulant.
  • a filter element may be positioned between the array of photodetectors 502 and the OCLs 504a, 504b (although other configurations of the filter element may also be considered as being disposed “between” the photodetectors 502 and the OCLs 504a, 504b).
  • the photodetectors 502 may be connected to a shared readout circuit (i.e., a readout circuit shared by all of the photodetectors 502 associated with the pixel 500).
  • a set of charge transfer transistors may be operable to connect the photodetectors 502 to the shared readout circuit. For example, each charge transfer transistor in the set may be operable (e.g., by a processor) to connect a respective one of the photodetectors 502 to, and disconnect it from, the shared readout circuit. Alternatively, a charge transfer transistor may be statically configured to connect/disconnect a pair of the photodetectors 502 (e.g., a pair of photodetectors 502 under a common 1x2 OCL, or a pair of photodetectors 502 disposed along a direction that is orthogonal to each of the first and second OCLs 504a, 504b) to/from the shared readout circuit.
  • each charge transfer transistor may be operated individually, or pairs (or all) of the charge transfer transistors may be operated contemporaneously.
  • FIG. 6 shows an example cross-section of the pixel 500 shown in FIG. 5.
  • the cross-section is taken along line VI-VI, through the first row of photodetectors 502a, 502b shown in FIG. 5.
  • a cross-section taken through the second row of photodetectors 502c, 502d shown in FIG. 5 (not shown) may be configured similarly to the cross-section shown in FIG. 6.
  • the first and second photodetectors 502a, 502b may be formed in a substrate 602.
  • the substrate 602 may include a semiconductor-based material, such as, but not limited to, silicon, silicon-on-insulator (SOI), silicon-on-sapphire (SOS), doped and undoped semiconductor regions, epitaxial layers formed on a semiconductor substrate, well regions or buried layers formed in a semiconductor substrate, or other semiconductor structures.
  • the 1x2 OCL 504a may be disposed over part or all of both of the photodetectors 502a and 502b.
  • the OCL 504a may be formed of any material or combination of materials that is translucent to at least one wavelength of light.
  • the OCL 504a may have a light-receiving side 612 opposite the array of photodetectors 502.
  • the light-receiving side 612 of the OCL 504a may include a central portion 608 and a peripheral portion 610.
  • the peripheral portion 610 may be configured to redirect at least a portion of light incident on the peripheral portion (e.g., the light 606a or light 606c) toward a corresponding peripheral portion of the imaging area that includes the photodetectors 502 (e.g., the light 606a may be redirected toward the photodetector 502a, and the light 606c may be redirected toward the photodetector 502b).
  • the OCL 504a may have a convex-shaped or dome-shaped light-receiving surface (or exterior surface).
  • the OCL 504a may be configured to focus incident light 606 received from different angles on different ones or both of the photodetectors 502a, 502b.
  • light 606a incident on the OCL 504a from a left side approach angle may be focused more (or solely) on the left side photodetector 502a, and thus the left side photodetector 502a may accumulate more charge than the right side photodetector 502b, making the signal response of the left side photodetector 502a greater than the signal response of the right side photodetector 502b.
  • light 606c incident on the OCL 504a from a right side approach angle may be focused more (or solely) on the right side photodetector 502b, and thus the right side photodetector 502b may accumulate more charge than the left side photodetector 502a, making the signal response of the right side photodetector 502b greater than the signal response of the left side photodetector 502a.
  • Light 606b incident on the OCL 504a from the front center (or top) of the OCL 504a may be focused on both of the photodetectors 502a, 502b, making the signal response of the left and right side photodetectors 502a, 502b about equal.
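  • The angular selectivity described above can be illustrated with a toy model (the Gaussian responses and their width are assumptions for illustration; the patent specifies no such parameterization):

```python
import math

def split_response(angle_deg, sigma=20.0):
    """Toy model of how a 1x2 OCL divides light between its left and
    right photodetectors as a function of incidence angle. Negative
    angles favor the left photodetector, positive angles the right,
    and normal incidence splits roughly evenly."""
    left = math.exp(-((angle_deg + sigma) ** 2) / (2 * sigma ** 2))
    right = math.exp(-((angle_deg - sigma) ** 2) / (2 * sigma ** 2))
    total = left + right
    return left / total, right / total
```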
  • An optional same color filter element 604 (e.g., a red filter, a blue filter, a green filter, or the like) may be disposed over each (or both) of the photodetectors 502a, 502b (as well as the photodetectors 502c and 502d, not shown).
  • FIG. 7 shows a simplified schematic of a pixel 700 usable in an image sensor. The pixel 700 may be an example of a pixel included in an image sensor associated with one of the image capture devices or cameras described with reference to FIGs. 1A-1B and 2, or a pixel included in the image sensor described with reference to FIG. 4, or the pixel described with reference to FIGs. 5-6.
  • some pixels in an image sensor, each pixel in an image sensor, or each pixel in a subset of pixels in an image sensor may be configured as shown in FIG. 7, and the shared readout circuit 704 for the pixel 700 may be part of an overall pixel readout circuit for an image sensor.
  • the pixel 700 may include a two-dimensional array of photodetectors 702, with each photodetector 702 being selectively connectable to (and disconnectable from) the shared readout circuit 704 by a respective charge transfer transistor in a set of charge transfer transistors 706.
  • the two-dimensional array of photodetectors 702 may include a 2x2 array of photodetectors (e.g., an array of photodetectors 702 arranged in two rows and two columns).
  • the array may include a first photodetector 702a (PD_TL) and a second photodetector 702b (PD_TR) arranged in a first row, and a third photodetector 702c (PD_BL) and a fourth photodetector 702d (PD_BR) arranged in a second row.
  • the first photodetector 702a and the third photodetector 702c may be arranged in a first column, and the second photodetector 702b and the fourth photodetector 702d may be arranged in a second column.
  • the photodetectors 702 may be disposed (positioned) in a 2x2 array under a pair of adjacent 1x2 OCLs.
  • the shared readout circuit 704 may include a sense region 708, a reset (RST) transistor 710, a readout transistor 712, and a row select (RS) transistor 714.
  • the sense region 708 may include a capacitor that temporarily stores charge received from one or more of the photodetectors 702. As described below, charge accumulated by one or more of the photodetectors 702 may be transferred to the sense region 708 by applying a drive signal (e.g., a gate voltage) to one or more of the charge transfer transistors 706. The transferred charge may be stored in the sense region 708 until a drive signal applied to the reset (RST) transistor 710 is pulsed.
  • Each of the charge transfer transistors 706 may have one terminal connected to a respective one of the photodetectors 702 and another terminal connected to the sense region 708.
  • One terminal of the reset transistor 710 and one terminal of the readout transistor 712 may be connected to a supply voltage (e.g., VDD) 720.
  • the other terminal of the reset transistor 710 may be connected to the sense region 708, while the other terminal of the readout transistor 712 may be connected to a terminal of the row select transistor 714.
  • the other terminal of the row select transistor 714 may be connected to an output line 716.
  • each of the photodetectors 702 may be implemented as a photodiode (PD) or pinned photodiode
  • the sense region 708 may be implemented as a floating diffusion (FD) node
  • the readout transistor 712 may be implemented as a source follower (SF) transistor.
  • the photodetectors 702 may be electron-based photodiodes or hole-based photodiodes.
  • the term photodetector is used herein to refer to substantially any type of photon or light detecting component, such as a photodiode, pinned photodiode, photogate, or other photon sensitive region.
  • the term sense region, as used herein, is meant to encompass substantially any type of charge storing or charge converting region.
  • the pixel 700 may be implemented using additional or different components.
  • the row select transistor 714 may be omitted and a pulsed power supply may be used to select the pixel.
  • an integration period for the pixel begins and the photodetectors 702 accumulate photo-generated charge in response to incident light.
  • after the integration period ends, the accumulated charge in some or all of the photodetectors 702 may be transferred to the sense region 708 by sequentially or simultaneously applying drive signals to (e.g., by pulsing gate voltages of) the charge transfer transistors 706.
  • the reset transistor 710 is used to reset the voltage on the sense region 708 to a predetermined level prior to the transfer of charge from a set of one or more photodetectors 702 to the sense region 708.
  • a drive signal may be applied to the row select transistor 714 (e.g., a gate voltage of the row select transistor 714 may be pulsed) via a row select line 718 coupled to row select circuitry, and charge from one, two, or any number of the photodetectors 702 may be read out over an output line 716 coupled to column select circuitry.
  • the readout transistor 712 senses the voltage on the sense region 708, and the row select transistor 714 transfers an indication of the voltage to the output line 716.
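  • A rough software-level sketch of this reset/transfer/readout sequence follows (the `pixel` object and its method names are hypothetical stand-ins for the drive signals described above):

```python
def read_summed_charge(pixel, transfer_gates):
    """Illustrative drive sequence for the shared readout circuit of
    FIG. 7; not an actual driver."""
    pixel.pulse("RST")             # reset the sense region (floating diffusion)
    reset_level = pixel.sample()   # optional reference sample
    for gate in transfer_gates:    # e.g., ("TX_A", "TX_B") to sum the first row
        pixel.pulse(gate)          # transfer accumulated charge to the sense region
    pixel.pulse("RS")              # row select couples the source follower to the output line
    signal_level = pixel.sample()  # column circuitry samples the output line
    return reset_level - signal_level
```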
  • the column select circuitry may be coupled to an image processor, auto-focus mechanism, or combination thereof.
  • a processor may be configured to operate the set of charge transfer transistors 706 to simultaneously transfer charge from multiple photodetectors 702 (e.g., a pair of photodetectors) to the sense region 708 or floating diffusion node.
  • the gates of the first and second charge transfer transistors 706a (TX_A) and 706b (TX_B) (i.e., the charge transfer transistors of the first row) may be simultaneously driven to transfer charges accumulated by the first and second photodetectors 702a, 702b to the sense region 708, where the charges may be summed.
  • the gates of third and fourth charge transfer transistors 706c (TX_C) and 706d (TX_D) may be simultaneously driven to transfer charges accumulated by the third and fourth photodetectors 702c, 702d to the sense region 708, where the charges may be summed.
  • This summed charge may also be read out of the pixel 700.
  • Similarly, the gates of the first and third charge transfer transistors 706a and 706c (i.e., the charge transfer transistors of the first column) may be simultaneously driven to transfer charges accumulated by the first and third photodetectors 702a, 702c to the sense region 708, where the charges may be summed.
  • the gates of the second and fourth charge transfer transistors 706b and 706d may be simultaneously driven to transfer charges accumulated by the second and fourth photodetectors 702b, 702d to the sense region 708. This charge may also be read out of the pixel 700. Additionally or alternatively, charge accumulated by the photodetectors 702 may be read out of the pixel 700 individually, or charges accumulated by any combination (including all) of the photodetectors 702 may be read out of the pixel 700 together, or charges accumulated by the photodetectors 702 along a left- or right-sloping diagonal may be read out of the pixel 700 together.
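  • The gate combinations described in the preceding paragraphs might be summarized as follows (an illustrative summary only; the gate names follow FIG. 7, but the enum itself is not part of the patent):

```python
from enum import Enum

class TransferCombo(Enum):
    """Transfer gates that may be driven simultaneously, and the
    photodetector charges each combination sums."""
    FIRST_ROW = ("TX_A", "TX_B")      # sums 702a + 702b
    SECOND_ROW = ("TX_C", "TX_D")     # sums 702c + 702d
    FIRST_COLUMN = ("TX_A", "TX_C")   # sums 702a + 702c
    SECOND_COLUMN = ("TX_B", "TX_D")  # sums 702b + 702d
    ALL_FOUR = ("TX_A", "TX_B", "TX_C", "TX_D")  # full 2x2 binning
```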
  • the charges may be summed in various ways, or a processor may interpolate between the values read out of the photodetectors in different pixels of a pixel array (e.g., perform a de-mosaicing operation) to generate an image having an effective 4x resolution for the pixel array.
  • a shared readout circuit per pixel may be configured differently for different pixels in a pixel array.
  • a single charge transfer transistor may be coupled to a pair of photodetectors, and may be operated by a processor to simultaneously read out, and sum, the charges integrated by the pair of photodetectors.
  • a single charge transfer transistor could replace both of the charge transfer transistors 706a and 706b and connect both of the photodetectors 702a and 702b to the shared readout circuit 704, and another charge transfer transistor could replace both of the charge transfer transistors 706c and 706d and connect both of the photodetectors 702c and 702d to the shared readout circuit 704.
  • a single charge transfer transistor could replace both of the charge transfer transistors 706a and 706c and connect both of the photodetectors 702a and 702c to the shared readout circuit 704, and another charge transfer transistor could replace both of the charge transfer transistors 706b and 706d and connect both of the photodetectors 702b and 702d to the shared readout circuit 704.
  • when an image capture device (such as a camera) is used to capture an image, the photodetectors 702 may have to be reset or depleted of charge before the image is captured (e.g., by applying drive signals (e.g., gate voltages) to the reset transistor 710 and charge transfer transistors 706). After the charge from the photodetectors 702 has been depleted, the charge transfer transistors 706 and reset transistor 710 may be turned off to isolate the photodetectors 702 from the shared readout circuit 704. The photodetectors 702 can then accumulate photon-generated charge during a charge integration period.
  • FIGs. 8A and 8B each show an array of pixels 800 (e.g., a 2x2 array of pixels).
  • the array of pixels 800 may represent a portion of a much larger array of pixels, such as an array of millions of pixels included in an image sensor.
  • the array of pixels 800 may be included in an image sensor associated with one of the image capture devices or cameras described with reference to FIGs. 1A-1B and 2, or a set of pixels included in the image sensor described with reference to FIG. 4.
  • each pixel in the array of pixels 800 may be configured similarly to the pixel described with reference to FIGs. 5-6 and, in some cases, each pixel may be associated with an instance of the shared readout circuit described with reference to FIG. 7.
  • the array of pixels 800 includes a red pixel 800a, first and second green pixels 800b, 800c, and a blue pixel 800d arranged in a Bayer pattern.
  • the Bayer pattern may be achieved by disposing a color filter array over the array of pixels 800.
  • different subsets of filter elements in the color filter array may be disposed over different subsets of pixels in the array of pixels 800, with each subset of filter elements having a different color (e.g., red filter elements 802a, green filter elements 802b, or blue filter elements 802c).
  • the different subsets of filter elements may be associated with different colors (e.g., cyan, yellow, and magenta filter elements; cyan, yellow, green, and magenta filter elements; red, green, blue, and white filter elements; and so on).
  • a color filter array may not be provided, or all of the filter elements in the color filter array may have the same color.
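  • For illustration, a Bayer filter assignment matching FIG. 8A (red and blue on one diagonal, green on the other) can be expressed as a simple mapping; the pattern phase chosen here is arbitrary:

```python
def bayer_color(pixel_row, pixel_col):
    """Color of the filter element over a pixel in a Bayer pattern,
    with red at the top-left as in FIG. 8A. Illustrative only."""
    if pixel_row % 2 == 0:
        return "red" if pixel_col % 2 == 0 else "green"
    return "green" if pixel_col % 2 == 0 else "blue"
```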
  • Each pixel 800a, 800b, 800c, 800d may include a two-dimensional array of photodetectors 804.
  • each pixel 800a, 800b, 800c, 800d may include a first photodetector 804a and a second photodetector 804b arranged in a first row, and a third photodetector 804c and a fourth photodetector 804d arranged in a second row.
  • the first photodetector 804a and the third photodetector 804c may be arranged in a first column, and the second photodetector 804b and the fourth photodetector 804d may be arranged in a second column.
  • Each photodetector 804a, 804b, 804c, 804d may be electrically isolated from each other photodetector 804a, 804b, 804c, 804d (e.g., by implant isolation or physical trench isolation).
  • An array of 1x2 OCLs 806 may be disposed over the array of pixels 800, with a pair of adjacent 1x2 OCLs 806a, 806b disposed over each pixel 800a, 800b, 800c, 800d.
  • the pair of adjacent OCLs 806 may include a first 1x2 OCL 806a disposed over two adjacent photodetectors 804 (e.g., over the first and second photodetectors 804a, 804b), and a second 1x2 OCL 806b disposed over two other adjacent photodetectors 804 (e.g., over the third and fourth photodetectors 804c, 804d).
  • the photodetectors 804a, 804b, 804c, 804d of a pixel 800a, 800b, 800c, or 800d may be connected to a shared readout circuit (i.e., a readout circuit shared by all of the photodetectors associated with the pixel, as described, for example, with reference to FIG. 7).
  • a set of charge transfer transistors may be operable to connect the photodetectors 804a, 804b, 804c, 804d to the shared readout circuit (e.g., each charge transfer transistor in the set may be operable (e.g., by a processor) to connect a respective one of the photodetectors to, and disconnect the respective one of the photodetectors from, the shared readout circuit).
  • each charge transfer transistor may be operated individually.
  • pairs (or all) of the charge transfer transistors may be operated contemporaneously.
  • all of the 1x2 OCLs 806 in the array of 1x2 OCLs may have a same orientation (e.g., a horizontal orientation, as shown; or alternatively, a vertical orientation).
  • the 1x2 OCLs disposed over alternating Bayer pattern rows, alternating Bayer pattern columns, or alternating 2x2 sets of Bayer pattern pixels may have different orientations.
  • FIG. 8A shows each of the 1x2 OCLs 806 as having a similarly shaped and sized oval perimeter.
  • Each of the 1x2 OCLs 806 may also have a similar curvature (i.e., curvature perpendicular to the plan view shown in FIG. 8A).
  • each of the 1x2 OCLs 806 may have a perimeter that is generally rectangular, or a perimeter having a different or non-symmetric shape. All of these perimeters may be referred to herein as oblong perimeters, providing oblong OCLs (e.g., 1x2 OCLs 806).
  • different 1x2 OCLs 806 may have somewhat different shapes as a result of manufacturing variance.
  • each (or all) of the 1x2 OCLs 806 in the array of 1x2 OCLs 806 may have the same or different shape, size, or curvature.
  • an OCL of any shape that extends over two adjacent photodetectors is considered a 1x2 OCL (and is also considered an oblong OCL).
  • FIG. 8A shows each of the 1x2 OCLs 806 to have a focus area 808 (e.g., the first 1x2 OCL 806a has a first focus area 808a, and the second 1x2 OCL 806b has a second focus area 808b).
  • the focus area 808 of each 1x2 OCL is generally aligned with the perimeter of the OCL 806.
  • FIG. 8A shows that the perimeter of each 1x2 OCL 806 is generally aligned with respect to the pair of photodetectors over which it is disposed.
  • the first photodetector 804a has a first centroid 810a
  • the second photodetector 804b has a second centroid 810b
  • the first 1x2 OCL 806a has a centroid 812 that is disposed in line with, and centered between, the first centroid 810a and the second centroid 810b.
  • the focus area 808 of each 1x2 OCL 806 is generally aligned with the perimeter of the 1x2 OCL 806, and generally aligned with respect to the pair of photodetectors over which it is disposed.
  • the first 1x2 OCL 806a has a focus centroid 814 that is disposed in line with, and centered between, the first centroid 810a and the second centroid 810b.
  • the centroid of a 1x2 OCL may not be aligned with respect to the pair of photodetectors over which it is disposed, and/or the focus centroid of a 1x2 OCL (e.g., the focus centroid 814 of the focus area 808a of the first 1x2 OCL 806a) may not be aligned with respect to the pair of photodetectors over which it is disposed.
  • the focus area 808a of the first 1x2 OCL 806a may not be aligned with respect to the first and second photodetectors 804a, 804b (e.g., the focus centroid 814 of the first 1x2 OCL 806a is not aligned with, and is not centered between, the first centroid 810a and the second centroid 810b).
  • the focus centroid 814 may not be aligned with, but may be centered between, the first centroid 810a and the second centroid 810b; or, the focus centroid 814 may be aligned with, but not be centered between, the first centroid 810a and the second centroid 810b.
  • the centroid of a 1x2 OCL may not be aligned with respect to the photodetectors over which it is disposed (e.g., because of manufacturing variances that affect the placement or shape of the perimeter of the 1x2 OCL).
  • FIGs. 9A-9D show an imaging area 900 of an image capture device (e.g., an image sensor), in which the pixels 902 of the image capture device are arranged in accordance with a Bayer pattern (i.e., a 2x2 pattern including red pixels and blue pixels along one diagonal, and green pixels along the other diagonal).
  • the pixels 902 may be arranged in Bayer pattern rows 906a, 906b and Bayer pattern columns 908a, 908b. More generally, the pixels 902 may be arranged in rows extending in a first dimension, and in columns extending in a second dimension orthogonal to the first dimension.
  • FIGs. 9A-9D show each pixel 902 as having a 2x2 array of photodetectors 904, as described, for example, with reference to FIGs. 5-8B.
  • the imaging area 900 may be an example of an imaging area of an image sensor associated with one of the image capture devices or cameras described with reference to FIGs. 1A-1B and 2, or the imaging area of the image sensor described with reference to FIG. 4, or an imaging area including a plurality of the pixels described with reference to any of FIGs. 5-8B.
  • An array of 1x2 OCLs 910 is disposed over the entirety of the imaging area 900, with a pair of adjacent 1x2 OCLs 910 disposed over each pixel 902, and with each 1x2 OCL 910 being disposed over a pair of adjacent photodetectors 904.
  • the pixels 902 in all of the Bayer pattern rows 906a, 906b of the imaging area 900 are configured (or are operable) to detect a phase difference (e.g., an out-of-focus condition) in a first set of edges of an image (e.g., vertical edges).
  • Each 1x2 OCL 910 in the array of 1x2 OCLs 910 has a same orientation, with its longer dimension extending parallel to the Bayer pattern rows 906a, 906b.
  • the pixels 902 in all of the Bayer pattern columns 908a, 908b of the imaging area 900 are configured (or are operable) to detect a phase difference (e.g., an out-of-focus condition) in a second set of edges of an image (e.g., horizontal edges).
  • Each 1x2 OCL 910 in the array of 1x2 OCLs 910 has a same orientation, with its longer dimension extending parallel to the Bayer pattern columns 908a, 908b.
  • the pixels 902 in the first Bayer pattern row 906a of the imaging area 900 are configured (or are operable) to detect a phase difference (e.g., an out-of-focus condition) in a first set of edges of an image (e.g., vertical edges), and the pixels 902 in the second Bayer pattern row 906b are configured (or are operable) to detect a phase difference in a second set of edges of the image (e.g., horizontal edges, or edges that are otherwise orthogonal to the first set of edges).
  • the 1x2 OCLs 910 in the array of 1x2 OCLs 910 have different orientations, with a first subset of 1x2 OCLs 910 in the array of 1x2 OCLs 910 having a first orientation (e.g., with its longer dimension extending parallel to the Bayer pattern rows 906a, 906b), and a second subset of 1x2 OCLs 910 in the array of 1x2 OCLs 910 having a second orientation, orthogonal to the first orientation (e.g., with its longer dimension extending parallel to the Bayer pattern columns 908a, 908b).
  • the first subset of 1x2 OCLs 910 may be disposed over a first set of rows of pixels 902 (e.g., over the Bayer pattern row 906a, or over interspersed rows (e.g., interspersed Bayer pattern rows 906) when the imaging area 900 includes more Bayer pattern rows than are shown).
  • the second subset of 1x2 OCLs 910 may be disposed over a second set of rows of pixels 902 (e.g., over the Bayer pattern row 906b, or over interspersed rows (e.g., interspersed Bayer pattern rows 906) when the imaging area 900 includes more Bayer pattern rows than are shown).
  • the first and second subsets of 1x2 OCLs 910 may be disposed over interspersed columns, such as interspersed Bayer pattern columns.
  • the pixels 902 in a first lattice of pixels are configured (or are operable) to detect a phase difference (e.g., an out-of-focus condition) in a first set of edges of an image (e.g., vertical edges), and the pixels 902 in a second lattice of pixels are configured (or are operable) to detect a phase difference in a second set of edges of the image (e.g., horizontal edges, or edges that are otherwise orthogonal to the first set of edges).
  • the lattices of pixels may be overlapping checkerboard lattices of pixels, or overlapping checkerboard lattices of Bayer pattern sets of pixels (e.g., each segment of each lattice may be a 2x2 array of pixels 902).
  • the 1x2 OCLs 910 in the array of 1x2 OCLs 910 have different orientations, with a first subset of 1x2 OCLs 910 in the array of 1x2 OCLs 910 having a first orientation (e.g., with its longer dimension extending parallel to the Bayer pattern rows 906a, 906b), and a second subset of 1x2 OCLs 910 in the array of 1x2 OCLs 910 having a second orientation, orthogonal to the first orientation (e.g., with its longer dimension extending parallel to the Bayer pattern columns 908a, 908b).
  • the first subset of 1x2 OCLs 910 may be disposed over the first lattice of pixels 902
  • the second subset of 1x2 OCLs 910 may be disposed over the second lattice of pixels 902.
  • At least some of the 1x2 OCLs 910 in the first subset of 1x2 OCLs 910 and at least some of the 1x2 OCLs 910 in the second subset of 1x2 OCLs 910 are disposed over a subset of pixels disposed under a same-colored subset of filter elements.
  • PDAF information for detecting the focus of two orthogonal sets of edges may be collected from pixels 902 having the same color.
  • FIG. 10 shows a sample electrical block diagram of an electronic device 1000, which may be the electronic device described with reference to FIGs. 1A-1B, 2, 4, and so on.
  • the electronic device 1000 may include a display 1002 (e.g., a light-emitting display), a processor 1004, a power source 1006, a memory 1008 or storage device, a sensor 1010, and an input/output (I/O) mechanism 1012 (e.g., an input/output device and/or input/output port).
  • the processor 1004 may control some or all of the operations of the electronic device 1000.
  • the processor 1004 may communicate, either directly or indirectly, with substantially all of the components of the electronic device 1000.
  • a system bus or other communication mechanism 1014 may provide communication between the processor 1004, the power source 1006, the memory 1008, the sensor 1010, and/or the input/output mechanism 1012.
  • the processor 1004 may be implemented as any electronic device capable of processing, receiving, or transmitting data or instructions.
  • the processor 1004 may be a microprocessor, a central processing unit (CPU), an application-specific integrated circuit (ASIC), a digital signal processor (DSP), or combinations of such devices.
  • the term “processor” is meant to encompass a single processor or processing unit, multiple processors, multiple processing units, or other suitably configured computing element or elements.
  • components of the electronic device 1000 may be controlled by multiple processors. For example, select components of the electronic device 1000 may be controlled by a first processor and other components of the electronic device 1000 may be controlled by a second processor, where the first and second processors may or may not be in communication with each other.
  • the power source 1006 may be implemented with any device capable of providing energy to the electronic device 1000.
  • the power source 1006 may be one or more batteries or rechargeable batteries.
  • the power source 1006 may be a power connector or power cord that connects the electronic device 1000 to another power source, such as a wall outlet.
  • the memory 1008 may store electronic data that may be used by the electronic device 1000.
  • the memory 1008 may store electrical data or content such as, for example, audio and video files, documents and applications, device settings and user preferences, timing signals, control signals, data structures or databases, image data, or focus settings.
  • the memory 1008 may be configured as any type of memory.
  • the memory 1008 may be implemented as random access memory, read-only memory, Flash memory, removable memory, other types of storage elements, or combinations of such devices.
  • the electronic device 1000 may also include one or more sensors 1010 positioned substantially anywhere on the electronic device 1000.
  • the sensor(s) 1010 may be configured to sense substantially any type of characteristic, such as, but not limited to, pressure, light, touch, heat, movement, relative motion, biometric data, and so on.
  • the sensor(s) 1010 may include a heat sensor, a position sensor, a light or optical sensor, an accelerometer, a pressure transducer, a gyroscope, a magnetometer, a health monitoring sensor, and so on.
  • the one or more sensors 1010 may utilize any suitable sensing technology, including, but not limited to, capacitive, ultrasonic, resistive, optical, ultrasound, piezoelectric, and thermal sensing technology.
  • the I/O mechanism 1012 may transmit and/or receive data from a user or another electronic device.
  • An I/O device may include a display, a touch sensing input surface such as a track pad, one or more buttons (e.g., a graphical user interface “home” button), one or more cameras, one or more microphones or speakers, one or more ports such as a microphone port, and/or a keyboard. Additionally or alternatively, an I/O device or port may transmit electronic signals via a communications network, such as a wireless and/or wired network connection. Examples of wireless and wired network connections include, but are not limited to, cellular, Wi-Fi, Bluetooth, IR, and Ethernet connections.
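
As a concrete companion to the arrangement summarized above, the following minimal Python sketch estimates the one-dimensional phase disparity that a row (or column) of paired photodetector signals could report. It is illustrative only: the function name, the sum-of-absolute-differences cost, and the toy edge profile are assumptions, not details taken from this application. For horizontally oriented 1x2 OCLs (FIG. 9A), sig_a and sig_b would hold left- and right-photodetector samples along a row (sensitive to vertical edges); for vertically oriented OCLs (FIG. 9B), they would hold top and bottom samples along a column (sensitive to horizontal edges).

    import numpy as np

    def disparity_1d(sig_a, sig_b, max_shift=8):
        # Return the shift (in photodetector pitches) that best aligns
        # sig_b to sig_a. Zero suggests the sampled edge is in focus;
        # the magnitude and sign of a nonzero result indicate how far,
        # and in which direction, focus is off.
        best_shift, best_cost = 0, float("inf")
        for s in range(-max_shift, max_shift + 1):
            a = sig_a[max(0, s): len(sig_a) + min(0, s)]
            b = sig_b[max(0, -s): len(sig_b) + min(0, -s)]
            cost = np.mean(np.abs(a - b))  # sum-of-absolute-differences
            if cost < best_cost:
                best_shift, best_cost = s, cost
        return best_shift

    # Toy vertical-edge profile and a displaced (defocused) copy of it.
    left = np.concatenate([np.zeros(20), np.ones(20)])
    right = np.roll(left, 3)
    print(disparity_1d(left, right))  # prints -3 (out of focus)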

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Spectroscopy & Molecular Physics (AREA)
  • Solid State Image Pick-Up Elements (AREA)
  • Transforming Light Signals Into Electric Signals (AREA)

Abstract

An image capture device is described. The image capture device includes an array of pixels. Each pixel includes a 2x2 array of photodetectors. The image capture device also includes an array of 1x2 on-chip lenses (OCLs) disposed over the array of pixels. For each pixel in the array of pixels, a respective pair of adjacent 1x2 OCLs is disposed over the pixel, with each respective pair of adjacent 1x2 OCLs including a respective first 1x2 OCL disposed over a first photodetector and a second photodetector in the 2x2 array of photodetectors for the pixel, and a respective second 1x2 OCL disposed over a third photodetector and a fourth photodetector in the 2x2 array of photodetectors for the pixel.

Description

IMAGE CAPTURE DEVICES HAVING PHASE DETECTION AUTO-FOCUS PIXELS
CROSS-REFERENCE TO RELATED APPLICATIONS
[0001] This Patent Cooperation Treaty patent application claims priority to U.S. Nonprovisional Patent Application No. 17/380,852, filed July 20, 2021, and titled “Image Capture Devices Having Phase Detection Auto-Focus Pixels,” and U.S. Provisional Patent Application No. 63/061,074, filed August 4, 2020, and titled “Image Capture Devices Having Phase Detection Auto-Focus Pixels,” the contents of which are incorporated herein by reference in their entirety.
FIELD
[0002] The described embodiments relate generally to devices having a camera or other image capture device. More particularly, the described embodiments relate to an image capture device having phase detection auto-focus (PDAF) pixels.
BACKGROUND
[0003] Digital cameras and other image capture devices use an image sensor, such as a complementary metal-oxide-semiconductor (CMOS) image sensor or a charge-coupled device (CCD) image sensor, to capture an image. In some cases, a camera or other image capture device may include multiple image sensors, with the different image sensors having adjacent or interlaced arrays of pixels.
[0004] Many cameras and other image capture devices include one or more optical components (e.g., a lens or lens assembly) that are configurable to focus light, received or reflected from an image, onto the surface of an image sensor. Before or while capturing an image, the distance between the optical component(s) and image sensor (or a tilt or other parameters of the optical components or image sensor) may be adjusted to focus an image onto the image sensor. In some cases, macro (or rough) focusing may be performed for an image sensor prior to capturing an image using the image sensor (e.g., using a macro focus mechanism adjacent the image sensor). Micro (or fine) focusing can then be performed after acquiring one or more images using the image sensor. In other cases, all focusing may be performed prior to capturing an image (e.g., by adjusting one or more relationships between a lens, lens assembly, or image sensor); or all focusing may be performed after acquiring an image (e.g., by adjusting pixel values using one or more digital image processing algorithms). Many cameras and other image capture devices perform focusing operations frequently, and in some cases before and/or after the capture of each image capture frame.

[0005] Focusing an image onto an image sensor often entails identifying a perceptible edge between objects, or an edge defined by different colors or brightness (e.g., an edge between dark and light regions), and making adjustments to a lens, lens assembly, image sensor, or pixel value(s) to bring the edge into focus.
SUMMARY
[0006] Embodiments of the systems, devices, methods, and apparatus described in the present disclosure are directed to an image capture device having PDAF pixels.
[0007] In a first aspect, the present disclosure describes an image capture device. The image capture device may include an array of pixels, and an array of 1x2 on-chip lenses (OCLs) disposed over the array of pixels. Each pixel may include a 2x2 array of photodetectors. For each pixel in the array of pixels, a respective pair of adjacent 1x2 OCLs may be disposed over a pixel. Each respective pair of adjacent 1x2 OCLs may include a respective first 1x2 OCL disposed over a first photodetector and a second photodetector in the 2x2 array of photodetectors for the pixel, and a respective second 1x2 OCL disposed over a third photodetector and a fourth photodetector in the 2x2 array of photodetectors for the pixel.
[0008] In another aspect, the present disclosure describes another image capture device. The image capture device may include an array of photodetectors, and an array of oblong OCLs disposed over the array of photodetectors. A different pair of photodetectors in the array of photodetectors may be disposed under each of the oblong OCLs, and every photodetector in the array of photodetectors may be disposed under a respective one of the oblong OCLs in the array of oblong OCLs.
[0009] In addition to the exemplary aspects and embodiments described above, further aspects and embodiments will become apparent by reference to the drawings and by study of the following description.
BRIEF DESCRIPTION OF THE DRAWINGS
[0010] The disclosure will be readily understood by the following detailed description in conjunction with the accompanying drawings, wherein like reference numerals designate like structural elements, and in which:
[0011] FIGs. 1A and 1B show an example of a device that may include one or more image capture devices;

[0012] FIG. 2 shows an example embodiment of an image capture device, including an image sensor, a lens or lens assembly, and an auto-focus mechanism;
[0013] FIG. 3 shows an example of an image that may be captured by an image capture device;
[0014] FIG. 4 shows a plan view of one example of an image sensor;
[0015] FIG. 5 shows an example imaging area (e.g., a plan view) of a pixel in an image capture device;
[0016] FIG. 6 shows an example cross-section of the pixel shown in FIG. 5;
[0017] FIG. 7 shows a simplified schematic of a pixel usable in an image sensor;
[0018] FIGs. 8A and 8B show an array of pixels;
[0019] FIGs. 9A-9D show an imaging area of an image capture device, in which the pixels of the image capture device are arranged in accordance with a Bayer pattern (i.e., a 2x2 pattern including red pixels and blue pixels along one diagonal, and green pixels along the other diagonal); and
[0020] FIG. 10 shows a sample electrical block diagram of an electronic device.
[0021] The use of cross-hatching or shading in the accompanying figures is generally provided to clarify the boundaries between adjacent elements and also to facilitate legibility of the figures. Accordingly, neither the presence nor the absence of cross-hatching or shading conveys or indicates any preference or requirement for particular materials, material properties, element proportions, element dimensions, commonalities of similarly illustrated elements, or any other characteristic, attribute, or property for any element illustrated in the accompanying figures.
[0022] Additionally, it should be understood that the proportions and dimensions (either relative or absolute) of the various features and elements (and collections and groupings thereof) and the boundaries, separations, and positional relationships presented therebetween, are provided in the accompanying figures merely to facilitate an understanding of the various embodiments described herein and, accordingly, may not necessarily be presented or illustrated to scale, and are not intended to indicate any preference or requirement for an illustrated embodiment to the exclusion of embodiments described with reference thereto.
DETAILED DESCRIPTION
[0023] Reference will now be made in detail to representative embodiments illustrated in the accompanying drawings. It should be understood that the following description is not intended to limit the embodiments to one preferred embodiment. To the contrary, it is intended to cover alternatives, modifications, and equivalents as can be included within the spirit and scope of the described embodiments as defined by the appended claims.
[0024] The present disclosure relates to an image capture device that provides improved PDAF performance.
[0025] In some cases, PDAF pixels (i.e., pixels configured to collect PDAF information) may have a metal shield configuration. A metal shield pixel may include a microlens that focuses incoming light on a photodiode, which in turn converts photons into electron-hole pairs. The collected electrons (for electron collection devices) or holes (for hole collection devices) may be converted into an analog voltage through a pixel source follower (SF) transistor amplifier. The analog voltage may then be converted to a digital signal by an analog-to-digital converter (ADC). A metal shield (e.g., a Tungsten (W) or Copper/Aluminum (Cu/Al) metal shield) may cover half of the photodiode (e.g., a left half or a right half). For a left-shielded pixel, light from left-incident angles is blocked by the metal shield, and only light approaching the pixel from right-incident angles is received by the photodiode. A right-shielded pixel functions in the opposite manner. The angular sensitivity of left and right metal shield pixels can be used to generate PDAF information.
[0026] Because the signal (e.g., the analog voltage or digital signal) generated by a metal shield pixel will be much lower than the signal generated by an unshielded (or regular) pixel, metal-shielded pixels need to be treated as defective pixels, and their signals need to be corrected before being used to generate an image. To minimize the effect that signal correction may have on image quality, metal shield pixels (or pairs of left/right-shielded pixels) may be sparsely distributed over the surface of an image sensor. That is, a relatively small number of an image sensor’s pixels (e.g., 3-4%) may be configured as left- or right- shielded pixels. In one example, for every eight rows and eight columns of pixels (e.g., for every block of 64 pixels in a pixel array), one left-shielded pixel and one right-shielded pixel may be provided.
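A short sketch can make the sparse layout just described concrete. This is a hedged illustration only: the 32x32 toy frame, the in-block offsets, and the neighbor-averaging correction below are assumptions chosen for brevity, not a correction scheme prescribed by this document.

    import numpy as np

    H, W = 32, 32
    rng = np.random.default_rng(0)
    image = rng.integers(0, 1024, size=(H, W)).astype(float)  # toy raw frame

    # Place one left-shielded and one right-shielded pixel per 8x8 block.
    left_mask = np.zeros((H, W), dtype=bool)
    right_mask = np.zeros((H, W), dtype=bool)
    for by in range(0, H, 8):
        for bx in range(0, W, 8):
            left_mask[by + 2, bx + 2] = True   # offsets within the block
            right_mask[by + 2, bx + 4] = True  # are arbitrary here

    # Treat the shielded pixels as defective for imaging: replace each
    # with the mean of its row neighbors before the frame is used.
    corrected = image.copy()
    for y, x in zip(*np.nonzero(left_mask | right_mask)):
        corrected[y, x] = 0.5 * (image[y, x - 1] + image[y, x + 1])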
[0027] At a vertical edge within an image (e.g., at an edge defined by a perceptible edge between objects, or at an edge defined by different colors or brightness (e.g., an edge between dark and light regions)), left- and right-shielded pixels will have disparate signals (e.g., signals not matched in magnitude and/or polarity) when an image is not in focus, but will have well-matched signals when an image is in focus. The signals of left- and right-shielded pixels therefore provide PDAF information that can be used by an auto-focus (AF) mechanism to adjust the position of one or more optical components (e.g., a lens) or an image sensor, and thereby adjust the focus of an image on the image sensor, or to digitally adjust or compensate for an out-of-focus condition. In some cases, an image may be brought into focus based on PDAF information obtained during a single image capture frame. By analyzing PDAF information obtained during each image capture frame, images may be quickly and continuously focused on an image sensor.
[0028] In some embodiments, left- and right-shielded pixels may be fabricated without metal shields by placing both pixels adjacent one another under a single microlens. Each pixel has its own photodiode, and there may be implant isolation or physical trench isolation between the photodiodes of the two pixels. Because of the nature (e.g., curvature) of the microlens, light from left-incident angles is received mainly by the left-side pixel, and light from right-incident angles is received mainly by the right-side pixel. As a result, left- and right-side pixels placed adjacent one another under a single microlens may function similarly to left and right metal shielded pixels. In a Bayer pattern pixel configuration (i.e., a repetitive 2x2 pattern including red pixels and blue pixels along one diagonal, and green pixels along the other diagonal), one blue pixel in every 8x8 block of pixels may be replaced by a green pixel (or may be modified to function as a green pixel), so that two adjacent green pixels may be placed under a single microlens to provide PDAF information. The signals of both pixels need to be corrected before being used to generate an image.
[0029] Because the signals provided by metal shield pixels, or the signals provided by adjacent pixels under a microlens, need to be corrected before being used to generate an image, the density of such pixels may be kept low. However, this provides limited PDAF information, which in turn degrades AF performance (especially in low light conditions). To improve PDAF performance, each pixel in a pixel array may be divided into left and right sub-pixels, and PDAF information may be obtained from each pixel. Also, because all pixels are implemented in a similar manner, the sub-pixel signals for each pixel may be combined in a similar way, or signal corrections may be made to each pixel in a similar way, to increase the confidence level that pixel signals are being generated or corrected appropriately (especially in low light conditions). However, the PDAF information provided by such a pixel array (and by all pixel arrays using metal shield pixels or adjacent pixels under a single microlens) bases image focus entirely on vertical edges. For an image containing few vertical edges or more horizontal edges, or for an image acquired under a low light condition, PDAF performance may suffer. For example, it may be difficult or impossible to focus an image on an image sensor, or it may take longer than desired to focus an image on an image sensor.
[0030] In some cases, pixels in a pixel array may be configured to have a 2x2 array of sub-pixels (e.g., photodetectors) disposed under a microlens. In some embodiments, the entirety of a pixel array may incorporate such pixels. The pixels can be used, in various embodiments or configurations, to provide PDAF information based on edges having more than one orientation (e.g., vertical and horizontal edges), to improve PDAF performance (especially in low light conditions), to reduce or eliminate the need for signal correction, or to increase the resolution of an image sensor. However, the shared microlens over a 2x2 array of sub-pixels tends to reduce the signal-to-noise ratio (SNR) of the signals acquired by the sub-pixels. When the shared microlens is shifted as a result of manufacturing variance, the SNR of some sub-pixels may be reduced even further, and the SNR of each sub-pixel may differ (i.e., each of the four sub-pixels positioned under a shared microlens may have a different SNR). This greatly increases the processing burden (e.g., including lens offset or misalignment correction, re-mosaicing burden, and so on) that is needed to correct the signals produced by the various sub-pixels, and can lead to image resolution loss as a result of the lens offset or misalignment correction and other factors.
[0031] Disclosed herein is an image capture device in which a pair of adjacent 1x2 on-chip lenses (OCLs) are disposed over the sub-pixels (or photodetectors) of a pixel having a 2x2 array of sub-pixels (or photodetectors). Although this results in a mismatch between the footprints of the 1x2 OCLs and the imaging areas of the pixels (e.g., a 1x2 footprint versus a 2x2 footprint), it simplifies the corrections that need to be made for manufacturing variances resulting from shifts between the centroids of the sub-pixels or photodetectors and centroids of the OCLs or their focus areas. The simplification in the corrections that need to be made can improve focus and image quality; reduce re-mosaicing challenges; and enable 1x2 OCLs to be disposed over all of the pixels and photodetectors of an image capture device. A 1x2 OCL may also be shaped such that it allows more light into each of the sub-pixels over which it is disposed (e.g., as compared to a microlens disposed over a 2x2 array of sub-pixels). Allowing more light into a sub-pixel increases its SNR. If desired, the 1x2 OCLs can be oriented in different directions, to enable the focus of orthogonal sets of edges to be detected.
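The pairing at the heart of this arrangement can be captured in a small data-layout sketch. The TL/TR/BL/BR labels, the dataclass, and the function name below are illustrative assumptions; the sketch simply enumerates which photodetector pair sits under each of a pixel's two 1x2 OCLs for the two possible orientations.

    from dataclasses import dataclass
    from typing import Tuple

    @dataclass(frozen=True)
    class OCL:
        orientation: str          # "horizontal" or "vertical"
        covers: Tuple[str, str]   # photodetector labels beneath this lens

    def ocls_for_pixel(orientation):
        # Return the pair of adjacent 1x2 OCLs disposed over one 2x2
        # pixel whose photodetectors are labeled TL, TR, BL, BR.
        if orientation == "horizontal":   # longer dimension along the row
            return (OCL("horizontal", ("TL", "TR")),
                    OCL("horizontal", ("BL", "BR")))
        if orientation == "vertical":     # longer dimension along the column
            return (OCL("vertical", ("TL", "BL")),
                    OCL("vertical", ("TR", "BR")))
        raise ValueError(orientation)

    print(ocls_for_pixel("horizontal"))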
[0032] These and other embodiments are described with reference to FIGs. 1A-10. However, those skilled in the art will readily appreciate that the detailed description given herein with respect to these figures is for explanatory purposes only and should not be construed as limiting.
[0033] Directional terminology, such as “top”, “bottom”, “upper”, “lower”, “front”, “back”, “over”, “under”, “above”, “below”, “left”, “right”, etc. is used with reference to the orientation of some of the components in some of the figures described below. Because components in various embodiments can be positioned in a number of different orientations, directional terminology is used for purposes of illustration only and is in no way limiting. The directional terminology is intended to be construed broadly, and therefore should not be interpreted to preclude components being oriented in different ways. The use of alternative terminology, such as “or”, is intended to indicate different combinations of the alternative elements. For example, “A or B” is intended to include A, or B, or A and B.
[0034] FIGs. 1A and 1B show an example of a device 100 that may include one or more image capture devices. The device’s dimensions and form factor, including the ratio of the length of its long sides to the length of its short sides, suggest that the device 100 is a mobile phone (e.g., a smart phone). However, the device’s dimensions and form factor are arbitrarily chosen, and the device 100 could alternatively be any portable electronic device including, for example, a mobile phone, tablet computer, portable computer, portable music player, electronic watch, health monitor device, portable terminal, vehicle navigation system, robot navigation system, or other portable or mobile device. The device 100 could also be a device that is semi-permanently located (or installed) at a single location. FIG. 1A shows a front isometric view of the device 100, and FIG. 1B shows a rear isometric view of the device 100. The device 100 may include a housing 102 that at least partially surrounds a display 104. The housing 102 may include or support a front cover 106 or a rear cover 108. The front cover 106 may be positioned over the display 104, and may provide a window through which the display 104 may be viewed. In some embodiments, the display 104 may be attached to (or abut) the housing 102 and/or the front cover 106. In alternative embodiments of the device 100, the display 104 may not be included and/or the housing 102 may have an alternative configuration.
[0035] The display 104 may include one or more light-emitting elements including, for example, light-emitting diodes (LEDs), organic LEDs (OLEDs), a liquid crystal display (LCD), an electroluminescent (EL) display, or other types of display elements. In some embodiments, the display 104 may include, or be associated with, one or more touch and/or force sensors that are configured to detect a touch and/or a force applied to a surface of the front cover 106.
[0036] The various components of the housing 102 may be formed from the same or different materials. For example, the sidewall 118 may be formed using one or more metals (e.g., stainless steel), polymers (e.g., plastics), ceramics, or composites (e.g., carbon fiber). In some cases, the sidewall 118 may be a multi-segment sidewall including a set of antennas. The antennas may form structural components of the sidewall 118. The antennas may be structurally coupled (to one another or to other components) and electrically isolated (from each other or from other components) by one or more non-conductive segments of the sidewall 118. The front cover 106 may be formed, for example, using one or more of glass, a crystal (e.g., sapphire), or a transparent polymer (e.g., plastic) that enables a user to view the display 104 through the front cover 106. In some cases, a portion of the front cover 106 (e.g., a perimeter portion of the front cover 106) may be coated with an opaque ink to obscure components included within the housing 102. The rear cover 108 may be formed using the same material(s) that are used to form the sidewall 118 or the front cover 106. In some cases, the rear cover 108 may be part of a monolithic element that also forms the sidewall 118 (or in cases where the sidewall 118 is a multi-segment sidewall, those portions of the sidewall 118 that are non-conductive). In still other embodiments, all of the exterior components of the housing 102 may be formed from a transparent material, and components within the device 100 may or may not be obscured by an opaque ink or opaque structure within the housing 102.
[0037] The front cover 106 may be mounted to the sidewall 118 to cover an opening defined by the sidewall 118 (i.e., an opening into an interior volume in which various electronic components of the device 100, including the display 104, may be positioned). The front cover 106 may be mounted to the sidewall 118 using fasteners, adhesives, seals, gaskets, or other components.
[0038] A display stack or device stack (hereafter referred to as a “stack”) including the display 104 may be attached (or abutted) to an interior surface of the front cover 106 and extend into the interior volume of the device 100. In some cases, the stack may include a touch sensor (e.g., a grid of capacitive, resistive, strain-based, ultrasonic, or other type of touch sensing elements), or other layers of optical, mechanical, electrical, or other types of components. In some cases, the touch sensor (or part of a touch sensor system) may be configured to detect a touch applied to an outer surface of the front cover 106 (e.g., to a display surface of the device 100).
[0039] In some cases, a force sensor (or part of a force sensor system) may be positioned within the interior volume below and/or to the side of the display 104 (and in some cases within the device stack). The force sensor (or force sensor system) may be triggered in response to the touch sensor detecting one or more touches on the front cover 106 (or a location or locations of one or more touches on the front cover 106), and may determine an amount of force associated with each touch, or an amount of force associated with the collection of touches as a whole. Alternatively, the touch sensor (or touch sensor system) may be triggered in response to the force sensor detecting one or more forces on the front cover 106. In some cases, the force sensor may be used as (e.g., as an alternative to) a separate touch sensor.
[0040] As shown primarily in FIG. 1A, the device 100 may include various other components. For example, the front of the device 100 may include one or more front-facing cameras 110 or other image capture devices (including one or more image sensors), speakers 112, microphones, or other components 114 (e.g., audio, imaging, and/or sensing components) that are configured to transmit or receive signals to/from the device 100. In some cases, a front-facing camera 110, alone or in combination with other sensors, may be configured to operate as a bio-authentication or facial recognition sensor. The device 100 may also include various input devices, including a mechanical or virtual button 116, which may be accessible from the front surface (or display surface) of the device 100. In some embodiments, the front-facing camera 110, one or more other cameras, and/or one or more other optical emitters, optical detectors, or other optical sensors may be positioned under the display 104 instead of adjacent the display 104. In these embodiments, the camera(s), optical emitter(s), optical detector(s), or sensor(s) may emit and/or receive light through the display 104.
[0041] The device 100 may also include buttons or other input devices positioned along the sidewall 118 and/or on a rear surface of the device 100. For example, a volume button or multipurpose button 120 may be positioned along the sidewall 118, and in some cases may extend through an aperture in the sidewall 118. The sidewall 118 may include one or more ports 122 that allow air, but not liquids, to flow into and out of the device 100. In some embodiments, one or more sensors may be positioned in or near the port(s) 122. For example, an ambient pressure sensor, ambient temperature sensor, internal/external differential pressure sensor, gas sensor, particulate matter concentration sensor, or air quality sensor may be positioned in or near a port 122.
[0042] In some embodiments, the rear surface of the device 100 may include a rear-facing camera 124 or other image capture device (including one or more image sensors; see FIG. 1B). A flash or light source 126 may also be positioned along the rear surface of the device 100 (e.g., near the rear-facing camera). In some cases, the rear surface of the device 100 may include multiple rear-facing cameras.
[0043] FIG. 2 shows an example embodiment of an image capture device (e.g., a camera 200), including an image sensor 202, a lens 204 or lens assembly, and a mechanical auto-focus mechanism 206. In some embodiments, the components shown in FIG. 2 may be associated with the first camera 110 or the second camera 124 shown in FIGs. 1A-1B.
[0044] The image sensor 202 may include a plurality of pixels, such as a plurality of pixels arranged in a two-dimensional array of pixels. Multiple ones (or all) of the pixels may each include a two-dimensional array of photodetectors (e.g., a 2x2 array of photodetectors). The photodetectors that are associated with a pixel may be electrically isolated from each other. As will be described with reference to other figures, different OCLs may be disposed over different pairs of a pixel’s photodetectors.

[0045] The lens 204 may be adjustable with respect to the image sensor 202, to focus an image of a scene 208 on the image sensor 202. In some embodiments, the lens 204 or lens assembly may be moved with respect to the image sensor 202 (e.g., moved to change a distance between the lens 204 or lens assembly and the image sensor 202, moved to change an angle between a plane of a lens 204 or lenses and a plane of the image sensor 202, and so on). In other embodiments, the image sensor 202 may be moved with respect to the lens 204 or lens assembly.
[0046] In some embodiments, the auto-focus mechanism 206 may include (or the functions of the auto-focus mechanism 206 may be provided by) a processor in combination with a voice coil, piezoelectric element, or other actuator mechanism that moves the lens 204, lens assembly, or image sensor 202. The auto-focus mechanism 206 may receive signals from the image sensor 202 and, in response to the signals, adjust a focus setting of the camera 200. In some embodiments, the signals may include PDAF information. The PDAF information may include horizontal phase detection signals, vertical phase detection signals, and/or other phase detection signals. In response to the PDAF information (e.g., in response to an out-of-focus condition identified from the PDAF information), the auto-focus mechanism 206 may adjust a focus setting of the camera 200 by, for example, adjusting a relationship between the image sensor 202 (or plurality of pixels) and the lens 204 or lens assembly (e.g., by adjusting a physical position of the lens 204, lens assembly, or image sensor 202). Additionally or alternatively, the processor of the auto-focus mechanism 206 may use digital image processing techniques to adjust the values output by the pixels and/or photodetectors of the image sensor 202. The values may be adjusted to digitally improve, or otherwise alter, the focus of an image of the scene 208. In some embodiments, the auto-focus mechanism 206 may be used to provide only mechanical, or only digital, focus adjustments.
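As a rough illustration of this feedback, the following sketch steps a focus setting until a measured disparity falls within tolerance. The gain, tolerance, and the measure_disparity/move_lens callbacks are invented for illustration; the document describes the auto-focus mechanism functionally and does not specify a control law.

    def autofocus(measure_disparity, move_lens, gain=0.5, tol=0.1, max_iters=20):
        # Step the focus setting until the phase disparity is within
        # tolerance; returns True if focus was reached.
        for _ in range(max_iters):
            d = measure_disparity()   # signed phase difference (pixels)
            if abs(d) < tol:
                return True
            move_lens(gain * d)       # step the lens toward zero disparity
        return False

    # Toy stand-in in which lens position maps directly to disparity.
    state = {"pos": 4.0}
    done = autofocus(lambda: state["pos"],
                     lambda step: state.update(pos=state["pos"] - step))
    print(done, round(state["pos"], 3))  # True ~0.062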
[0047] Referring now to FIG. 3, there is shown an example of an image 300 that may be captured by an image capture device, such as one of the cameras described with reference to FIGs. 1A-1B or 2. The image 300 may include a number of objects 302, 304 having edges 306, 308 oriented in one or more directions. The edges 306, 308 may include perceptible edges between objects, or edges defined by different colors or brightness levels (e.g., an edge between dark and light regions). In some embodiments, the camera may only detect a focus of one set of edges (e.g., only horizontal edges or only vertical edges). In some embodiments, the camera may detect a focus of both a first set of edges (e.g., horizontal edges) and a second set of edges (e.g., vertical edges, or edges that are orthogonal to the first set of edges).

[0048] The focus of the first and/or second sets of edges may be detected in the same or different image capture frames, using the same or different pixels. In some cases, a focus of edges in the first set of edges may be detected using a first subset of pixels configured to detect a focus of horizontal edges, in a same frame that a focus of edges in the second set of edges is detected by a second subset of pixels configured to detect a focus of vertical edges. A focus of edges may be detected based on a phase difference (e.g., magnitude and polarity of the phase difference) in light captured by different photodetectors in a pair of photodetectors associated with a pixel.
[0049] In some embodiments, a single pixel in a pixel array (and in some cases, some or each of the pixels in the pixel array, or each of the pixels in a subset of pixels in the pixel array) may be configured to produce a signal usable for detecting the focus of a horizontal edge or a vertical edge. In some embodiments, all of the pixels in a pixel array (or all of the pixels used to capture a particular image) may be employed in the detection of edge focus information for an image.
[0050] FIG. 4 shows a plan view of one example of an image sensor 400, such as an image sensor associated with one of the image capture devices or cameras described with reference to FIGs. 1A-1B and 2. The image sensor 400 may include an image processor 402 and an imaging area 404. The imaging area 404 may be implemented as a pixel array that includes a plurality of pixels 406. The pixels 406 may be same colored pixels (e.g., for a monochrome imaging area 404) or differently colored pixels (e.g., for a multi-color imaging area 404). In the illustrated embodiment, the pixels 406 are arranged in rows and columns. However, other embodiments are not limited to this configuration. The pixels in a pixel array may be arranged in any suitable configuration, such as, for example, a hexagonal configuration.
[0051] The imaging area 404 may be in communication with a column select circuit 408 through one or more column select lines 410, and with a row select circuit 412 through one or more row select lines 414. The row select circuit 412 may selectively activate a particular pixel 406 or group of pixels, such as all of the pixels 406 in a certain row. The column select circuit 408 may selectively receive the data output from a selected pixel 406 or group of pixels 406 (e.g., all of the pixels in a particular row).
[0052] The row select circuit 412 and/or column select circuit 408 may be in communication with an image processor 402. The image processor 402 may process data from the pixels 406 and provide that data to another processor (e.g., a system processor) and/or other components of a device (e.g., other components of the electronic device 100). In some embodiments, the image processor 402 may be incorporated into the system. The image processor 402 may also receive focus information (e.g., PDAF information) from some or all of the pixels, and may perform a focusing operation for the image sensor 400. In some examples, the image processor 402 may perform one or more of the operations performed by the auto-focus mechanism described with reference to FIG. 2.
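As a structural sketch of the readout path just described, the following illustrates a row-at-a-time scan in which the row select circuit activates one row of pixels and the column select circuit receives each activated pixel's output. It is a functional stand-in only; addressing, timing, and analog behavior are omitted.

    import numpy as np

    def select_row(pixel_values, r):
        # Row select: activate all of the pixels in row r; their
        # outputs appear on the column lines.
        return pixel_values[r, :]

    def read_frame(pixel_values):
        # Column select: receive the data output from each activated
        # row, accumulating one full frame for the image processor.
        return np.stack([select_row(pixel_values, r)
                         for r in range(pixel_values.shape[0])])

    sensor = np.random.default_rng(1).random((4, 4))  # toy 4x4 imaging area
    assert np.array_equal(read_frame(sensor), sensor)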
[0053] FIG. 5 shows an example imaging area (e.g., a plan view) of a pixel 500 in an image capture device, such as a pixel included in an image sensor associated with one of the image capture devices or cameras described with reference to FIGs. 1A-1B and 2, or a pixel included in the image sensor described with reference to FIG. 4. In some embodiments, some pixels in an image sensor, each pixel in an image sensor, or each pixel in a subset of pixels in an image sensor, may be configured as shown in FIG. 5.
[0054] The imaging area of the pixel 500 includes a two-dimensional array of photodetectors 502. In some embodiments, the imaging area may include a 2x2 array of photodetectors (e.g., a set of photodetectors 502 arranged in two rows and two columns). For example, the array may include a first photodetector 502a and a second photodetector 502b arranged in a first row, and a third photodetector 502c and a fourth photodetector 502d arranged in a second row. The first photodetector 502a and the third photodetector 502c may be arranged in a first column, and the second photodetector 502b and the fourth photodetector 502d may be arranged in a second column.
[0055] Each photodetector 502 may be electrically isolated from each other photodetector 502 (e.g., by implant isolation or physical trench isolation). A first 1x2 OCL 504a may be disposed over two of the photodetectors in the pixel 500 (e.g., over the first photodetector 502a and the second photodetector 502b). A second 1x2 OCL 504b may be disposed over the remaining two of the photodetectors in the pixel 500 (e.g., over the third photodetector 502c and the fourth photodetector 502d).
[0056] An optional single-piece or multi-piece filter element (e.g., a red filter, a blue filter, a green filter, or the like) may be disposed over the array of photodetectors 502 (e.g., over the first photodetector 502a, the second photodetector 502b, the third photodetector 502c, and the fourth photodetector 502d). In some examples, the filter element may be applied to an interior or exterior of each 1x2 OCL 504a, 504b. In some examples, each OCL 504a, 504b may be tinted to provide the filter element. In some examples, each of the photodetectors may be separately encapsulated under the OCLs 504a, 504b, and the filter element may be applied to or in the encapsulant. In some examples, a filter element may be positioned between the array of photodetectors 502 and the OCLs 504a, 504b (although other configurations of the filter element may also be considered as being disposed “between” the photodetectors 502 and the OCLs 504a, 504b).
[0057] The photodetectors 502 may be connected to a shared readout circuit (i.e., a readout circuit shared by all of the photodetectors 502 associated with the pixel 500). A set of charge transfer transistors may be operable to connect the photodetectors 502 to the shared readout circuit (e.g., each charge transfer transistor in the set may be operable (e.g., by a processor) to connect a respective one of the photodetectors 502 to, and disconnect the respective one of the photodetectors 502 from, the shared readout circuit; alternatively, a charge transfer transistor may be statically configured to connect/disconnect a pair of the photodetectors 502 (e.g., a pair of photodetectors 502 under a common 1x2 OCL, or a pair of photodetectors 502 that are disposed along a direction that is orthogonal to each of the first and second OCLs 504a, 504b) to/from the shared readout circuit). In some cases, each charge transfer transistor may be operated individually. In other cases, the charge transfer transistors may be statically configured for pair-wise operation.
[0058] FIG. 6 shows an example cross-section of the pixel 500 shown in FIG. 5. By way of example, the cross-section is taken along line VI-VI, through the first row of photodetectors 502a, 502b shown in FIG. 5. A cross-section taken through the second row of photodetectors 502c, 502d shown in FIG. 5 (not shown) may be configured similarly to the cross-section shown in FIG. 6.
[0059] The first and second photodetectors 502a, 502b may be formed in a substrate 602. The substrate 602 may include a semiconductor-based material, such as, but not limited to, silicon, silicon-on-insulator (SOI), silicon-on-sapphire (SOS), doped and undoped semiconductor regions, epitaxial layers formed on a semiconductor substrate, well regions or buried layers formed in a semiconductor substrate, or other semiconductor structures.
[0060] The 1x2 OCL 504a may be disposed over part or all of both of the photodetectors 502a and 502b. The OCL 504a may be formed of any material or combination of materials that is translucent to at least one wavelength of light. The OCL 504a may have a light-receiving side 612 opposite the array of photodetectors 502. The light-receiving side 612 of the OCL 504a may include a central portion 608 and a peripheral portion 610. The peripheral portion 610 may be configured to redirect at least a portion of light incident on the peripheral portion (e.g., the light 606a or light 606c) toward a corresponding peripheral portion of the imaging area that includes the photodetectors 502 (e.g., the light 606a may be redirected toward the photodetector 502a, and the light 606c may be redirected toward the photodetector 502b). In some embodiments, the OCL 504a may have a convex-shaped or dome-shaped light-receiving surface (or exterior surface).
[0061] The OCL 504a may be configured to focus incident light 606 received from different angles on different ones or both of the photodetectors 502a, 502b. For example, light 606a incident on the OCL 504a from a left side approach angle may be focused more (or solely) on the left side photodetector 502a, and thus the left side photodetector 502a may accumulate more charge than the right side photodetector 502b, making the signal response of the left side photodetector 502a greater than the signal response of the right side photodetector 502b. Similarly, light 606c incident on the OCL 504a from a right side approach angle may be focused more (or solely) on the right side photodetector 502b, and thus the right side photodetector 502b may accumulate more charge than the left side photodetector 502a, making the signal response of the right side photodetector 502b greater than the signal response of the left side photodetector 502a. Light 606b incident on the OCL 504a from the front center (or top) of the OCL 504a may be focused on both of the photodetectors 502a, 502b, making the signal response of the left and right side photodetectors 502a, 502b about equal.
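A toy model can make this angular selectivity concrete. The linear response and the 30-degree half-width below are assumptions for illustration only; they merely encode the qualitative behavior described above, in which left-incident light favors the left photodetector, right-incident light favors the right photodetector, and normal incidence splits about equally.

    def lr_response(angle_deg, half_width=30.0):
        # Return (left, right) relative signal responses for light
        # arriving at the given approach angle (negative = from the left).
        t = max(-1.0, min(1.0, angle_deg / half_width))
        return 0.5 * (1.0 - t), 0.5 * (1.0 + t)

    for a in (-30, 0, 30):
        l, r = lr_response(a)
        print(f"angle {a:+d} deg -> left {l:.2f}, right {r:.2f}")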
[0062] An optional same color filter element 604 (e.g., a red filter, a blue filter, a green filter, or the like) may be disposed over each (or both) of the photodetectors 502a, 502b (as well as the photodetectors 502c and 502d, not shown).
[0063] Referring now to FIG. 7, there is shown a simplified schematic of a pixel 700 (and associated shared readout circuit 704) usable in an image sensor. In some embodiments, the pixel 700 may be an example of a pixel included in an image sensor associated with one of the image capture devices or cameras described with reference to FIGs. 1A-1B and 2, or a pixel included in the image sensor described with reference to FIG. 4, or the pixel described with reference to FIGs. 5-6. In some embodiments, some pixels in an image sensor, each pixel in an image sensor, or each pixel in a subset of pixels in an image sensor, may be configured as shown in FIG. 7, and the shared readout circuit 704 for the pixel 700 may be part of an overall pixel readout circuit for an image sensor.
[0064] The pixel 700 may include a two-dimensional array of photodetectors 702, with each photodetector 702 being selectively connectable to (and disconnectable from) the shared readout circuit 704 by a respective charge transfer transistor in a set of charge transfer transistors 706. In some embodiments, the two-dimensional array of photodetectors 702 may include a 2x2 array of photodetectors (e.g., an array of photodetectors 702 arranged in two rows and two columns). For example, the array may include a first photodetector 702a (PD_TL) and a second photodetector 702b (PD_TR) arranged in a first row, and a third photodetector 702c (PD_BL) and a fourth photodetector 702d (PD_BR) arranged in a second row. The first photodetector 702a and the third photodetector 702c may be arranged in a first column, and the second photodetector 702b and the fourth photodetector 702d may be arranged in a second column. As described with reference to FIGs. 5 and 6, the photodetectors 702 may be disposed (positioned) in a 2x2 array under a pair of adjacent 1x2 OCLs.
[0065] The shared readout circuit 704 may include a sense region 708, a reset (RST) transistor 710, a readout transistor 712, and a row select (RS) transistor 714. The sense region 708 may include a capacitor that temporarily stores charge received from one or more of the photodetectors 702. As described below, charge accumulated by one or more of the photodetectors 702 may be transferred to the sense region 708 by applying a drive signal (e.g., a gate voltage) to one or more of the charge transfer transistors 706. The transferred charge may be stored in the sense region 708 until a drive signal applied to the reset (RST) transistor 710 is pulsed.
[0066] Each of the charge transfer transistors 706 may have one terminal connected to a respective one of the photodetectors 702 and another terminal connected to the sense region 708. One terminal of the reset transistor 710 and one terminal of the readout transistor 712 may be connected to a supply voltage (e.g., VDD) 720. The other terminal of the reset transistor 710 may be connected to the sense region 708, while the other terminal of the readout transistor 712 may be connected to a terminal of the row select transistor 714. The other terminal of the row select transistor 714 may be connected to an output line 716.
[0067] By way of example only, and in one embodiment, each of the photodetectors 702 may be implemented as a photodiode (PD) or pinned photodiode, the sense region 708 may be implemented as a floating diffusion (FD) node, and the readout transistor 712 may be implemented as a source follower (SF) transistor. The photodetectors 702 may be electron-based photodiodes or hole-based photodiodes. The term photodetector is used herein to refer to substantially any type of photon or light detecting component, such as a photodiode, pinned photodiode, photogate, or other photon sensitive region. Additionally, the term sense region, as used herein, is meant to encompass substantially any type of charge storing or charge converting region.
[0068] In some embodiments, the pixel 700 may be implemented using additional or different components. For example, the row select transistor 714 may be omitted and a pulsed power supply may be used to select the pixel.

[0069] When an image is to be captured, an integration period for the pixel begins and the photodetectors 702 accumulate photo-generated charge in response to incident light. When the integration period ends, the accumulated charge in some or all of the photodetectors 702 may be transferred to the sense region 708 by sequentially or simultaneously applying drive signals to (e.g., by pulsing gate voltages of) the charge transfer transistors 706. Typically, the reset transistor 710 is used to reset the voltage on the sense region 708 to a predetermined level prior to the transfer of charge from a set of one or more photodetectors 702 to the sense region 708. When charge is to be read out of the pixel 700, a drive signal may be applied to the row select transistor 714 (e.g., a gate voltage of the row select transistor 714 may be pulsed) via a row select line 718 coupled to row select circuitry, and charge from one, two, or any number of the photodetectors 702 may be read out over an output line 716 coupled to column select circuitry. The readout transistor 712 senses the voltage on the sense region 708, and the row select transistor 714 transfers an indication of the voltage to the output line 716. The column select circuitry may be coupled to an image processor, auto-focus mechanism, or combination thereof.
[0070] In some embodiments, a processor may be configured to operate the set of charge transfer transistors 706 to simultaneously transfer charge from multiple photodetectors 702 (e.g., a pair of photodetectors) to the sense region 708 or floating diffusion node. For example, the gates of first and second charge transfer transistors 706a (TX_A) and 706b (TX_B) (i.e., the charge transfer transistors of the first row) may be simultaneously driven to transfer charges accumulated by the first and second photodetectors 702a, 702b to the sense region 708, where the charges may be summed. After reading the summed charge out of the pixel 700, the gates of third and fourth charge transfer transistors 706c (TX_C) and 706d (TX_D) (i.e., the charge transfer transistors of the second row) may be simultaneously driven to transfer charges accumulated by the third and fourth photodetectors 702c, 702d to the sense region 708, where the charges may be summed. This summed charge may also be read out of the pixel 700. In a subsequent frame of image capture, the gates of the first and third charge transfer transistors 706a and 706c (i.e., the charge transfer transistors of the first column) may be simultaneously driven to transfer charges accumulated by the first and third photodetectors 702a, 702c to the sense region 708. After reading this charge out of the pixel 700, the gates of the second and fourth charge transfer transistors 706b and 706d (i.e., the charge transfer transistors of the second column) may be simultaneously driven to transfer charges accumulated by the second and fourth photodetectors 702b, 702d to the sense region 708. This charge may also be read out of the pixel 700. Additionally or alternatively, charge accumulated by the photodetectors 702 may be read out of the pixel 700 individually, or charges accumulated by any combination (including all) of the photodetectors 702 may be read out of the pixel 700 together, or charges accumulated by the photodetectors 702 along a left- or right-sloping diagonal may be read out of the pixel 700 together.
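The readout combinations in the preceding paragraph can be summarized numerically. In the sketch below the charge values are arbitrary, the TX_A-TX_D labels follow FIG. 7, and the frame sequencing mirrors the row-pair/column-pair example above; it is an illustration of the summing arithmetic, not a circuit model.

    import numpy as np

    charge = np.array([[10.0, 12.0],    # [TL (702a), TR (702b)]
                       [ 9.0, 11.0]])   # [BL (702c), BR (702d)]

    # Frame N: pulse TX_A with TX_B, then TX_C with TX_D (row-wise
    # sums), yielding top/bottom signals for one phase-detection axis.
    top, bottom = charge[0].sum(), charge[1].sum()

    # Frame N+1: pulse TX_A with TX_C, then TX_B with TX_D (column-wise
    # sums), yielding left/right signals for the orthogonal axis.
    left, right = charge[:, 0].sum(), charge[:, 1].sum()

    # Full-resolution alternative: read each photodetector individually.
    print(top, bottom, left, right, charge.ravel())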
[0071] When charges accumulated by different photodetectors 702 are read out of the pixel 700 individually, the charges may be summed in various ways, or a processor may interpolate between the values read out of the photodetectors in different pixels of a pixel array (e.g., perform a de-mosaicing operation) to generate an image having an effective 4x resolution for the pixel array.
[0072] In some embodiments, a shared readout circuit per pixel may be configured differently for different pixels in a pixel array. For example, in a potentially lower cost image sensor, or in an image sensor implemented using front side illumination (FSI) technology, a single charge transfer transistor may be coupled to a pair of photodetectors, and may be operated by a processor to simultaneously read out, and sum, charges integrated by the pair of photodetectors. For example, in one pixel of an image sensor, a single charge transfer transistor could replace both of the charge transfer transistors 706a and 706b and connect both of the photodetectors 702a and 702b to the shared readout circuit 704, and another charge transfer transistor could replace both of the charge transfer transistors 706c and 706d and connect both of the photodetectors 702c and 702d to the shared readout circuit 704. Similarly, in another pixel of the image sensor, a single charge transfer transistor could replace both of the charge transfer transistors 706a and 706c and connect both of the photodetectors 702a and 702c to the shared readout circuit 704, and another charge transfer transistor could replace both of the charge transfer transistors 706b and 706d and connect both of the photodetectors 702b and 702d to the shared readout circuit 704.
[0073] In some embodiments, an image capture device, such as a camera, may not include a shutter, and thus an image sensor of the image capture device may be constantly exposed to light. When the pixel 700 is used in these embodiments, the photodetectors 702 may have to be reset or depleted of charge before an image is captured (e.g., by applying drive signals (e.g., gate voltages) to the reset transistor 710 and charge transfer transistors 706). After the charge from the photodetectors 702 has been depleted, the charge transfer transistors 706 and reset transistor 710 may be turned off to isolate the photodetectors 702 from the shared readout circuit 704. The photodetectors 702 can then accumulate photon-generated charge during a charge integration period.
[0074] FIGs. 8A and 8B each show an array of pixels 800 (e.g., a 2x2 array of pixels). In some cases, the array of pixels 800 may represent a portion of a much larger array of pixels, such as an array of millions of pixels included in an image sensor. In some cases, the array of pixels 800 may be included in an image sensor associated with one of the image capture devices or cameras described with reference to FIGs. 1A-1B and 2, or a set of pixels included in the image sensor described with reference to FIG. 4. In some cases, each pixel in the array of pixels 800 may be configured similarly to the pixel described with reference to FIGs. 5-6 and, in some cases, each pixel may be associated with an instance of the shared readout circuit described with reference to FIG. 7.
[0075] By way of example, the array of pixels 800 includes a red pixel 800a, first and second green pixels 800b, 800c, and a blue pixel 800d arranged in a Bayer pattern. The Bayer pattern may be achieved by disposing a color filter array over the array of pixels 800. For example, different subsets of filter elements in the color filter array may be disposed over different subsets of pixels in the array of pixels 800, with each subset of filter elements having a different color (e.g., red filter elements 802a, green filter elements 802b, or blue filter elements 802c). In alternative embodiments, the different subsets of filter elements may be associated with different colors (e.g., cyan, yellow, and magenta filter elements; cyan, yellow, green, and magenta filter elements; red, green, blue, and white filter elements; and so on). In some alternative embodiments, a color filter array may not be provided, or all of the filter elements in the color filter array may have the same color.
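For reference, the repeating 2x2 red/green/green/blue tiling of a Bayer color filter array can be expressed compactly. The sketch below assumes an RGGB phase purely for illustration:

```python
import numpy as np

def bayer_mask(rows, cols):
    """Return a color index per pixel for an RGGB Bayer pattern
    (0 = red, 1 = green, 2 = blue); a minimal illustration."""
    mask = np.empty((rows, cols), dtype=int)
    mask[0::2, 0::2] = 0  # red on even rows, even columns
    mask[0::2, 1::2] = 1  # green
    mask[1::2, 0::2] = 1  # green
    mask[1::2, 1::2] = 2  # blue on odd rows, odd columns
    return mask

print(bayer_mask(4, 4))
```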
[0076] Each pixel 800a, 800b, 800c, 800d may include a two-dimensional array of photodetectors 804. For example, each pixel 800a, 800b, 800c, 800d may include a first photodetector 804a and a second photodetector 804b arranged in a first row, and a third photodetector 804c and a fourth photodetector 804d arranged in a second row. The first photodetector 804a and the third photodetector 804c may be arranged in a first column, and the second photodetector 804b and the fourth photodetector 804d may be arranged in a second column.
[0077] Each photodetector 804a, 804b, 804c, 804d may be electrically isolated from each other photodetector 804a, 804b, 804c, 804d (e.g., by implant isolation or physical trench isolation).
[0078] An array of 1x2 OCLs 806 may be disposed over the array of pixels 800, with a pair of adjacent 1x2 OCLs 806a, 806b disposed over each pixel 800a, 800b, 800c, 800d. As shown, the pair of adjacent OCLs 806 may include a first 1x2 OCL 806a disposed over two adjacent photodetectors 804 (e.g., over the first and second photodetectors 804a, 804b), and a second 1x2 OCL 806b disposed over two other adjacent photodetectors 804 (e.g., over the third and fourth photodetectors 804c, 804d).

[0079] The photodetectors 804a, 804b, 804c, 804d of a pixel 800a, 800b, 800c, or 800d may be connected to a shared readout circuit (i.e., a readout circuit shared by all of the photodetectors associated with the pixel, as described, for example, with reference to FIG. 7). A set of charge transfer transistors may be operable to connect the photodetectors 804a, 804b, 804c, 804d to the shared readout circuit (e.g., each charge transfer transistor in the set may be operable (e.g., by a processor) to connect a respective one of the photodetectors to, and disconnect the respective one of the photodetectors from, the shared readout circuit). In some cases, each charge transfer transistor may be operated individually. In some cases, pairs (or all) of the charge transfer transistors may be operated contemporaneously.
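Because each 1x2 OCL directs light from opposite halves of the imaging lens pupil onto the two photodetectors beneath it, comparing the resulting "left" and "right" signals yields a phase (disparity) estimate usable for auto-focus. The following is a generic sketch of such a comparison under simplified one-dimensional assumptions, not the device's focusing algorithm:

```python
import numpy as np

def pdaf_shift(left, right, max_shift=4):
    """Estimate the shift (in samples) between left- and right-photodetector
    line signals that minimizes their mismatch. Zero indicates in-focus;
    the sign indicates the direction of defocus."""
    best_shift, best_err = 0, np.inf
    for s in range(-max_shift, max_shift + 1):
        a = left[max(0, s):len(left) + min(0, s)]
        b = right[max(0, -s):len(right) + min(0, -s)]
        err = np.mean((a - b) ** 2)
        if err < best_err:
            best_shift, best_err = s, err
    return best_shift

edge = np.clip(np.arange(32) - 12, 0, 8).astype(float)  # synthetic edge profile
print(pdaf_shift(np.roll(edge, 2), np.roll(edge, -1)))  # -> 3 (out of focus)
```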
[0080] In some embodiments, all of the 1x2 OCLs 806 in the array of 1x2 OCLs may have a same orientation (e.g., a horizontal orientation, as shown; or alternatively, a vertical orientation). In other embodiments, and as shown in FIGs. 9C and 9D, the 1x2 OCLs disposed over alternating Bayer pattern rows, alternating Bayer pattern columns, or alternating 2x2 sets of Bayer pattern pixels may have different orientations.
[0081] By way of example, FIG. 8A shows each of the 1x2 OCLs 806 as having a similarly shaped and sized oval perimeter. Each of the 1x2 OCLs 806 may also have a similar curvature (i.e., curvature perpendicular to the plan view shown in FIG. 8A). In other embodiments, each of the 1x2 OCLs 806 may have a perimeter that is generally rectangular, or a perimeter having a different or non-symmetric shape. All of these perimeters may be referred to herein as oblong perimeters, providing oblong OCLs (e.g., 1x2 OCLs 806). In some cases, different 1x2 OCLs 806 may have somewhat different shapes as a result of manufacturing variance. More generally, each (or all) of the 1x2 OCLs 806 in the array of 1x2 OCLs 806 may have the same or different shape, size, or curvature. For purposes of this description, however, an OCL of any shape that extends over two adjacent photodetectors is considered a 1x2 OCL (and is also considered an oblong OCL).
[0082] Also by way of example, FIG. 8A shows each of the 1x2 OCLs 806 as having a focus area 808 (e.g., the first 1x2 OCL 806a has a first focus area 808a, and the second 1x2 OCL 806b has a second focus area 808b). The focus area 808 of each 1x2 OCL is generally aligned with the perimeter of the OCL 806.
[0083] By way of further example, FIG. 8A shows that the perimeter of each 1x2 OCL 806 is generally aligned with respect to the pair of photodetectors over which it is disposed. For example, the first photodetector 804a has a first centroid 810a, the second photodetector 804b has a second centroid 810b, and the first 1x2 OCL 806a has a centroid 812 that is disposed in line with, and centered between, the first centroid 810a and the second centroid 810b. Similarly, FIG. 8A shows that the focus area 808 of each 1x2 OCL 806 is generally aligned with the perimeter of the 1x2 OCL 806, and generally aligned with respect to the pair of photodetectors over which it is disposed. For example, the first 1x2 OCL 806a has a focus centroid 814 that is disposed in line with, and centered between, the first centroid 810a and the second centroid 810b.
[0084] In some embodiments, the centroid of a 1x2 OCL (e.g., the centroid 812 of the first 1x2 OCL 806a) may not be aligned with respect to the pair of photodetectors over which it is disposed, and/or the focus centroid of a 1x2 OCL (e.g., the focus centroid 814 of the focus area 808a of the first 1x2 OCL 806a) may not be aligned with respect to the pair of photodetectors over which it is disposed. For example, and as shown in FIG. 8B, the focus area 808a of the first 1x2 OCL 806a may not be aligned with respect to the first and second photodetectors 804a, 804b (e.g., the focus centroid 814 of the first 1x2 OCL 806a is not aligned with, and is not centered between, the first centroid 810a and the second centroid 810b). In alternative embodiments, the focus centroid 814 may not be aligned with, but may be centered between, the first centroid 810a and the second centroid 810b; or, the focus centroid 814 may be aligned with, but not be centered between, the first centroid 810a and the second centroid 810b.
[0085] Additionally or alternatively, the centroid of a 1x2 OCL may not be aligned with respect to the photodetectors over which it is disposed (e.g., because of manufacturing variances that affect the placement or shape of the perimeter of the 1x2 OCL).
[0086] FIGs. 9A-9D show an imaging area 900 of an image capture device (e.g., an image sensor), in which the pixels 902 of the image capture device are arranged in accordance with a Bayer pattern (i.e., a 2x2 pattern including red pixels and blue pixels along one diagonal, and green pixels along the other diagonal). The pixels 902 may be arranged in Bayer pattern rows 906a, 906b and Bayer pattern columns 908a, 908b. More generally, the pixels 902 may be arranged in rows extending in a first dimension, and in columns extending in a second dimension orthogonal to the first dimension.
[0087] FIGs. 9A-9D show each pixel 902 as having a 2x2 array of photodetectors 904, as described, for example, with reference to FIGs. 5-8B. In some embodiments, the imaging area 900 may be an example of an imaging area of an image sensor associated with one of the image capture devices or cameras described with reference to FIGs. 1A-1B and 2, or the imaging area of the image sensor described with reference to FIG. 4, or an imaging area including a plurality of the pixels described with reference to any of FIGs. 5-8B.

[0088] An array of 1x2 OCLs 910 is disposed over the entirety of the imaging area 900, with a pair of adjacent 1x2 OCLs 910 disposed over each pixel 902, and with each 1x2 OCL 910 being disposed over a pair of adjacent photodetectors 904.
[0089] In FIG. 9A, the pixels 902 in all of the Bayer pattern rows 906a, 906b of the imaging area 900 are configured (or are operable) to detect a phase difference (e.g., an out-of-focus condition) in a first set of edges of an image (e.g., vertical edges). Each 1x2 OCL 910 in the array of 1x2 OCLs 910 has a same orientation, with its longer dimension extending parallel to the Bayer pattern rows 906a, 906b.
[0090] In FIG. 9B, the pixels 902 in all of the Bayer pattern columns 908a, 908b of the imaging area 900 are configured (or are operable) to detect a phase difference (e.g., an out-of-focus condition) in a second set of edges of an image (e.g., horizontal edges). Each 1x2 OCL 910 in the array of 1x2 OCLs 910 has a same orientation, with its longer dimension extending parallel to the Bayer pattern columns 908a, 908b.
[0091] In FIG. 9C, the pixels 902 in the first Bayer pattern row 906a of the imaging area 900 are configured (or are operable) to detect a phase difference (e.g., an out-of-focus condition) in a first set of edges of an image (e.g., vertical edges), and the pixels 902 in the second Bayer pattern row 906b are configured (or are operable) to detect a phase difference in a second set of edges of the image (e.g., horizontal edges, or edges that are otherwise orthogonal to the first set of edges). The 1x2 OCLs 910 in the array of 1x2 OCLs 910 have different orientations, with a first subset of 1x2 OCLs 910 in the array of 1x2 OCLs 910 having a first orientation (e.g., with its longer dimension extending parallel to the Bayer pattern rows 906a, 906b), and a second subset of 1x2 OCLs 910 in the array of 1x2 OCLs 910 having a second orientation, orthogonal to the first orientation (e.g., with its longer dimension extending parallel to the Bayer pattern columns 908a, 908b). The first subset of 1x2 OCLs 910 may be disposed over a first set of rows of pixels 902 (e.g., over the Bayer pattern row 906a, or over interspersed rows (e.g., interspersed Bayer pattern rows 906) when the imaging area 900 includes more Bayer pattern rows than are shown). The second subset of 1x2 OCLs 910 may be disposed over a second set of rows of pixels 902 (e.g., over the Bayer pattern row 906b, or over interspersed rows (e.g., interspersed Bayer pattern rows 906) when the imaging area 900 includes more Bayer pattern rows than are shown). Alternatively, the first and second subsets of 1x2 OCLs 910 may be disposed over interspersed columns, such as interspersed Bayer pattern columns.
[0092] In FIG. 9D, the pixels 902 in a first lattice of pixels are configured (or are operable) to detect a phase difference (e.g., an out-of-focus condition) in a first set of edges of an image (e.g., vertical edges), and the pixels 902 in a second lattice of pixels are configured (or are operable) to detect a phase difference in a second set of edges of the image (e.g., horizontal edges, or edges that are otherwise orthogonal to the first set of edges). The lattices of pixels may be overlapping checkerboard lattices of pixels, or overlapping checkerboard lattices of Bayer pattern sets of pixels (e.g., each segment of each lattice may be a 2x2 array of pixels 902). The 1x2 OCLs 910 in the array of 1x2 OCLs 910 have different orientations, with a first subset of 1x2 OCLs 910 in the array of 1x2 OCLs 910 having a first orientation (e.g., with its longer dimension extending parallel to the Bayer pattern rows 906a, 906b), and a second subset of 1x2 OCLs 910 in the array of 1x2 OCLs 910 having a second orientation, orthogonal to the first orientation (e.g., with its longer dimension extending parallel to the Bayer pattern columns 908a, 908b). The first subset of 1x2 OCLs 910 may be disposed over the first lattice of pixels 902, and the second subset of 1x2 OCLs 910 may be disposed over the second lattice of pixels 902.
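The layouts of FIGs. 9A-9D differ only in how horizontal and vertical OCL orientations are assigned across the pixel grid. The sketch below expresses the four assignments compactly (the layout names and "H"/"V" labels are ours, for illustration; row // 2 maps pixel rows to Bayer pattern rows):

```python
def ocl_orientation(row, col, layout):
    """Orientation of the pair of 1x2 OCLs over the pixel at (row, col):
    'H' = longer dimension parallel to rows, 'V' = parallel to columns."""
    if layout == "uniform_rows":        # FIG. 9A: all horizontal
        return "H"
    if layout == "uniform_columns":     # FIG. 9B: all vertical
        return "V"
    if layout == "alternating_rows":    # FIG. 9C: flips every Bayer pattern row
        return "H" if (row // 2) % 2 == 0 else "V"
    if layout == "checkerboard":        # FIG. 9D: overlapping lattices of 2x2 sets
        return "H" if ((row // 2) + (col // 2)) % 2 == 0 else "V"
    raise ValueError(layout)

for r in range(4):
    print(" ".join(ocl_orientation(r, c, "checkerboard") for c in range(4)))
```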
[0093] In the configuration shown in FIG. 9D, at least some of the 1x2 OCLs 910 in the first subset of 1x2 OCLs 910 and at least some of the 1x2 OCLs 910 in the second subset of 1x2 OCLs 910 are disposed over a subset of pixels disposed under a same-colored subset of filter elements. In this manner, PDAF information for detecting the focus of two orthogonal sets of edges may be collected from pixels 902 having the same color.
[0094] FIG. 10 shows a sample electrical block diagram of an electronic device 1000, which may be the electronic device described with reference to FIGs. 1A-1B, 2, 4, and so on. The electronic device 1000 may include a display 1002 (e.g., a light-emitting display), a processor 1004, a power source 1006, a memory 1008 or storage device, a sensor 1010, and an input/output (I/O) mechanism 1012 (e.g., an input/output device and/or input/output port). The processor 1004 may control some or all of the operations of the electronic device 1000. The processor 1004 may communicate, either directly or indirectly, with substantially all of the components of the electronic device 1000. For example, a system bus or other communication mechanism 1014 may provide communication between the processor 1004, the power source 1006, the memory 1008, the sensor 1010, and/or the input/output mechanism 1012.
[0095] The processor 1004 may be implemented as any electronic device capable of processing, receiving, or transmitting data or instructions. For example, the processor 1004 may be a microprocessor, a central processing unit (CPU), an application-specific integrated circuit (ASIC), a digital signal processor (DSP), or combinations of such devices. As described herein, the term “processor” is meant to encompass a single processor or processing unit, multiple processors, multiple processing units, or other suitably configured computing element or elements.
[0096] It should be noted that the components of the electronic device 1000 may be controlled by multiple processors. For example, select components of the electronic device 1000 may be controlled by a first processor and other components of the electronic device 1000 may be controlled by a second processor, where the first and second processors may or may not be in communication with each other.
[0097] The power source 1006 may be implemented with any device capable of providing energy to the electronic device 1000. For example, the power source 1006 may be one or more batteries or rechargeable batteries. Additionally or alternatively, the power source 1006 may be a power connector or power cord that connects the electronic device 1000 to another power source, such as a wall outlet.
[0098] The memory 1008 may store electronic data that may be used by the electronic device 1000. For example, the memory 1008 may store electrical data or content such as, for example, audio and video files, documents and applications, device settings and user preferences, timing signals, control signals, data structures or databases, image data, or focus settings. The memory 1008 may be configured as any type of memory. By way of example only, the memory 1008 may be implemented as random access memory, read-only memory, Flash memory, removable memory, other types of storage elements, or combinations of such devices.
[0099] The electronic device 1000 may also include one or more sensors 1010 positioned substantially anywhere on the electronic device 1000. The sensor(s) 1010 may be configured to sense substantially any type of characteristic, such as but not limited to, pressure, light, touch, heat, movement, relative motion, biometric data, and so on. For example, the sensor(s) 1010 may include a heat sensor, a position sensor, a light or optical sensor, an accelerometer, a pressure transducer, a gyroscope, a magnetometer, a health monitoring sensor, and so on. Additionally, the one or more sensors 1010 may utilize any suitable sensing technology, including, but not limited to, capacitive, ultrasonic, resistive, optical, piezoelectric, and thermal sensing technology.
[0100] The I/O mechanism 1012 may transmit and/or receive data from a user or another electronic device. An I/O device may include a display, a touch sensing input surface such as a track pad, one or more buttons (e.g., a graphical user interface “home” button), one or more cameras, one or more microphones or speakers, one or more ports such as a microphone port, and/or a keyboard. Additionally or alternatively, an I/O device or port may transmit electronic signals via a communications network, such as a wireless and/or wired network connection. Examples of wireless and wired network connections include, but are not limited to, cellular, Wi-Fi, Bluetooth, IR, and Ethernet connections.
[0101] The foregoing description, for purposes of explanation, uses specific nomenclature to provide a thorough understanding of the described embodiments. However, it will be apparent to one skilled in the art that the specific details are not required in order to practice the described embodiments. Thus, the foregoing descriptions of the specific embodiments described herein are presented for purposes of illustration and description. They are not intended to be exhaustive or to limit the embodiments to the precise forms disclosed. It will be apparent to one of ordinary skill in the art that many modifications and variations are possible in view of the above teachings.

Claims

What is claimed is:
1. An image capture device, comprising: an array of pixels, each pixel comprising a 2x2 array of photodetectors; and an array of 1x2 on-chip lenses (OCLs) disposed over the array of pixels; wherein, for each pixel in the array of pixels, a respective pair of adjacent 1x2 OCLs is disposed over the pixel, the respective pair of adjacent 1x2 OCLs including, a respective first 1x2 OCL disposed over a first photodetector and a second photodetector in the 2x2 array of photodetectors for the pixel; and a respective second 1x2 OCL disposed over a third photodetector and a fourth photodetector in the 2x2 array of photodetectors for the pixel.
2. The image capture device of claim 1, wherein all 1x2 OCLs in the array of 1x2 OCLs have a same orientation.
3. The image capture device of claim 1, wherein: a first subset of 1x2 OCLs in the array of 1x2 OCLs has a first orientation; and a second subset of 1x2 OCLs in the array of 1x2 OCLs has a second orientation, orthogonal to the first orientation.
4. The image capture device of claim 3, wherein: the array of pixels has rows of pixels extending in a first dimension and columns of pixels extending in a second dimension; the first subset of 1x2 OCLs is disposed over a first set of the rows of pixels; and the second subset of 1x2 OCLs is disposed over a second set of the rows of pixels, the second set of the rows of pixels interspersed with the first set of the rows of pixels.
5. The image capture device of claim 3, wherein: the array of pixels has rows of pixels extending in a first dimension and columns of pixels extending in a second dimension; the first subset of 1x2 OCLs is disposed over a first set of columns of pixels; and the second subset of 1x2 OCLs is disposed over a second set of columns of pixels, the second set of columns of pixels interspersed with the first set of columns of pixels.
6. The image capture device of claim 3, wherein: the array of pixels has rows of pixels extending in a first dimension and columns of pixels extending in a second dimension; the first subset of 1x2 OCLs is disposed over a first lattice of pixels extending over the rows of pixels and the columns of pixels; and the second subset of 1x2 OCLs is disposed over a second lattice of pixels extending over the rows of pixels and the columns of pixels, the second lattice of pixels overlapping the first lattice of pixels.
7. The image capture device of claim 1, further comprising: a color filter array disposed over the array of pixels and having different subsets of filter elements disposed over different subsets of pixels in the array of pixels; wherein, each subset of filter elements has a different color.
8. The image capture device of claim 7, wherein: a first subset of filter elements in the different subsets of filter elements is disposed over a first subset of pixels in the array of pixels; a first subset of 1x2 OCLs in the array of 1x2 OCLs has a first orientation; a second subset of 1x2 OCLs in the array of 1x2 OCLs has a second orientation, orthogonal to the first orientation; and at least some 1x2 OCLs in the first subset of 1x2 OCLs and at least some 1x2 OCLs in the second subset of 1x2 OCLs are disposed over the first subset of pixels.
9. The image capture device of claim 8, wherein the different subsets of filter elements include red filter elements, blue filter elements, and green filter elements arranged in Bayer pattern rows and Bayer pattern columns.
10. The image capture device of claim 9, wherein all 1x2 OCLs in the array of 1x2 OCLs have a same orientation.
11. The image capture device of claim 7, wherein: a first subset of 1x2 OCLs in the array of 1x2 OCLs has a first orientation; and a second subset of 1x2 OCLs in the array of 1x2 OCLs has a second orientation, orthogonal to the first orientation.
12. The image capture device of claim 1, wherein, for at least one pixel in the array of pixels, the first photodetector has a first centroid; the second photodetector has a second centroid; and a 1x2 OCL in the array of 1x2 OCLs has a focus centroid that is not aligned with the first centroid and the second centroid.
13. The image capture device of claim 1, wherein, for at least one pixel in the array of pixels, the first photodetector has a first centroid; the second photodetector has a second centroid; and a 1x2 OCL in the array of 1x2 OCLs has a focus centroid that is not centered between the first centroid and the second centroid.
14. The image capture device of claim 1, wherein at least two 1x2 OCLs in the array of 1x2 OCLs have at least one of: different shapes, different sizes, or different curvatures.
15. The image capture device of claim 1, further comprising: a pixel readout circuit comprising, for each pixel in the array of pixels, a shared readout circuit associated with the 2x2 array of photodetectors for the pixel; and a set of charge transfer transistors, each charge transfer transistor operable to connect a photodetector in the 2x2 array of photodetectors to the shared readout circuit.
16. An image capture device, comprising: an array of photodetectors; and an array of oblong on-chip lenses (OCLs) disposed over the array of photodetectors; wherein, a different pair of photodetectors in the array of photodetectors is disposed under each of the oblong OCLs; and every photodetector in the array of photodetectors is disposed under a respective one of the oblong OCLs in the array of oblong OCLs.
Applications Claiming Priority (4)

Application Number | Priority Date | Filing Date | Title
US202063061074P | 2020-08-04 | 2020-08-04 |
US 63/061,074 | 2020-08-04 | |
US 17/380,852 | | 2021-07-20 |
US 17/380,852 (published as US11563910B2) | 2020-08-04 | 2021-07-20 | Image capture devices having phase detection auto-focus pixels



Family Cites Families (312)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4373804A (en) 1979-04-30 1983-02-15 Diffracto Ltd. Method and apparatus for electro-optically determining the dimension, location and attitude of objects
JPH07114472B2 (en) 1984-11-19 1995-12-06 株式会社ニコン Driving method for solid-state imaging device
US4686648A (en) 1985-12-03 1987-08-11 Hughes Aircraft Company Charge coupled device differencer
SE465551B (en) 1990-02-16 1991-09-30 Aake Oeberg DEVICE FOR DETERMINING A HEART AND RESPIRATORY FREQUENCY THROUGH PHOTOPLETISMOGRAPHICAL SEATING
US5105264A (en) 1990-09-28 1992-04-14 Eastman Kodak Company Color image sensor having an optimum exposure time for each color
US5329313A (en) 1992-04-01 1994-07-12 Intel Corporation Method and apparatus for real time compression and decompression of a digital motion video signal using a fixed Huffman table
US5550677A (en) 1993-02-26 1996-08-27 Donnelly Corporation Automatic rearview mirror system using a photosensor array
JP3358620B2 (en) 1993-04-09 2002-12-24 ソニー株式会社 Image encoding method and image encoding device
US5949483A (en) 1994-01-28 1999-09-07 California Institute Of Technology Active pixel sensor array with multiresolution readout
US5841126A (en) 1994-01-28 1998-11-24 California Institute Of Technology CMOS active pixel sensor type imaging system on a chip
US5471515A (en) 1994-01-28 1995-11-28 California Institute Of Technology Active pixel sensor with intra-pixel charge transfer
US5541402A (en) 1994-10-17 1996-07-30 At&T Corp. Imaging active pixel device having a non-destructive read-out gate
JP3284803B2 (en) 1994-12-16 2002-05-20 富士ゼロックス株式会社 Image input device
US5744807A (en) 1996-06-20 1998-04-28 Xerox Corporation Sensor array data line readout with reduced crosstalk
US6233013B1 (en) 1997-10-23 2001-05-15 Xerox Corporation Color readout system for an active pixel image sensor
US6714239B2 (en) 1997-10-29 2004-03-30 Eastman Kodak Company Active pixel sensor with programmable color balance
JP3667058B2 (en) 1997-11-19 2005-07-06 キヤノン株式会社 Photoelectric conversion device
US6008486A (en) 1997-12-31 1999-12-28 Gentex Corporation Wide dynamic range optical sensor
US6348929B1 (en) 1998-01-16 2002-02-19 Intel Corporation Scaling algorithm and architecture for integer scaling in video
US6040568A (en) 1998-05-06 2000-03-21 Raytheon Company Multipurpose readout integrated circuit with in cell adaptive non-uniformity correction and enhanced dynamic range
JP3697073B2 (en) 1998-08-05 2005-09-21 キヤノン株式会社 Imaging apparatus and imaging system using the same
US6956605B1 (en) 1998-08-05 2005-10-18 Canon Kabushiki Kaisha Image pickup apparatus
US7133073B1 (en) 1999-08-19 2006-11-07 Dialog Imaging Systems Gmbh Method and apparatus for color interpolation
US8310577B1 (en) 1999-08-19 2012-11-13 Youliza, Gehts B.V. Limited Liability Company Method and apparatus for color compensation
JP2001285717A (en) 2000-03-29 2001-10-12 Toshiba Corp Solid-state image pickup device
US6448550B1 (en) 2000-04-27 2002-09-10 Agilent Technologies, Inc. Method and apparatus for measuring spectral content of LED light source and control thereof
US6616613B1 (en) 2000-04-27 2003-09-09 Vitalsines International, Inc. Physiological signal monitoring system
JP3685686B2 (en) 2000-06-12 2005-08-24 三菱電機株式会社 Imaging area sensor and imaging apparatus
TW516184B (en) 2000-06-20 2003-01-01 Pixelplus Co Ltd CMOS active pixel for improving sensitivity
US6713796B1 (en) 2001-01-19 2004-03-30 Dalsa, Inc. Isolated photodiode
US7554067B2 (en) 2001-05-07 2009-06-30 Panavision Imaging Llc Scanning imager employing multiple chips with staggered pixels
JP4255223B2 (en) 2001-06-28 2009-04-15 イーストマン コダック カンパニー Correlated double sampling timing adjustment device
US7084914B2 (en) 2001-07-20 2006-08-01 Micron Technology, Inc. Variable pixel clock electronic shutter control
US6541751B1 (en) 2001-10-03 2003-04-01 Pixim Inc Time multiplexing image processing functions for noise reduction
KR100464821B1 (en) 2001-10-23 2005-01-17 임좌상 Method for estimating emotion using physiological signal
KR100455286B1 (en) 2002-01-11 2004-11-06 삼성전자주식회사 Method and apparatus for understanding the condition of animal using acquisition and analysis of physiological signal of the animal
US7906826B2 (en) 2002-02-05 2011-03-15 E-Phocus Many million pixel image sensor
KR100462182B1 (en) 2002-04-15 2004-12-16 삼성전자주식회사 Apparatus and method for detecting heart beat using ppg
US6816676B2 (en) 2002-04-19 2004-11-09 Hewlett-Packard Development Company, L.P. Adaptive control of LCD display utilizing imaging sensor measurements
US6670904B1 (en) 2002-08-22 2003-12-30 Micron Technology, Inc. Double-ramp ADC for CMOS sensors
US7786543B2 (en) 2002-08-27 2010-08-31 E-Phocus CDS capable sensor with photon sensing layer on active pixel circuit
US7525168B2 (en) 2002-08-27 2009-04-28 E-Phocus, Inc. CMOS sensor with electrodes across photodetectors at approximately equal potential
JP4403687B2 (en) 2002-09-18 2010-01-27 ソニー株式会社 Solid-state imaging device and drive control method thereof
US20040207836A1 (en) 2002-09-27 2004-10-21 Rajeshwar Chhibber High dynamic range optical inspection system and method
US7471315B2 (en) 2003-03-14 2008-12-30 Aptina Imaging Corporation Apparatus and method for detecting and compensating for illuminant intensity changes within an image
US7075049B2 (en) 2003-06-11 2006-07-11 Micron Technology, Inc. Dual conversion gain imagers
US20050026332A1 (en) 2003-07-29 2005-02-03 Fratti Roger A. Techniques for curvature control in power transistor devices
US6931269B2 (en) 2003-08-27 2005-08-16 Datex-Ohmeda, Inc. Multi-domain motion estimation and plethysmographic recognition using fuzzy neural-nets
US7115855B2 (en) 2003-09-05 2006-10-03 Micron Technology, Inc. Image sensor having pinned floating diffusion diode
JP4106554B2 (en) 2003-09-08 2008-06-25 ソニー株式会社 Imaging environment determination method and imaging apparatus
US7154075B2 (en) 2003-11-13 2006-12-26 Micron Technology, Inc. Method and apparatus for pixel signal binning and interpolation in column circuits of a sensor circuit
US7332786B2 (en) 2003-11-26 2008-02-19 Micron Technology, Inc. Anti-blooming storage pixel
JP4259998B2 (en) 2003-12-19 2009-04-30 三洋電機株式会社 Flicker detection device and imaging device
US7091466B2 (en) 2003-12-19 2006-08-15 Micron Technology, Inc. Apparatus and method for pixel binning in an image sensor
US7437013B2 (en) 2003-12-23 2008-10-14 General Instrument Corporation Directional spatial video noise reduction
US7446812B2 (en) 2004-01-13 2008-11-04 Micron Technology, Inc. Wide dynamic range operations for imaging
TW200607335A (en) 2004-04-21 2006-02-16 Qualcomm Inc Flicker detection for image sensing devices
KR100574890B1 (en) 2004-04-27 2006-04-27 매그나칩 반도체 유한회사 Image sensor and method for detection of flicker noise of the same
KR100578647B1 (en) 2004-04-27 2006-05-11 매그나칩 반도체 유한회사 Method for integration of image sensor
US7102117B2 (en) 2004-06-08 2006-09-05 Eastman Kodak Company Active pixel sensor cell with integrating varactor and method for using such cell
US7825973B2 (en) 2004-07-16 2010-11-02 Micron Technology, Inc. Exposure control for image sensors
US7880785B2 (en) 2004-07-21 2011-02-01 Aptina Imaging Corporation Rod and cone response sensor
JP4455215B2 (en) 2004-08-06 2010-04-21 キヤノン株式会社 Imaging device
JP4389737B2 (en) 2004-09-22 2009-12-24 セイコーエプソン株式会社 Solid-state imaging device and driving method thereof
US7259413B2 (en) 2004-09-28 2007-08-21 Micron Technology, Inc. High dynamic range image sensor
US20060103749A1 (en) 2004-11-12 2006-05-18 Xinping He Image sensor and pixel that has switchable capacitance at the floating node
US7555158B2 (en) 2004-12-07 2009-06-30 Electronics And Telecommunications Research Institute Apparatus for recovering background in image sequence and method thereof
US7502054B2 (en) 2004-12-20 2009-03-10 Pixim, Inc. Automatic detection of fluorescent flicker in video images
US7190039B2 (en) 2005-02-18 2007-03-13 Micron Technology, Inc. Microelectronic imagers with shaped image sensors and methods for manufacturing microelectronic imagers
JP4855704B2 (en) 2005-03-31 2012-01-18 株式会社東芝 Solid-state imaging device
JP4377840B2 (en) 2005-03-31 2009-12-02 イーストマン コダック カンパニー Digital camera
US7443421B2 (en) 2005-04-05 2008-10-28 Hewlett-Packard Development Company, L.P. Camera sensor
US20060244843A1 (en) 2005-04-29 2006-11-02 Bart Dierickx Illumination flicker detection
JP4207926B2 (en) 2005-05-13 2009-01-14 ソニー株式会社 Flicker correction method, flicker correction apparatus, and imaging apparatus
US7361877B2 (en) 2005-05-27 2008-04-22 Eastman Kodak Company Pinned-photodiode pixel with global shutter
TWI429066B (en) 2005-06-02 2014-03-01 Sony Corp Semiconductor image sensor module and manufacturing method thereof
US20060274161A1 (en) 2005-06-03 2006-12-07 Intel Corporation Method and apparatus to determine ambient light using a camera
US7415096B2 (en) 2005-07-26 2008-08-19 Jordan Valley Semiconductors Ltd. Curved X-ray reflector
KR100760142B1 (en) 2005-07-27 2007-09-18 매그나칩 반도체 유한회사 Stacked pixel for high resolution cmos image sensors
US8274715B2 (en) 2005-07-28 2012-09-25 Omnivision Technologies, Inc. Processing color and panchromatic pixels
US8139130B2 (en) 2005-07-28 2012-03-20 Omnivision Technologies, Inc. Image sensor with improved light sensitivity
JP4227152B2 (en) 2005-08-02 2009-02-18 三星電機株式会社 Active pixel array of CMOS image sensor
JP4904749B2 (en) 2005-09-08 2012-03-28 ソニー株式会社 Flicker reduction method, flicker reduction circuit, and imaging apparatus
KR100775058B1 (en) 2005-09-29 2007-11-08 삼성전자주식회사 Pixel Cell, Image Sensor Adopting The Pixel Cell, and Image Processing System Including The Image Sensor
US8032206B1 (en) 2005-10-20 2011-10-04 Pacesetter, Inc. Use of motion sensor for dynamic updating of heart detection threshold
KR100715932B1 (en) 2005-10-24 2007-05-08 (주) 픽셀플러스 Flicker Detecting Apparatus
US7211802B1 (en) 2005-12-30 2007-05-01 Eastman Kodak Company X-ray impingement event detection system and method for a digital radiography detector
US7626626B2 (en) 2006-01-13 2009-12-01 Micron Technology, Inc. Method and apparatus providing pixel storage gate charge sensing for electronic stabilization in imagers
US20070263099A1 (en) 2006-05-09 2007-11-15 Pixim Inc. Ambient Light Rejection In Digital Video Images
JP3996618B1 (en) 2006-05-11 2007-10-24 総吉 廣津 Semiconductor image sensor
US8026966B2 (en) 2006-08-29 2011-09-27 Micron Technology, Inc. Method, apparatus and system providing a storage gate pixel with high dynamic range
US7773138B2 (en) 2006-09-13 2010-08-10 Tower Semiconductor Ltd. Color pattern and pixel level binning for APS image sensor using 2×2 photodiode sharing scheme
US8194148B2 (en) 2006-09-14 2012-06-05 Nikon Corporation Image processing device, electronic camera and image processing program
WO2008055042A2 (en) 2006-10-30 2008-05-08 Wesleyan University Apparatus and method for real time image compression for particle tracking
KR20080041912A (en) 2006-11-08 2008-05-14 삼성전자주식회사 Pixel circuit of cmos image sensor capable of controlling sensitivity
KR100828943B1 (en) 2006-12-19 2008-05-13 (주)실리콘화일 3 transistors 4 shared step & repeat unit cell and 3 transistors 4 shared image sensor, data storage device, semiconductor process mask, semiconductor wafer including the unit cells
US7742090B2 (en) 2006-12-22 2010-06-22 Palo Alto Research Center Incorporated Flexible segmented image sensor
KR20080069851A (en) 2007-01-24 2008-07-29 삼성전자주식회사 Biosignal-measuring sensor instrument and headset having the sensor instrument and pendant having the sensor instrument
US7796171B2 (en) 2007-02-16 2010-09-14 Flir Advanced Imaging Systems, Inc. Sensor-based gamma correction of a digital camera
KR101085802B1 (en) 2007-03-05 2011-11-22 르네사스 일렉트로닉스 가부시키가이샤 Imaging apparatus and flicker detection method
KR100835892B1 (en) 2007-03-26 2008-06-09 (주)실리콘화일 Chip stacking image sensor
KR100853195B1 (en) 2007-04-10 2008-08-21 삼성전자주식회사 Image sensor
JP4935486B2 (en) 2007-04-23 2012-05-23 ソニー株式会社 Solid-state imaging device, driving method for solid-state imaging device, signal processing method for solid-state imaging device, and imaging device
JP5163935B2 (en) 2007-05-17 2013-03-13 ソニー株式会社 Image sensor
KR100871981B1 (en) 2007-06-25 2008-12-08 주식회사 동부하이텍 Image sensor and method for manufacturing thereof
KR100872991B1 (en) 2007-06-25 2008-12-08 주식회사 동부하이텍 Image sensor and method for manufacturing the same
JP2009021809A (en) 2007-07-11 2009-01-29 Canon Inc Driving method of imaging device, imaging device, and imaging system
US8098372B2 (en) 2007-07-23 2012-01-17 Applied Materials South East Asia Pte. Ltd. Optical inspection tool featuring multiple speed modes
JP2009054870A (en) 2007-08-28 2009-03-12 Sanyo Electric Co Ltd Imaging apparatus
US7873236B2 (en) 2007-08-28 2011-01-18 General Electric Company Systems, methods and apparatus for consistency-constrained filtered backprojection for out-of-focus artifacts in digital tomosythesis
KR100887887B1 (en) 2007-11-06 2009-03-06 주식회사 동부하이텍 An image sensor
JP5163068B2 (en) 2007-11-16 2013-03-13 株式会社ニコン Imaging device
JP4971956B2 (en) 2007-11-27 2012-07-11 キヤノン株式会社 Flicker correction apparatus, flicker correction method, and imaging apparatus
US20090146234A1 (en) 2007-12-06 2009-06-11 Micron Technology, Inc. Microelectronic imaging units having an infrared-absorbing layer and associated systems and methods
JP5180795B2 (en) 2007-12-10 2013-04-10 キヤノン株式会社 Imaging apparatus and control method thereof
US8259228B2 (en) 2007-12-10 2012-09-04 Ati Technologies Ulc Method and apparatus for high quality video motion adaptive edge-directional deinterlacing
US7952635B2 (en) 2007-12-19 2011-05-31 Teledyne Licensing, Llc Low noise readout apparatus and method with snapshot shutter and correlated double sampling
JP5026951B2 (en) 2007-12-26 2012-09-19 オリンパスイメージング株式会社 Imaging device driving device, imaging device driving method, imaging device, and imaging device
JP5111100B2 (en) 2007-12-28 2012-12-26 キヤノン株式会社 Image processing apparatus, image processing method, program, and storage medium
US9017748B2 (en) 2007-12-28 2015-04-28 Kraft Foods Group Brands Llc Potassium fortification in foodstuffs
EP2079229B1 (en) 2008-01-10 2011-09-14 Stmicroelectronics Sa Pixel circuit for global electronic shutter
US8227844B2 (en) 2008-01-14 2012-07-24 International Business Machines Corporation Low lag transfer gate device
US20090201400A1 (en) 2008-02-08 2009-08-13 Omnivision Technologies, Inc. Backside illuminated image sensor with global shutter and storage capacitor
KR20090087644A (en) 2008-02-13 2009-08-18 삼성전자주식회사 Pixel circuit array
JP2009212909A (en) 2008-03-05 2009-09-17 Sharp Corp Solid-state imaging apparatus, flicker detection method for solid-state imaging apparatus, control program, readable recording medium, and electronic information device
JP5568880B2 (en) 2008-04-03 2014-08-13 ソニー株式会社 Solid-state imaging device, driving method of solid-state imaging device, and electronic apparatus
US8116540B2 (en) 2008-04-04 2012-02-14 Validity Sensors, Inc. Apparatus and method for reducing noise in fingerprint sensing circuits
JP4494492B2 (en) 2008-04-09 2010-06-30 キヤノン株式会社 Solid-state imaging device and driving method of solid-state imaging device
CA2628792A1 (en) 2008-04-10 2009-10-10 Chaji G. Reza High dynamic range active pixel sensor
KR101647493B1 (en) 2008-05-14 2016-08-10 하트마일즈, 엘엘씨 Physical activity monitor and data collection unit
JP5188275B2 (en) 2008-06-06 2013-04-24 キヤノン株式会社 Solid-state imaging device, driving method thereof, and imaging system
KR20100008239A (en) 2008-07-15 2010-01-25 (주)에스엔티 Eliminating method of motion artifact from ppg signal
JP5300356B2 (en) 2008-07-18 2013-09-25 キヤノン株式会社 IMAGING DEVICE AND IMAGING DEVICE CONTROL METHOD
JP2010080604A (en) 2008-09-25 2010-04-08 Panasonic Corp Solid state imaging apparatus and driving method thereof
JP2010113230A (en) 2008-11-07 2010-05-20 Sony Corp Pixel circuit, display device and electronic equipment
JP2010114834A (en) 2008-11-10 2010-05-20 Olympus Imaging Corp Imaging apparatus
JP5254762B2 (en) 2008-11-28 2013-08-07 キヤノン株式会社 Imaging apparatus, imaging system, and signal correction method in imaging apparatus
KR20100065084A (en) 2008-12-05 2010-06-15 한국전자통신연구원 Apparatus for measuring motion noise robust pulse wave and method thereof
US8164669B2 (en) 2008-12-19 2012-04-24 Truesense Imaging, Inc. Charge-coupled device image sensor with efficient binning of same-color pixels
US8340407B2 (en) 2009-01-14 2012-12-25 Cisco Technology, Inc. System and method for image demosaicing
US20120159996A1 (en) 2010-12-28 2012-06-28 Gary Edwin Sutton Curved sensor formed from silicon fibers
US8184188B2 (en) 2009-03-12 2012-05-22 Micron Technology, Inc. Methods and apparatus for high dynamic operation of a pixel cell
JP5347999B2 (en) 2009-03-12 2013-11-20 ソニー株式会社 Solid-state imaging device, manufacturing method thereof, and imaging apparatus
JP4835710B2 (en) 2009-03-17 2011-12-14 ソニー株式会社 Solid-state imaging device, method for manufacturing solid-state imaging device, driving method for solid-state imaging device, and electronic apparatus
US8140143B2 (en) 2009-04-16 2012-03-20 Massachusetts Institute Of Technology Washable wearable biosensor
US8089036B2 (en) 2009-04-30 2012-01-03 Omnivision Technologies, Inc. Image sensor with global shutter and in pixel storage transistor
JP2011004390A (en) 2009-05-18 2011-01-06 Canon Inc Imaging device, imaging system, and method for driving imaging device
US8350940B2 (en) 2009-06-08 2013-01-08 Aptina Imaging Corporation Image sensors and color filter arrays for charge summing and interlaced readout modes
CN101567977B (en) 2009-06-09 2013-09-18 北京中星微电子有限公司 Flicker detection method and device thereof
KR101597785B1 (en) 2009-07-14 2016-02-25 삼성전자주식회사 Image sensor and image processing method
KR101605046B1 (en) 2009-07-29 2016-03-21 삼성전자주식회사 Single gate pixel and operating method for single gate pixel
US8755854B2 (en) 2009-07-31 2014-06-17 Nellcor Puritan Bennett Ireland Methods and apparatus for producing and using lightly filtered photoplethysmograph signals
JP5625284B2 (en) 2009-08-10 2014-11-19 ソニー株式会社 Solid-state imaging device, driving method of solid-state imaging device, and electronic apparatus
TWI423246B (en) 2009-08-21 2014-01-11 Primax Electronics Ltd Image processing method and apparatus thereof
JP2011049697A (en) 2009-08-25 2011-03-10 Panasonic Corp Flicker detection device, imaging device, flicker detection program, and flicker detection method
US8619163B2 (en) 2009-09-18 2013-12-31 Canon Kabushiki Kaisha Solid state imaging using a correction parameter for correcting a cross talk between adjacent pixels
US9066660B2 (en) 2009-09-29 2015-06-30 Nellcor Puritan Bennett Ireland Systems and methods for high-pass filtering a photoplethysmograph signal
US8194165B2 (en) 2009-09-30 2012-06-05 Truesense Imaging, Inc. Methods for capturing and reading out images from an image sensor
US20110080500A1 (en) 2009-10-05 2011-04-07 Hand Held Products, Inc. Imaging terminal, imaging sensor having multiple reset and/or multiple read mode and methods for operating the same
JP4881987B2 (en) 2009-10-06 2012-02-22 キヤノン株式会社 Solid-state imaging device and imaging device
JP2011091775A (en) 2009-10-26 2011-05-06 Toshiba Corp Solid-state image pickup device
WO2011053711A1 (en) 2009-10-30 2011-05-05 Invisage Technologies, Inc. Systems and methods for color binning
JP5537905B2 (en) * 2009-11-10 2014-07-02 富士フイルム株式会社 Imaging device and imaging apparatus
FI20096232A0 (en) 2009-11-23 2009-11-23 Valtion Teknillinen Physical activity-based control for a device
TWI515885B (en) 2009-12-25 2016-01-01 新力股份有限公司 Semiconductor device and method of manufacturing the same, and electronic apparatus
US20110156197A1 (en) 2009-12-31 2011-06-30 Tivarus Cristian A Interwafer interconnects for stacked CMOS image sensors
US8330829B2 (en) 2009-12-31 2012-12-11 Microsoft Corporation Photographic flicker detection and compensation
US8982260B2 (en) 2010-02-11 2015-03-17 Idatamap Pty. Ltd. Image matching, data compression and tracking architectures
JP2011216673A (en) 2010-03-31 2011-10-27 Sony Corp Solid-state imaging device, method for manufacturing of solid-state imaging device, and electronic apparatus
US9451887B2 (en) 2010-03-31 2016-09-27 Nellcor Puritan Bennett Ireland Systems and methods for measuring electromechanical delay of the heart
CN101803925B (en) 2010-03-31 2012-01-04 上海交通大学 Monitoring device of blood oxygen saturation in motion state
JP5641287B2 (en) 2010-03-31 2014-12-17 ソニー株式会社 Solid-state imaging device, driving method of solid-state imaging device, and electronic apparatus
JP5516960B2 (en) 2010-04-02 2014-06-11 ソニー株式会社 Solid-state imaging device, driving method of solid-state imaging device, and electronic apparatus
JP5468133B2 (en) 2010-05-14 2014-04-09 パナソニック株式会社 Solid-state imaging device
KR101229600B1 (en) 2010-05-14 2013-02-04 가시오게산키 가부시키가이샤 Image capturing apparatus and camera shake correction method, and computer-readable medium
JP5644451B2 (en) 2010-05-25 2014-12-24 株式会社リコー Image processing apparatus, image processing method, and imaging apparatus
HUE039688T2 (en) 2010-06-01 2019-01-28 Boly Media Comm Shenzhen Co Multispectral photoreceptive device and sampling method thereof
KR101198249B1 (en) 2010-07-07 2012-11-07 에스케이하이닉스 주식회사 Column circuit and pixel binning circuit of image sensor
US8338856B2 (en) 2010-08-10 2012-12-25 Omnivision Technologies, Inc. Backside illuminated image sensor with stressed film
US20120092541A1 (en) 2010-10-19 2012-04-19 Nokia Corporation Method and apparatus for ambient light measurement system
JP5739640B2 (en) 2010-10-20 2015-06-24 キヤノン株式会社 Imaging device and imaging apparatus
CN102451160A (en) 2010-10-22 2012-05-16 夏落 Preparation method of long-circulating nanoparticle
US9857469B2 (en) 2010-10-22 2018-01-02 Heptagon Micro Optics Pte. Ltd. System and method for multi TOF camera operation using phase hopping
JP5589760B2 (en) 2010-10-27 2014-09-17 ソニー株式会社 Image processing apparatus, imaging apparatus, image processing method, and program.
US10120446B2 (en) 2010-11-19 2018-11-06 Apple Inc. Haptic input device
JP5721405B2 (en) 2010-11-22 2015-05-20 キヤノン株式会社 Imaging system, control method thereof, and program
JP5724322B2 (en) 2010-11-24 2015-05-27 ソニー株式会社 Method for manufacturing solid-state imaging device
JP5673063B2 (en) 2010-12-15 2015-02-18 ソニー株式会社 Solid-state imaging device, driving method, and electronic apparatus
US8723094B2 (en) 2010-12-21 2014-05-13 Sionyx, Inc. Photodetecting imager devices having correlated double sampling and associated methods
US8723975B2 (en) 2011-01-24 2014-05-13 Aptina Imaging Corporation High-dynamic-range imaging devices
US8803990B2 (en) 2011-01-25 2014-08-12 Aptina Imaging Corporation Imaging system with multiple sensors for producing high-dynamic-range images
US8742309B2 (en) * 2011-01-28 2014-06-03 Aptina Imaging Corporation Imagers with depth sensing capabilities
JP5426587B2 (en) 2011-01-31 2014-02-26 株式会社東芝 Solid-state imaging device and pixel averaging processing method thereof
WO2012105259A1 (en) 2011-02-04 2012-08-09 パナソニック株式会社 Solid-state image capture device and method of driving same
KR102025522B1 (en) 2011-03-10 2019-11-26 사이오닉스, 엘엘씨 Three dimensional sensors, systems, and associated methods
JP5430795B2 (en) 2011-04-01 2014-03-05 富士フイルム株式会社 Imaging apparatus and program
US9088727B2 (en) 2011-04-06 2015-07-21 Pelco, Inc. Spatially-varying flicker detection
KR101294386B1 (en) 2011-04-13 2013-08-08 엘지이노텍 주식회사 Pixel, pixel arrary, image sensor including the pixel arrary
US8575531B2 (en) 2011-04-26 2013-11-05 Aptina Imaging Corporation Image sensor array for back side illumination with global shutter using a junction gate photodiode
CN103503438A (en) 2011-05-24 2014-01-08 索尼公司 Solid-state image pick-up device and camera system
US8643132B2 (en) 2011-06-08 2014-02-04 Omnivision Technologies, Inc. In-pixel high dynamic range imaging
JP5885403B2 (en) 2011-06-08 2016-03-15 キヤノン株式会社 Imaging device
JP5821315B2 (en) 2011-06-21 2015-11-24 ソニー株式会社 Electronic device, driving method of electronic device
US8908062B2 (en) 2011-06-30 2014-12-09 Nikon Corporation Flare determination apparatus, image processing apparatus, and storage medium storing flare determination program
GB2492387B (en) 2011-06-30 2017-07-19 Cmosis Nv Pixel array with individual exposure control for a pixel or pixel region
TWI505453B (en) 2011-07-12 2015-10-21 Sony Corp Solid-state imaging device, method for driving the same, method for manufacturing the same, and electronic device
JP2013051523A (en) 2011-08-30 2013-03-14 Sharp Corp Flicker detection device, flicker detection method, control program, readable recording medium, solid-state imaging device, multi-eye imaging device, and electronic information apparatus
JP2013055500A (en) 2011-09-02 2013-03-21 Sony Corp Solid state imaging device and camera system
JP5935274B2 (en) 2011-09-22 2016-06-15 ソニー株式会社 Solid-state imaging device, control method for solid-state imaging device, and control program for solid-state imaging device
JP5945395B2 (en) 2011-10-13 2016-07-05 オリンパス株式会社 Imaging device
US8594170B2 (en) 2011-10-24 2013-11-26 Sigear Europe Sarl Clock masking scheme in a mixed-signal system
JP5764466B2 (en) 2011-11-04 2015-08-19 ルネサスエレクトロニクス株式会社 Solid-state imaging device
US8982237B2 (en) 2011-12-09 2015-03-17 Htc Corporation Portable electronic device with auto-exposure control adaptive to environment brightness and image capturing method using the same
CN103165103A (en) 2011-12-12 2013-06-19 深圳富泰宏精密工业有限公司 Brightness adjustment system of electronic device display screen and brightness adjustment method of electronic device display screen
JP6239820B2 (en) 2011-12-19 2017-11-29 キヤノン株式会社 Imaging apparatus and control method thereof
JP5497874B2 (en) 2011-12-22 2014-05-21 富士フイルム株式会社 Radiation image detector, radiation image capturing apparatus, and radiation image capturing system
KR101386649B1 (en) 2011-12-26 2014-09-23 전자부품연구원 Game Apparatus Applying User state And Method Providing Thereof
CN104041009B (en) 2011-12-28 2016-02-03 富士胶片株式会社 Imaging apparatus and camera head
EP2624569B1 (en) 2012-02-06 2016-09-28 Harvest Imaging bvba Method for correcting image data from an image sensor having image pixels and non-image pixels, and image sensor implementing the same
JP6151530B2 (en) 2012-02-29 2017-06-21 株式会社半導体エネルギー研究所 Image sensor, camera, and surveillance system
JP6164846B2 (en) 2012-03-01 2017-07-19 キヤノン株式会社 Imaging device, imaging system, and driving method of imaging device
US20150062391A1 (en) 2012-03-30 2015-03-05 Nikon Corporation Image sensor, photographing method, and image-capturing device
FR2989518A1 (en) 2012-04-13 2013-10-18 St Microelectronics Crolles 2 Method for manufacturing integrated image sensor, involves forming pixels at end of columns, and deforming structure such that another end of each of columns is brought closer or separated to form surface in shape of polyhedral cap
US9270906B2 (en) 2012-05-02 2016-02-23 Semiconductor Components Industries, Llc Exposure time selection using stacked-chip image sensors
GB201209412D0 (en) 2012-05-28 2012-07-11 Obs Medical Ltd Narrow band feature extraction from cardiac signals
GB201209413D0 (en) 2012-05-28 2012-07-11 Obs Medical Ltd Respiration rate extraction from cardiac signals
US9420208B2 (en) 2012-07-13 2016-08-16 Canon Kabushiki Kaisha Driving method for image pickup apparatus and driving method for image pickup system
US9521337B1 (en) 2012-07-13 2016-12-13 Rambus Inc. Reset-marking pixel sensor
US10334181B2 (en) 2012-08-20 2019-06-25 Microsoft Technology Licensing, Llc Dynamically curved sensor for optical zoom lens
US9349769B2 (en) 2012-08-22 2016-05-24 Taiwan Semiconductor Manufacturing Company, Ltd. Image sensor comprising reflective guide layer and method of forming the same
US8817154B2 (en) 2012-08-30 2014-08-26 Omnivision Technologies, Inc. Image sensor with fixed potential output transistor
JP2014057268A (en) 2012-09-13 2014-03-27 Toshiba Corp Imaging apparatus
JP6012375B2 (en) 2012-09-28 2016-10-25 株式会社メガチップス Pixel interpolation processing device, imaging device, program, and integrated circuit
US9700240B2 (en) 2012-12-14 2017-07-11 Microsoft Technology Licensing, Llc Physical activity inference from environmental metrics
US9380245B1 (en) 2013-02-14 2016-06-28 Rambus Inc. Conditional-reset image sensor with analog counter array
US8934030B2 (en) 2013-02-20 2015-01-13 Hewlett-Packard Development Company, L.P. Suppressing flicker in digital images
JP6087674B2 (en) 2013-02-27 2017-03-01 キヤノン株式会社 Imaging device
US9293500B2 (en) 2013-03-01 2016-03-22 Apple Inc. Exposure control for image sensors
US9276031B2 (en) 2013-03-04 2016-03-01 Apple Inc. Photodiode with different electric potential regions for image sensors
KR102034482B1 (en) 2013-03-04 2019-10-21 삼성전자주식회사 Image sensor and method of forming the same
US9041837B2 (en) 2013-03-05 2015-05-26 Apple Inc. Image sensor with reduced blooming
KR20140109668A (en) 2013-03-06 2014-09-16 삼성전자주식회사 Method and system for detecting flicker
US9741754B2 (en) 2013-03-06 2017-08-22 Apple Inc. Charge transfer circuit with storage nodes in image sensors
JP6172978B2 (en) 2013-03-11 2017-08-02 キヤノン株式会社 IMAGING DEVICE, IMAGING SYSTEM, SIGNAL PROCESSING DEVICE, PROGRAM, AND STORAGE MEDIUM
KR102009189B1 (en) 2013-03-12 2019-08-09 삼성전자주식회사 Image Sensor and Method of simultaneously Reading out Two - Row Pixels thereof
US9549099B2 (en) 2013-03-12 2017-01-17 Apple Inc. Hybrid image sensor
US9319611B2 (en) 2013-03-14 2016-04-19 Apple Inc. Image sensor with flexible pixel summing
EP2974280B1 (en) 2013-03-15 2021-11-24 Rambus Inc. Threshold-monitoring, conditional-reset image sensor
US9066017B2 (en) 2013-03-25 2015-06-23 Google Inc. Viewfinder display based on metering images
US9160949B2 (en) 2013-04-01 2015-10-13 Omnivision Technologies, Inc. Enhanced photon detection device with biased deep trench isolation
JP6104049B2 (en) 2013-05-21 2017-03-29 オリンパス株式会社 Image processing apparatus, image processing method, and image processing program
US9154750B2 (en) 2013-05-28 2015-10-06 Omnivision Technologies, Inc. Correction of image sensor fixed-pattern noise (FPN) due to color filter pattern
JP6639385B2 (en) 2013-06-11 2020-02-05 ラムバス・インコーポレーテッド Reset image sensor with split gate condition
JP2015012127A (en) 2013-06-28 2015-01-19 ソニー株式会社 Solid state image sensor and electronic apparatus
US9344649B2 (en) 2013-08-23 2016-05-17 Semiconductor Components Industries, Llc Floating point image sensors with different integration times
US9001251B2 (en) 2013-09-10 2015-04-07 Rambus Inc. Oversampled image sensor with conditional pixel readout
EP3481055B1 (en) 2013-10-02 2022-07-13 Nikon Corporation Imaging element and imaging apparatus
US9596423B1 (en) 2013-11-21 2017-03-14 Apple Inc. Charge summing in an image sensor
WO2015084991A1 (en) 2013-12-04 2015-06-11 Rambus Inc. High dynamic-range image sensor
US9596420B2 (en) 2013-12-05 2017-03-14 Apple Inc. Image sensor having pixels with different integration periods
US9473706B2 (en) 2013-12-09 2016-10-18 Apple Inc. Image sensor flicker detection
KR102135586B1 (en) 2014-01-24 2020-07-20 엘지전자 주식회사 Mobile terminal and method for controlling the same
JP6480712B2 (en) 2014-02-06 2019-03-13 キヤノン株式会社 Imaging apparatus and control method thereof
US10285626B1 (en) 2014-02-14 2019-05-14 Apple Inc. Activity identification using an optical heart rate monitor
US9277144B2 (en) 2014-03-12 2016-03-01 Apple Inc. System and method for estimating an ambient light condition using an image sensor and field-of-view compensation
US9232150B2 (en) 2014-03-12 2016-01-05 Apple Inc. System and method for estimating an ambient light condition using an image sensor
US9584743B1 (en) 2014-03-13 2017-02-28 Apple Inc. Image sensor with auto-focus and pixel cross-talk compensation
JP6483725B2 (en) 2014-04-07 2019-03-13 サムスン エレクトロニクス カンパニー リミテッド Method for sensing optical events, optical event sensor therefor, and distance measuring mobile device
US9497397B1 (en) 2014-04-08 2016-11-15 Apple Inc. Image sensor with auto-focus and color ratio cross-talk comparison
US9538106B2 (en) 2014-04-25 2017-01-03 Apple Inc. Image sensor having a uniform digital power signature
US9445018B2 (en) 2014-05-01 2016-09-13 Semiconductor Components Industries, Llc Imaging systems with phase detection pixels
US9686485B2 (en) 2014-05-30 2017-06-20 Apple Inc. Pixel binning in an image sensor
JP6380974B2 (en) 2014-06-18 2018-08-29 Olympus Corporation Imaging element and imaging apparatus
US10044954B2 (en) 2014-07-25 2018-08-07 Sony Corporation Solid-state imaging device, AD converter, and electronic apparatus
KR102269600B1 (en) 2014-08-05 2021-06-25 Samsung Electronics Co., Ltd. An imaging sensor capable of detecting phase difference of focus
US20160050379A1 (en) 2014-08-18 2016-02-18 Apple Inc. Curved Light Sensor
US9894304B1 (en) 2014-08-18 2018-02-13 Rambus Inc. Line-interleaved image sensors
KR102212138B1 (en) 2014-08-19 2021-02-04 Samsung Electronics Co., Ltd. Unit pixel for image sensor and pixel array comprising the same
US9654689B2 (en) 2014-08-19 2017-05-16 Apple Inc. Method and apparatus for camera actuator driver mode control synchronized with imaging sensor frame
JP6369233B2 (en) * 2014-09-01 2018-08-08 Sony Corporation Solid-state imaging device, signal processing method thereof, and electronic device
KR102336665B1 (en) 2014-10-02 2021-12-07 Samsung Electronics Co., Ltd. CMOS image sensor for reducing dead zone
JP2016127389A (en) 2014-12-26 2016-07-11 Canon Inc. Image processor and control method thereof
CN107533210A (en) 2015-01-14 2018-01-02 InVisage Technologies, Inc. Phase detection auto-focus
US10217889B2 (en) 2015-01-27 2019-02-26 Ladarsystems, Inc. Clamped avalanche photodiode
US9455285B2 (en) 2015-02-04 2016-09-27 Semiconductor Components Industries, Llc Image sensors with phase detection pixels
KR20160103302A (en) 2015-02-24 2016-09-01 SK Hynix Inc. Ramp voltage generator and image sensing device with the same
KR20160109002A (en) 2015-03-09 2016-09-21 SK Hynix Inc. Preamplifier using output polarity changing, and comparator and analog-to-digital converting apparatus using the same
US9749556B2 (en) 2015-03-24 2017-08-29 Semiconductor Components Industries, Llc Imaging systems having image sensor pixel arrays with phase detection capabilities
JP6551882B2 (en) 2015-06-08 2019-07-31 Panasonic IP Management Co., Ltd. Imaging apparatus and signal processing circuit
US9584744B2 (en) 2015-06-23 2017-02-28 Semiconductor Components Industries, Llc Image sensors with voltage-biased trench isolation structures
KR20170019542A (en) 2015-08-11 2017-02-22 Samsung Electronics Co., Ltd. Auto-focus image sensor
US9819890B2 (en) 2015-08-17 2017-11-14 Omnivision Technologies, Inc. Readout circuitry to mitigate column fixed pattern noise of an image sensor
US9936151B2 (en) 2015-10-16 2018-04-03 Capsovision Inc Single image sensor for capturing mixed structured-light images and regular images
KR102386471B1 (en) 2015-10-28 2022-04-15 SK Hynix Inc. Ramp voltage generator, image sensing device with the ramp voltage generator and method of driving the image sensing device
KR20170056909A (en) 2015-11-16 2017-05-24 Samsung Electronics Co., Ltd. Image sensor and electronic device having the same
KR102545170B1 (en) 2015-12-09 2023-06-19 Samsung Electronics Co., Ltd. Image sensor and method of fabricating the same
JP6748454B2 (en) 2016-03-10 2020-09-02 Canon Inc. Imaging device, control method thereof, program, and storage medium
JP6735582B2 (en) 2016-03-17 2020-08-05 Canon Inc. Imaging element, driving method thereof, and imaging apparatus
US9912883B1 (en) 2016-05-10 2018-03-06 Apple Inc. Image sensor with calibrated column analog-to-digital converters
US10775605B2 (en) 2016-06-16 2020-09-15 Intel Corporation Combined biometrics capture system with ambient free IR
US10271037B2 (en) 2017-01-20 2019-04-23 Semiconductor Components Industries, Llc Image sensors with hybrid three-dimensional imaging
US10431608B2 (en) 2017-04-13 2019-10-01 Omnivision Technologies, Inc. Dual conversion gain high dynamic range readout for comparator of double ramp analog to digital converter
CN107105141B (en) 2017-04-28 2019-06-28 Guangdong OPPO Mobile Telecommunications Corp., Ltd. Imaging sensor, image processing method, imaging device and mobile terminal
JP2018196083A (en) 2017-05-22 2018-12-06 Olympus Corporation Image processing system
US10440301B2 (en) 2017-09-08 2019-10-08 Apple Inc. Image capture device, pixel, and method providing improved phase detection auto-focus performance
JP7067907B2 (en) 2017-12-01 2022-05-16 Canon Inc. Solid-state image sensor and signal processing device
WO2019130963A1 (en) 2017-12-26 2019-07-04 Sony Semiconductor Solutions Corporation Solid-state imaging element, comparator, and electronic device
US10848693B2 (en) 2018-07-18 2020-11-24 Apple Inc. Image flare detection using asymmetric pixels
US11019294B2 (en) 2018-07-18 2021-05-25 Apple Inc. Seamless readout mode transitions in image sensors
US10854647B2 (en) 2018-11-30 2020-12-01 Taiwan Semiconductor Manufacturing Co., Ltd. Photo diode with dual backside deep trench isolation depth
CN111918002B (en) 2019-05-10 2022-02-18 Huawei Technologies Co., Ltd. Control method of shape memory alloy motor in camera device and camera device
US11252381B2 (en) * 2019-12-11 2022-02-15 Omnivision Technologies, Inc. Image sensor with shared microlens

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150350583A1 (en) * 2014-06-03 2015-12-03 Semiconductor Components Industries, Llc Imaging systems having image sensor pixel arrays with sub-pixel resolution capabilities
WO2019102887A1 (en) * 2017-11-22 2019-05-31 Sony Semiconductor Solutions Corporation Solid-state imaging element and electronic device
EP3716618A1 (en) * 2017-11-22 2020-09-30 Sony Semiconductor Solutions Corporation Solid-state imaging element and electronic device

Also Published As

Publication number Publication date
US20220046196A1 (en) 2022-02-10
US11563910B2 (en) 2023-01-24

Similar Documents

Publication Title
US10440301B2 (en) Image capture device, pixel, and method providing improved phase detection auto-focus performance
US10015416B2 (en) Imaging systems with high dynamic range and phase detection pixels
US10284769B2 (en) Image sensor with in-pixel depth sensing
US10498990B2 (en) Imaging systems with high dynamic range and phase detection pixels
US10158843B2 (en) Imaging pixels with depth sensing capabilities
US8478123B2 (en) Imaging devices having arrays of image sensors and lenses with multiple aperture sizes
US10032810B2 (en) Image sensor with dual layer photodiode structure
US20170374306A1 (en) Image sensor system with an automatic focus function
US20180301484A1 (en) Image sensors with high dynamic range and autofocusing hexagonal pixels
US10593712B2 (en) Image sensors with high dynamic range and infrared imaging toroidal pixels
CN108810430B (en) Imaging system and forming method thereof
US20190081098A1 (en) Image sensors with in-pixel lens arrays
US20110109776A1 (en) Imaging device and imaging apparatus
KR20160062725A (en) RGBC color filter array patterns to minimize color aliasing
US11563910B2 (en) Image capture devices having phase detection auto-focus pixels
US20150116527A1 (en) Compact array camera modules having an extended field of view from which depth information can be extracted
US9386203B2 (en) Compact spacer in multi-lens array module
CN212323001U (en) Image sensor pixel and image sensor pixel array
US9392198B2 (en) Backside illuminated imaging systems having auto-focus pixels
US20230319435A1 (en) Image sensing device including light shielding pattern
US20230090827A1 (en) Image Capture Devices Having Phase Detection Auto-Focus Pixels
US20210280623A1 (en) Phase detection pixels with stacked microlenses
JP2011061081A (en) Image sensor
CN113259557A (en) Ultraviolet image sensor and ultraviolet imaging device

Legal Events

Date Code Title Description
121 Ep: the EPO has been informed by WIPO that EP was designated in this application
    Ref document number: 21755326
    Country of ref document: EP
    Kind code of ref document: A1
NENP Non-entry into the national phase
    Ref country code: DE
122 Ep: PCT application non-entry in European phase
    Ref document number: 21755326
    Country of ref document: EP
    Kind code of ref document: A1