WO2016194577A1 - Imaging element, imaging method, program, and electronic device - Google Patents

Imaging element, imaging method, program, and electronic device

Info

Publication number
WO2016194577A1
Authority
WO
WIPO (PCT)
Prior art keywords
phase difference
photoelectric conversion
pixel
difference detection
conversion layer
Application number
PCT/JP2016/064213
Other languages
English (en)
Japanese (ja)
Inventor
杉崎 太郎
Original Assignee
ソニー株式会社 (Sony Corporation)
Application filed by ソニー株式会社 (Sony Corporation)
Publication of WO2016194577A1

Classifications

    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B7/00 Mountings, adjusting means, or light-tight connections, for optical elements
    • G02B7/28 Systems for automatic generation of focusing signals
    • G02B7/34 Systems for automatic generation of focusing signals using different areas in a pupil plane
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00 Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/70 SSIS architectures; Circuits associated therewith
    • H04N25/703 SSIS architectures incorporating pixels for producing signals other than image signals
    • H04N25/704 Pixels specially adapted for focusing, e.g. phase difference pixel sets
    • G PHYSICS
    • G03 PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B13/00 Viewfinders; Focusing aids for cameras; Means for focusing for cameras; Autofocus systems for cameras
    • G03B13/32 Means for focusing
    • G03B13/34 Power focusing
    • G03B13/36 Autofocus systems
    • H ELECTRICITY
    • H01 ELECTRIC ELEMENTS
    • H01L SEMICONDUCTOR DEVICES NOT COVERED BY CLASS H10
    • H01L27/00 Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate
    • H01L27/14 Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate including semiconductor components sensitive to infrared radiation, light, electromagnetic radiation of shorter wavelength or corpuscular radiation and specially adapted either for the conversion of the energy of such radiation into electrical energy or for the control of electrical energy by such radiation
    • H ELECTRICITY
    • H01 ELECTRIC ELEMENTS
    • H01L SEMICONDUCTOR DEVICES NOT COVERED BY CLASS H10
    • H01L27/00 Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate
    • H01L27/14 Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate including semiconductor components sensitive to infrared radiation, light, electromagnetic radiation of shorter wavelength or corpuscular radiation and specially adapted either for the conversion of the energy of such radiation into electrical energy or for the control of electrical energy by such radiation
    • H01L27/144 Devices controlled by radiation
    • H01L27/146 Imager structures
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/10 Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from different wavelengths
    • H04N23/12 Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from different wavelengths with one sensor only
    • H ELECTRICITY
    • H01 ELECTRIC ELEMENTS
    • H01L SEMICONDUCTOR DEVICES NOT COVERED BY CLASS H10
    • H01L27/00 Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate
    • H01L27/14 Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate including semiconductor components sensitive to infrared radiation, light, electromagnetic radiation of shorter wavelength or corpuscular radiation and specially adapted either for the conversion of the energy of such radiation into electrical energy or for the control of electrical energy by such radiation
    • H01L27/144 Devices controlled by radiation
    • H01L27/146 Imager structures
    • H01L27/14601 Structural or functional details thereof
    • H01L27/14625 Optical elements or arrangements associated with the device
    • H01L27/14627 Microlenses
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00 Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/10 Circuitry of solid-state image sensors [SSIS]; Control thereof for transforming different wavelengths into image signals
    • H04N25/11 Arrangement of colour filter arrays [CFA]; Filter mosaics
    • H04N25/13 Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements
    • H04N25/134 Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements based on three different wavelength filter elements

Definitions

  • The present technology relates to an imaging element, an imaging method, a program, and an electronic device, and in particular to an imaging element, an imaging method, a program, and an electronic device that detect focus with higher accuracy without degrading image quality.
  • Autofocus methods for digital cameras mainly include the contrast method and the phase difference method.
  • The contrast method moves the lens so that the point with the highest contrast is in focus.
  • Autofocus can be performed by reading part of the image from the image sensor, and no separate autofocus optical system is required.
  • The phase difference method uses a secondary imaging lens to split the image into two images and detects the amount of focus shift from the interval between them.
  • Images formed by light passing through different parts of the lens, for example light beams on the right and left sides of the lens, are used.
  • Image plane phase difference autofocus performs autofocus by the phase difference method using the image sensor itself.
  • The image sensor is provided with condensing microlenses, and an image sensor for phase difference autofocus can be obtained by adding a diaphragm member that restricts light incident on the microlenses (for example, Patent Document 1).
  • Since the phase difference method does not require moving the lens back and forth to find the focal position, high-speed autofocus can be realized.
  • However, the diaphragm member that restricts light incident on the microlens, for example a light shielding film, limits the amount of light incident on the image sensor, so sensitivity deteriorates. That is, in the image plane phase difference method, for example, the right and left light fluxes of the lens are used and part of the light incident on the lens is shielded, so sensitivity deteriorates.
  • In Patent Document 1, light passing through the left side of the lens is not incident on the right-side image sensor, and light passing through the right side of the lens is not incident on the left-side image sensor; a reflecting plate is used so that unnecessary light is reflected.
  • Furthermore, since some of the plurality of pixels arranged on the imaging surface are used as phase difference detection pixels, part of the captured image is lost and the image quality may deteriorate.
  • In addition, the oblique incident light for which the phase difference can be detected is limited to about 10 to 15 degrees, and the ability to handle a wider range of oblique incident light has been desired.
  • The present technology has been made in view of such a situation, and makes it possible to perform phase difference detection corresponding to a wide range of oblique incident light without degrading image quality.
  • An imaging element according to one aspect of the present technology is one in which two or more photoelectric conversion layers, each having a photoelectric conversion unit divided into pixel units, are stacked, and which is configured so that light incident on one pixel of the first photoelectric conversion layer, on the side close to the optical lens, is received by the photoelectric conversion unit of the second photoelectric conversion layer, on the side far from the optical lens; the phase difference is detected using the signal obtained by the photoelectric conversion unit of the second photoelectric conversion layer.
  • The second photoelectric conversion layer can be provided with a photoelectric conversion unit having three or more pixels.
  • The second photoelectric conversion layer may include a plurality of phase difference detection pixels for one on-chip lens provided in the first photoelectric conversion layer.
  • The second photoelectric conversion layer may include a plurality of phase difference detection pixels and detect the phase difference using two of them.
  • The phase difference can be detected using pixels set according to the chief ray angle among the plurality of phase difference detection pixels.
  • The phase difference can be detected using pixels set according to the optical lens among the plurality of phase difference detection pixels.
  • The phase difference can be detected using pixels set according to the image height of the optical lens among the plurality of phase difference detection pixels.
  • The phase difference can be detected using pixels set according to the focal length of the optical lens among the plurality of phase difference detection pixels.
  • The pixels used as phase difference detection pixels can be set with reference to a correspondence table in which information for identifying the optical lens is associated with the pixels to be used.
  • The pixels used as phase difference detection pixels can be set with reference to a correspondence table in which the image height of the optical lens is associated with the pixels to be used.
  • The region of the substrate through which light received by the pixels of the second photoelectric conversion layer passes can be formed of a material that does not have an absorption band in the visible light region.
  • The material can be a silicon oxide film or a silicon nitride film.
  • The photoelectric conversion unit of the second photoelectric conversion layer can receive light transmitted through a red on-chip color filter provided in the first photoelectric conversion layer.
  • The photoelectric conversion unit of the second photoelectric conversion layer can receive light transmitted through a white on-chip color filter provided in the first photoelectric conversion layer.
  • An imaging method according to one aspect of the present technology is for an imaging element in which two or more photoelectric conversion layers, each having a photoelectric conversion unit divided into pixel units, are stacked, and which is configured so that light incident on one pixel of the first photoelectric conversion layer, on the side close to the optical lens, is received by the photoelectric conversion unit of the second photoelectric conversion layer, on the side far from the optical lens. A plurality of phase difference detection pixels for detecting a phase difference are arranged in the second photoelectric conversion layer, and the method includes a step of setting, among the plurality of phase difference detection pixels, the phase difference detection pixels to be used according to the optical lens or the image height of the optical lens.
  • A program according to one aspect of the present technology controls imaging by an imaging element in which two or more photoelectric conversion layers, each having a photoelectric conversion unit divided into pixel units, are stacked, and which is configured so that light incident on one pixel of the first photoelectric conversion layer, on the side close to the optical lens, is received by the photoelectric conversion unit of the second photoelectric conversion layer, on the side far from the optical lens. A plurality of phase difference detection pixels for detecting a phase difference are arranged in the second photoelectric conversion layer, and the program causes a computer to execute a process including a step of setting, among the plurality of phase difference detection pixels, the phase difference detection pixels to be used according to the optical lens or the image height of the optical lens.
  • In the imaging element according to one aspect of the present technology, two or more photoelectric conversion layers, each having a photoelectric conversion unit divided into pixel units, are stacked; light incident on one pixel of the first photoelectric conversion layer, on the side close to the optical lens, is received by the photoelectric conversion unit of the second photoelectric conversion layer, on the side far from the optical lens; and the phase difference is detected using the signal obtained by the photoelectric conversion unit of the second photoelectric conversion layer.
  • In the imaging method and the program according to one aspect of the present technology, a plurality of phase difference detection pixels for detecting a phase difference are arranged in the second photoelectric conversion layer, and the phase difference detection pixels to be used are set among the plurality of phase difference detection pixels according to the optical lens or the image height of the optical lens.
  • An electronic apparatus according to one aspect of the present technology includes the imaging element.
  • According to one aspect of the present technology, phase difference detection corresponding to a wide range of oblique incident light can be performed without degrading image quality.
  • The present technology described below can be applied to an autofocus mechanism of, for example, a digital camera.
  • The present technology can be applied to the phase difference method, and in the following description, image plane phase difference autofocus is taken as an example.
  • Image plane phase difference autofocus can be applied to any electronic device that uses an imaging element in its image capture unit (photoelectric conversion unit), such as imaging devices like digital still cameras and video cameras, portable terminal devices having an imaging function such as mobile phones, and copiers that use an imaging element in an image reading unit.
  • FIG. 1 is a block diagram illustrating an example of the configuration of an electronic apparatus according to the present technology, for example an imaging apparatus.
  • An imaging apparatus 10 according to the present technology includes an optical system including a lens group 21 and the like, an imaging element (imaging device) 22, a DSP (Digital Signal Processor) circuit 23, a frame memory 24, a display unit 25, a recording unit 26, an operation unit 27, a power supply unit 28, and the like.
  • The DSP circuit 23, the frame memory 24, the display unit 25, the recording unit 26, the operation unit 27, and the power supply unit 28 are connected to each other via a bus line 29.
  • The lens group 21 takes in incident light (image light) from a subject and forms an image on the imaging surface of the imaging element 22.
  • The imaging element 22 converts the amount of incident light imaged on the imaging surface by the lens group 21 into an electrical signal for each pixel and outputs it as a pixel signal.
  • The DSP circuit 23 processes signals from the image sensor 22.
  • The image sensor 22 has pixels for detecting focus, and signals from such pixels are processed to detect the focal point.
  • The image sensor 22 also has pixels for constructing an image of the photographed subject, and processing such as developing signals from such pixels in the frame memory 24 is also performed.
  • The display unit 25 includes a panel type display device such as a liquid crystal display device or an organic EL (electroluminescence) display device, and displays moving images or still images captured by the image sensor 22.
  • The recording unit 26 records moving images or still images captured by the image sensor 22 on a recording medium such as a video tape or a DVD (Digital Versatile Disc).
  • The operation unit 27 issues operation commands for the various functions of the imaging apparatus under operation by the user.
  • The power supply unit 28 appropriately supplies the various power sources serving as operating power for the DSP circuit 23, the frame memory 24, the display unit 25, the recording unit 26, and the operation unit 27 to these supply targets.
  • The imaging device to which the present technology is applied has imaging elements in a plurality of layers, for example two layers; one layer has an imaging element for obtaining an image, and the other layer has an imaging element for performing autofocus by the phase difference method.
  • To compare an imaging apparatus to which the present technology is applied with a conventional imaging apparatus, a case where autofocus is realized by the image plane phase difference method in a conventional imaging apparatus will first be described.
  • In the conventional imaging apparatus, the imaging elements are arranged in a single layer, and imaging elements for obtaining an image and imaging elements for phase difference detection are mixed in that layer.
  • FIG. 2 is a diagram for explaining image plane phase difference autofocus.
  • A pixel array unit 51 in which pixels are two-dimensionally arranged in a matrix is provided on the imaging surface of the image sensor 22, and a predetermined number of pixels in the pixel array unit 51 are assigned as phase difference detection pixels.
  • A plurality of phase difference detection pixels are provided at predetermined positions in the pixel array unit 51.
  • The pixel array unit 51 shown in FIG. 2 is a part of the pixel array unit 51 constituting the image sensor 22.
  • In FIG. 2, four pixels are provided as phase difference detection pixels.
  • The phase difference detection pixels 61-1 and 61-2 are right-side light-shielded pixels, and the phase difference detection pixels 62-1 and 62-2 are left-side light-shielded pixels.
  • The right-side light-shielded pixel has a structure in which the right part is shielded from light, and the left-side light-shielded pixel has a structure in which the left part is shielded from light.
  • A right-side light-shielded pixel and a left-side light-shielded pixel are used as a pair of phase difference detection pixels.
  • Shielding the right side or the left side realizes the function of selectively receiving light according to its incident angle (hereinafter referred to as separation ability).
  • The phase difference detection pixels may be provided at positions separated from each other as shown in FIG. 2, or may be provided adjacent to each other.
  • FIG. 3 shows the structure of the right-side light-shielded pixel 61-1.
  • The pixel 61-1 has a configuration in which an on-chip lens 81, an on-chip color filter 82, a light shielding film 83, and a photodiode 84-1 are arranged from the upper side in the drawing.
  • The configuration of a pixel that is not a phase difference detection pixel (hereinafter referred to as a normal pixel 71) is shown on the left side of the figure.
  • The normal pixel 71 has a configuration in which an on-chip lens 81, an on-chip color filter 82, and a photodiode 84-2 are arranged; that is, the normal pixel 71 corresponds to the phase difference detection pixel 61 with the light shielding film 83 omitted.
  • With this structure, light coming from the left part of the lens group 21 and light coming from the right part can be separated and received.
  • the focus position can be detected as shown in FIG.
  • When the lens is out of focus, the output from the right-side light-shielded photodiode does not match the output from the left-side light-shielded photodiode (the outputs of the paired phase difference detection pixels do not match).
  • When the lens is in focus, the outputs from the two photodiodes match (the outputs of the paired phase difference detection pixels match).
  • When the focus position is detected by such a phase difference method, it can be detected at relatively high speed, and high-speed autofocus can be realized. However, the sensitivity may be lowered, and in a dark place, for example, it may be difficult to detect the focal position.
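  • As a rough illustration of this principle, the following Python sketch estimates the displacement between the output signals of the paired phase difference detection pixels by minimizing the sum of absolute differences; the function name, the SAD criterion, and the test data are illustrative assumptions, not an algorithm prescribed by this document.

```python
import numpy as np

def estimate_phase_shift(left_out, right_out, max_shift=8):
    """Estimate the relative displacement (in pixels) between the outputs of
    the left- and right-side phase difference detection pixel rows.
    A shift of 0 means the two outputs match, i.e. the lens is in focus."""
    best_shift, best_sad = 0, float("inf")
    for s in range(-max_shift, max_shift + 1):
        a = left_out[max(0, s):len(left_out) + min(0, s)]
        b = right_out[max(0, -s):len(right_out) + min(0, -s)]
        sad = float(np.abs(a - b).sum())  # sum of absolute differences
        if sad < best_sad:
            best_sad, best_shift = sad, s
    return best_shift  # magnitude and sign indicate the focus shift

row = np.sin(np.linspace(0, 6, 64))
print(estimate_phase_shift(row, row))              # 0: outputs match (in focus)
print(estimate_phase_shift(row, np.roll(row, 3)))  # non-zero: out of focus
```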
  • In addition, irregular reflection may affect normal pixels around the phase difference detection pixels.
  • Part of the incident light incident on the phase difference detection pixel 61-1 may strike the side surface of the light shielding film 83 and enter the photodiode 84-2 of the normal pixel 71. If such irregular reflection occurs at the light shielding film 83 and the irregularly reflected light enters the normal pixel 71, the characteristics of the normal pixel 71 may fluctuate and the image quality may deteriorate.
  • FIG. 5 is a graph showing the incident angle response of the image plane phase difference pixels, where the vertical axis represents the output value from the phase difference detection pixels 61 and 62 and the horizontal axis represents the incident angle. Referring to FIG. 5, there are peaks at incident angles of -10 degrees and +10 degrees. Thus, the phase difference detection pixels 61 and 62 produce the largest output for incident light with an incident angle of around ±10 degrees.
  • For incident light with a larger incident angle, for example 15 degrees or more, the output value becomes small, making it difficult to detect the phase difference.
  • Since increasing the chief ray angle (CRA) widens the range of incident angles of light incident on the phase difference detection pixels, it is necessary to be able to detect the phase difference over a wide range of incident angles.
  • By applying the phase difference detection pixels described below, it is possible to detect the phase difference over a wide range of incident angles, and to do so without deteriorating image quality.
  • FIG. 6 is a diagram illustrating an image sensor according to the present disclosure. As illustrated in FIG. 6, the imaging element 101 according to the present disclosure receives light from the subject 103 collected by the optical lens 102. The optical lens 102 corresponds to the lens group 21 of the imaging apparatus 10 of FIG. 1.
  • The image sensor 101 is, for example, a composite image sensor in which two semiconductor substrates 111A and 111B are stacked. In each of the semiconductor substrates 111A and 111B, a photoelectric conversion layer having a photoelectric conversion unit divided into pixel units is formed.
  • The semiconductor of the semiconductor substrates 111A and 111B is, for example, silicon (Si).
  • Hereinafter, the semiconductor substrate 111A closer to the optical lens 102 is referred to as the upper substrate 111A, and the semiconductor substrate 111B farther from the optical lens 102 is referred to as the lower substrate 111B. When the two semiconductor substrates 111A and 111B are not particularly distinguished, they are simply referred to as the substrate 111.
  • FIG. 7 is a diagram showing a schematic configuration of the upper substrate 111A and the lower substrate 111B of the image sensor 101. On the upper substrate 111A, a plurality of pixels, each formed with an on-chip lens 131, an R (red), G (green), or B (blue) on-chip color filter 132, and a photodiode 133, are arranged in a two-dimensional array.
  • On the lower substrate 111B, a phase difference detection pixel group 151 is formed in which a plurality of pixels are arranged corresponding to some of the pixel rows or pixel columns formed on the upper substrate 111A.
  • In the region of the lower substrate 111B where the phase difference detection pixel group 151 is not formed, a logic circuit including a signal processing circuit for processing signals detected by the upper substrate 111A or the lower substrate 111B may be formed.
  • The color filters formed in the pixels of the upper substrate 111A are arranged in a Bayer array, for example. Accordingly, the upper substrate 111A also functions as a color image sensor that outputs R, G, and B color signals.
  • From the upper substrate 111A, an image of the subject 103 (FIG. 6) is obtained. This image may be a color image, or an image obtained by receiving special light such as infrared rays; in that case, the on-chip color filter 132 is a filter suited to the special light.
  • Here, the on-chip color filter 132 is described taking a Bayer arrangement as an example, but the color arrangement of the on-chip color filter 132 is not limited to the Bayer arrangement, and other arrangement methods can be adopted.
  • Hereinafter, a pixel in which the R color filter is formed is referred to as an R pixel, a pixel in which the G color filter is formed as a G pixel, and a pixel in which the B color filter is formed as a B pixel.
  • The lower substrate 111B can also function as an image sensor that outputs a signal of any one of the R, G, and B colors, but here the description is continued assuming that its pixels function as phase difference detection pixels.
  • In the third layer, a substrate 171 on which logic circuits, memories, and the like are arranged is provided.
  • A logic circuit, a memory, and the like may also be disposed in the portion of the lower substrate 111B where the phase difference detection pixels are not disposed, or in a third layer on the substrate 171, or distributed between the lower substrate 111B and the substrate 171.
  • The imaging device 101 configured as described above acquires an image of the subject 103 with the first-layer upper substrate 111A, and detects the phase difference for focusing on the subject 103 with the second-layer lower substrate 111B.
  • In the imaging device 101 to which the present technology is applied, light that has passed through the pixels (sensors) on the upper substrate 111A is received by the pixels (sensors) arranged on the lower substrate 111B, and the phase difference is thereby detected.
  • Therefore, the light shielding film 83 (FIG. 3) constituting the conventional phase difference detection pixel is unnecessary, and by eliminating the light shielding film 83, it is possible to prevent the image quality from deteriorating due to the influence of irregularly reflected light.
  • In the imaging device 101, the pixels arranged on the lower substrate 111B are used as the phase difference detection pixels, and no image-capturing pixels are arranged around the phase difference detection pixels.
  • Therefore, an image that is not affected by irregularly reflected light can be captured, and the image quality can be improved.
  • In FIG. 7, phase difference detection pixels 152-1 and 152-2 are arranged as the phase difference detection pixel group 151 of the lower substrate 111B.
  • The phase difference is detected by two pixels that respectively receive light from the right side and light from the left side; therefore, in FIG. 7, the two phase difference detection pixels 152-1 and 152-2 are arranged.
  • The phase difference detection pixel 152-1 corresponds to the right-side light-shielded pixel 61 (FIG. 2), and the phase difference detection pixel 152-2 corresponds to the left-side light-shielded pixel 62.
  • A plurality of phase difference detection pixel groups 151, each composed of two pixels for detecting a phase difference, are arranged on the lower substrate 111B, and autofocus is performed by controlling the lens so that the phase difference detected by each phase difference detection pixel group 151 is minimized.
  • So far, one phase difference detection pixel group 151 composed of two phase difference detection pixels has been illustrated and described. However, the number is not limited to two; as shown in FIG. 8, one phase difference detection pixel group 151 may be composed of ten phase difference detection pixels 152-1 to 152-10.
  • In FIG. 8, ten phase difference detection pixels 152 are arranged, but the present technology is not limited to ten and can be applied to any number.
  • By arranging a plurality of pixels, the range of incident angles of light for which the phase difference can be detected can be widened; to widen this range, it is preferable to arrange three or more pixels.
  • FIG. 9 is a cross-sectional view of the imaging element 101.
  • In the first layer, an upper substrate 111A on which an on-chip lens 131, an on-chip color filter 132, and a photodiode 133 are disposed is provided.
  • The photoelectric conversion layer on which the photodiode 133 is disposed may be formed of an organic photoelectric conversion film or the like; when formed of an organic photoelectric conversion film, it may be composed of a plurality of photoelectric conversion layers.
  • Here, an upper substrate 111A of the backside illumination type is described as an example, but the present technology can be applied to either the backside illumination type or the frontside illumination type.
  • A multilayer wiring layer 201 including a plurality of wiring layers and an interlayer insulating film is formed below the photodiode 133 of the upper substrate 111A.
  • The multilayer wiring layer 201 may be configured such that a plurality of transistor circuits constituting a readout circuit for reading the signal charges accumulated in the photodiode 133 are formed near the upper substrate 111A.
  • The lower substrate 111B can be configured by a silicon layer which is an n-type (first conductivity type) semiconductor region, with a photodiode serving as a photoelectric conversion unit formed for each pixel by a pn junction in the silicon layer.
  • The pixel size of the lower substrate 111B may be different from that of the pixels of the upper substrate 111A.
  • A multilayer wiring layer including a plurality of transistor circuits constituting a readout circuit for reading out the signal charges accumulated in the photodiode, a plurality of wiring layers, an interlayer insulating film, and the like may be formed on the upper side or the lower side of the lower substrate 111B.
  • The upper substrate 111A and the lower substrate 111B are bonded (adhered) to each other via the protective film 202.
  • On the lower substrate 111B, phase difference detection pixels 152-1 to 152-10 are arranged, each of which is a photodiode.
  • The phase difference detection pixels 152-1 to 152-10 receive light transmitted through one on-chip lens 131.
  • That is, the light transmitted through one on-chip lens 131 provided in the upper layer is received by a plurality of photodiodes provided in the lower layer, and the phase difference is thereby detected.
  • The wiring layers of the multilayer wiring layer 201 include wiring composed of copper or the like, and also function as a light shielding film and a diaphragm. In FIG. 9, when the on-chip lens 131 shown at the center of the figure is denoted the on-chip lens 131A, the light transmitted through the on-chip lens 131A is configured to be received by one of the phase difference detection pixels 152-1 to 152-10.
  • Light transmitted through an on-chip lens 131 other than the on-chip lens 131A, for example the on-chip lens 131B adjacent to the right of the on-chip lens 131A, is blocked by the multilayer wiring layer 201 and is configured not to be received by the phase difference detection pixels 152-1 to 152-10.
  • That is, the multilayer wiring layer 201 blocks light so that the light transmitted through one on-chip lens 131 is received by the phase difference detection pixel group 151 of the lower substrate 111B.
  • A light shielding film may also be provided separately from the multilayer wiring layer 201.
  • One on-chip lens 131 may be arranged across a plurality of colors, for example across the colors (R, G, G, B) corresponding to 2 × 2 (four) pixels, or across a plurality of photodiodes 133.
  • Even in such a case, the light transmitted through one on-chip lens 131 is configured to be received by the phase difference detection pixel group 151 on the lower substrate 111B.
  • When the light transmitted through one on-chip lens 131 is received by the phase difference detection pixel group 151 and the phase difference is detected, a plurality of pixels in the phase difference detection pixel group 151, for example two pixels, are used to detect the phase difference. This phase difference detection will now be described further.
  • FIG. 10 is a diagram illustrating an example of the pixels used for phase difference detection.
  • For example, the phase difference detection pixel 152-3 and the phase difference detection pixel 152-4, circled in FIG. 10, are selected as the phase difference detection pixels, the phase difference is detected from the amounts of light received by these two pixels, and autofocus is executed.
  • Depending on the conditions, a different pair may be selected to detect the phase difference; for example, the phase difference detection pixel 152-4 and the phase difference detection pixel 152-5 may be selected, or the phase difference detection pixel 152-2 and the phase difference detection pixel 152-3 may be selected.
  • The phase difference detection pixels 152 to be selected may be determined according to the image height of the lens, as will be described later.
  • Since the chief ray angle may vary depending on the type of lens, for example whether it is a telephoto lens or a wide-angle lens, the phase difference detection pixels 152 to be selected may be determined according to the lens type. Telephoto lenses with a long focal length tend to have a small chief ray angle, and wide-angle lenses with a short focal length tend to have a large chief ray angle.
  • When the chief ray angle is small, for example, the phase difference detection pixel 152-5 and the phase difference detection pixel 152-6, arranged near the center of the phase difference detection pixel group 151, are selected, and the phase difference can be detected by these two pixels.
  • When the chief ray angle is large, the phase difference detection pixel 152-1 and the phase difference detection pixel 152-2, arranged near the end of the phase difference detection pixel group 151, are selected, and the phase difference can be detected by these two pixels.
  • In other words, the phase difference detection pixels 152 may be selected in accordance with the focal length, such that on the telephoto lens side the phase difference detection pixel 152-5 and the phase difference detection pixel 152-6 are selected, and on the wide-angle lens side the phase difference detection pixel 152-1 and the phase difference detection pixel 152-2 are selected.
  • As described above, the light transmitted through one on-chip lens 131 in the upper layer is received by each of the phase difference detection pixels 152 in the lower-layer phase difference detection pixel group 151, and the phase difference is detected from the amounts of light received by two of them.
  • Alternatively, light transmitted through different on-chip lenses 131 in the upper layer may be received by the phase difference detection pixels 152 of the lower-layer phase difference detection pixel groups 151 arranged at positions corresponding to those on-chip lenses 131, and the phase difference may be detected from the amount of light received by one pixel in each group.
  • The two pixels that detect the phase difference may be two adjacent pixels that receive light transmitted through one on-chip lens 131, or two pixels at different positions that receive light transmitted through on-chip lenses 131 arranged at different positions.
  • In the above example, two adjacent pixels out of the ten phase difference detection pixels 152 are selected, but two pixels that are not adjacent to each other may also be selected. For example, referring to FIG. 14 again, pixels arranged at distant positions, such as the phase difference detection pixel 152-1 and the phase difference detection pixel 152-3, may be selected.
  • The phase difference detection pixels 152 used to obtain the phase difference may be selected depending on whether the separation ratio characteristic or the sensitivity ratio characteristic has priority. For example, when priority is given to the separation ratio characteristic, two phase difference detection pixels 152 at distant positions are selected from the ten phase difference detection pixels 152, and when priority is given to the sensitivity ratio characteristic, two adjacent phase difference detection pixels 152 are selected.
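  • The following sketch illustrates, under stated assumptions, how such a priority could drive the choice of two pixels out of a group of ten; the function name and the specific index choices are hypothetical, and only mirror the distant-pair versus adjacent-pair rule described above.

```python
def select_pair(num_pixels=10, priority="separation"):
    """Choose which two of the group's pixels to pair (0-based indices).
    'separation'  -> two distant pixels (better separation ratio),
    'sensitivity' -> two adjacent pixels near the centre (better sensitivity ratio)."""
    if priority == "separation":
        return 0, num_pixels - 1      # e.g. pixels 152-1 and 152-10
    center = num_pixels // 2
    return center - 1, center         # e.g. pixels 152-5 and 152-6

print(select_pair(priority="separation"))   # (0, 9)
print(select_pair(priority="sensitivity"))  # (4, 5)
```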
  • Here, a case where a correspondence table is held on the camera side or the lens side and the phase difference detection pixels are selected by referring to the correspondence table will be described as an example.
  • As for holding the correspondence table on the camera side or the lens side: in the imaging apparatus 10 illustrated in FIG. 1, for example, the lens group 21 (corresponding to the optical lens 102 in FIG. 6) is detachable from the imaging apparatus 10, and the correspondence table is held in a storage unit (not shown) of the lens group 21 (optical lens 102) or of the main body of the imaging apparatus 10.
  • When the lens group 21 is not detachable from the imaging apparatus 10 but is integrated with it, the correspondence table is stored in the storage unit of the imaging apparatus 10 and referred to as appropriate.
  • Here, a case where the lens group 21 is detachable from the imaging apparatus 10 will be described as an example. Although referred to as the lens group 21, the number of lenses may be one.
  • Such a correspondence table is created based on the processing of the flowchart shown in the figure. It is created, for example, when the lens group 21 is manufactured or when the imaging apparatus 10 is manufactured. Alternatively, when the lens group 21 is attached to the imaging apparatus 10 for the first time, a correspondence table for that lens group 21 may be created.
  • In step S11, information on the chief ray angle of each lens is acquired. For example, when there are ten lens groups 21 that can be attached to the imaging apparatus 10, information on the chief ray angles of those ten lens groups 21 is acquired, together with information for identifying each lens group 21.
  • In step S12, information on the phase difference detection pixels corresponding to each chief ray angle is acquired.
  • For example, when the chief ray angle is large, as described with reference to FIG. 14, information such as using the phase difference detection pixel 152-1 and the phase difference detection pixel 152-2 as the phase difference detection pixels is obtained.
  • The information on the phase difference detection pixels corresponding to each chief ray angle is acquired by performing measurements or the like in advance.
  • In step S13, the correspondence table is created.
  • The correspondence table is, for example, a table in which information for identifying the lens group 21, chief ray angle information, and phase difference detection pixel information are associated with each other.
  • A correspondence table in which only the information for identifying the lens group 21 and the information on the phase difference detection pixels are associated may also be used.
  • The correspondence table created in this way is stored in the lens group 21 or the imaging apparatus 10 in step S14.
  • When the correspondence table is stored in the lens group 21, a table in which at least the information for identifying that lens group 21 and the information on the phase difference detection pixels are associated with each other may be stored.
  • When the correspondence table is stored in the imaging apparatus 10, a table in which at least the information for identifying a plurality of lens groups 21 and the information on the phase difference detection pixels associated with each lens group 21 may be stored.
  • That is, when the correspondence table is stored in the lens group 21, it holds information about that lens group 21, and when it is stored in the imaging apparatus 10, it can hold information about a plurality of lens groups 21. It is also possible to provide a mechanism for updating a stored correspondence table via a network or the like.
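  • A minimal sketch of building such a correspondence table (steps S11 to S14) is shown below; the lens identifiers, the angle thresholds, and the pixel pairs are illustrative assumptions, not values from this document.

```python
def pixels_for_cra(cra_deg):
    """Step S12: the pixel pair measured in advance as suitable for a given
    chief ray angle (the thresholds here are assumed for illustration)."""
    if cra_deg >= 25:
        return ("152-1", "152-2")  # large CRA: pixels near the end of the group
    if cra_deg >= 10:
        return ("152-3", "152-4")  # medium CRA
    return ("152-5", "152-6")      # small CRA: pixels near the centre

def create_correspondence_table(lenses):
    """Step S11 supplies `lenses` (lens id -> chief ray angle); step S13
    builds the table; storing the result corresponds to step S14."""
    return {lens_id: (cra, pixels_for_cra(cra)) for lens_id, cra in lenses.items()}

table = create_correspondence_table({"tele-200mm": 5, "wide-24mm": 30})
```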
  • When the lens group 21 is attached to the imaging apparatus 10 in step S31, the imaging apparatus 10 recognizes the attached lens group 21 in step S32. In step S33, the imaging apparatus 10 reads the correspondence table (information) for the attached lens group 21.
  • When the correspondence table is stored in the lens group 21, the correspondence table is supplied from the lens group 21 to the imaging apparatus 10 when the lens group 21 is attached; alternatively, the imaging apparatus 10 accesses the lens group 21 and reads the correspondence table.
  • When the correspondence table is stored in the imaging apparatus 10, information for identifying the lens group 21 is supplied from the lens group 21 to the imaging apparatus 10 when the lens group 21 is attached.
  • The imaging apparatus 10 then reads, from the stored correspondence tables, the correspondence table for the attached lens group 21 based on the identification information supplied from the lens group 21.
  • In step S34, the phase difference detection pixels are selected and set based on the read correspondence table.
  • The phase difference is then detected by the set phase difference detection pixels, and autofocus is executed.
  • When the lens group 21 is replaced, the same processing is performed on the newly attached lens group 21, thereby setting phase difference detection pixels corresponding to it.
  • Since the phase difference detection pixels are set corresponding to the lens group 21 in this way, appropriate phase difference detection pixels can be selected according to the chief ray angle, which differs from lens to lens, and autofocus can be executed.
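  • The lens-attachment flow (steps S31 to S34) might then look like the following sketch, which reuses the hypothetical `table` from the previous example; the class and method names are assumptions.

```python
class Camera:
    def __init__(self, correspondence_table):
        self.table = correspondence_table  # held on the camera (or lens) side
        self.active_pair = None

    def on_lens_attached(self, lens_id):
        """S32: recognise the attached lens; S33: read its table entry;
        S34: set the phase difference detection pixels to use."""
        cra, pair = self.table[lens_id]
        self.active_pair = pair
        return pair

cam = Camera(table)
print(cam.on_lens_attached("wide-24mm"))  # -> ('152-1', '152-2')
```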
  • Next, selection of the phase difference detection pixels according to the image height will be described.
  • The chief ray angle may vary depending on the image height of the lens; therefore, the phase difference detection pixels may be selected according to the image height.
  • The upper diagram of FIG. 17 is a graph representing the chief ray angle characteristic, the middle diagram represents the captured image, and the lower diagram is a graph representing the phase difference characteristic.
  • The horizontal axis of the graph representing the chief ray angle characteristic, shown in the upper diagram of FIG. 17, represents the image height, and the vertical axis represents the chief ray angle (CRA). It can be seen from this graph that the chief ray angle increases as the image height increases.
  • The middle diagram of FIG. 17 represents the image to be captured. In the image, a position with an image height of 100%, a position with an image height of 60%, and a position with an image height of 0% are each indicated by a square and connected by a line to the corresponding portion of the graph representing the chief ray angle characteristic.
  • The horizontal axis of the graph showing the phase difference characteristic, shown in the lower diagram of FIG. 17, represents the incident angle of the light beam, and the vertical axis represents the sensitivity characteristic (Pabs total).
  • The numbers in the figure correspond to the phase difference detection pixels 152-1 to 152-10 (for example, FIG. 14); for example, "1" represents the phase difference detection pixel 152-1.
  • The pixel from which the peak value is obtained differs depending on the incident angle of the light beam. For example, when the incident angle of the light beam is 0 degrees, the phase difference detection pixel 152-5 has the highest peak value, and when the incident angle is -15 degrees, the phase difference detection pixel 152-4 has the highest peak value.
  • Therefore, when the incident angle of the light beam is 0 degrees, the phase difference detection pixel 152-5, which gives the highest peak value, is suitable as the phase difference detection pixel, and when the incident angle is -15 degrees, the phase difference detection pixel 152-4 is suitable.
  • For example, the upper left of the image has an image height of 100%, and for such a portion it can be read from the upper diagram that the chief ray angle is large (about 30 degrees in the upper diagram).
  • In the phase difference detection pixel group 151 located where the imaging element captures such a portion with high image height and a large chief ray angle, for example, the phase difference detection pixel 152-2 is used as the phase difference detection pixel.
  • When the phase difference detection pixel 152-2 is used, the adjacent phase difference detection pixel 152-1 or phase difference detection pixel 152-3 is combined with the phase difference detection pixel 152-2 and used as the pair of phase difference detection pixels.
  • The image height near the center of the image is 0%, and for such a portion it can be read from the upper diagram that the chief ray angle is small (about 0 degrees in the upper diagram).
  • In the phase difference detection pixel group 151 located where the imaging element captures such a portion with low image height and a small chief ray angle, for example, the phase difference detection pixel 152-5 is used as the phase difference detection pixel.
  • When the phase difference detection pixel 152-5 is used, the adjacent phase difference detection pixel 152-4 or phase difference detection pixel 152-6 is combined with the phase difference detection pixel 152-5 and used as the pair of phase difference detection pixels.
  • Between the upper side and the center of the image, the image height is 60%, and from the upper diagram the chief ray angle can be read as medium (about 25 degrees in the upper diagram).
  • In the phase difference detection pixel group 151 located where the imaging element captures such a portion with medium image height and a medium chief ray angle, the phase difference detection pixel 152-3 is used as the phase difference detection pixel.
  • When the phase difference detection pixel 152-3 is used, the adjacent phase difference detection pixel 152-2 or phase difference detection pixel 152-4 is combined with the phase difference detection pixel 152-3 and used as the pair of phase difference detection pixels.
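  • The mapping just described for FIG. 17 can be summarized as in the sketch below; the chief ray angles at 100%, 60%, and 0% image height and the resulting pixel choices are the values quoted above, while the thresholds and the lookup function itself are illustrative assumptions.

```python
# Chief ray angle read off the upper diagram of FIG. 17 (values quoted above).
CRA_BY_HEIGHT = {100: 30.0, 60: 25.0, 0: 0.0}

# Assumed CRA bands and the pixel chosen for each, per the text above.
PIXEL_BY_CRA = [(27.5, "152-2"), (12.5, "152-3"), (0.0, "152-5")]

def pixel_for_image_height(height_pct):
    cra = CRA_BY_HEIGHT[height_pct]
    for threshold, pixel in PIXEL_BY_CRA:
        if cra >= threshold:
            return pixel

for h in (100, 60, 0):
    print(h, pixel_for_image_height(h))  # 100 -> 152-2, 60 -> 152-3, 0 -> 152-5
```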
  • The example shown in FIG. 17 is a case where the maximum value of the chief ray angle is 30 degrees.
  • Even when a lens with a different maximum chief ray angle is used, the phase difference detection pixels can be selected in the same manner, as described next with reference to FIG. 18.
  • The upper, middle, and lower diagrams of FIG. 18 are, like those of FIG. 17, a graph representing the chief ray angle characteristic, a diagram of the image, and a graph representing the phase difference characteristic, respectively.
  • In FIG. 18, the phase difference detection pixel 152-4 is used as the phase difference detection pixel at an image height of 100%.
  • The vicinity of the center of the image, in other words the vicinity of the center of the lens of the lens group 21, has an image height of 0%; there, the phase difference detection pixel 152-5 is used as the phase difference detection pixel.
  • By contrast, the phase difference detection pixel used at an image height of 100% with the lens group 21 of FIG. 17, whose maximum chief ray angle is 30 degrees, is the phase difference detection pixel 152-2.
  • Thus, when the lens group 21 (optical lens) differs, the phase difference detection pixels 152 used within the phase difference detection pixel group 151 may differ even at the same image height.
  • Different phase difference detection pixels 152 may be set for portions with different image heights within a given lens group 21. Further, when the lens group 21 is exchanged for a different one, phase difference detection pixels 152 suitable for the newly attached lens group 21 may be set.
  • As described above, phase difference detection pixels corresponding to the lens group 21 are set when the lens group 21 is attached.
  • The phase difference detection pixels set at this time may be the same phase difference detection pixels 152 regardless of the portion of the image, that is, regardless of the image height.
  • For example, if the correspondence table describes that the phase difference detection pixel 152-4 and the phase difference detection pixel 152-5 are used as the phase difference detection pixels, then regardless of the image height, the phase difference detection pixel 152-4 and the phase difference detection pixel 152-5 are used in every portion of the image.
  • In this case, the correspondence table describes the lens group 21 and the phase difference detection pixels to be used in a one-to-one correspondence.
  • Alternatively, when the lens group 21 is attached and the phase difference detection pixels 152 corresponding to it are set, different phase difference detection pixels 152 may be set depending on the portion of the image, that is, on the image height.
  • For example, suppose the correspondence table describes that at an image height of 100% the phase difference detection pixel 152-2 and the phase difference detection pixel 152-3 are used, and at an image height of 0% the phase difference detection pixel 152-4 and the phase difference detection pixel 152-5 are used. Then the phase difference detection pixels 152-2 and 152-3 are used as the phase difference detection pixels in portions at 100% image height, and the phase difference detection pixels 152-4 and 152-5 in portions at 0% image height.
  • In this case, the correspondence table describes the lens group 21 and the phase difference detection pixels to be used in a one-to-multiple correspondence.
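  • The two table shapes described above can be sketched as follows; the lens identifiers and pixel pairs are illustrative assumptions.

```python
# One-to-one: a lens maps to a single pixel pair used at every image height.
TABLE_ONE_TO_ONE = {"lens-A": ("152-4", "152-5")}

# One-to-multiple: a lens maps to a pixel pair per image height.
TABLE_ONE_TO_MANY = {"lens-B": {100: ("152-2", "152-3"), 0: ("152-4", "152-5")}}

def lookup(table, lens_id, image_height=None):
    entry = table[lens_id]
    if isinstance(entry, dict):        # one-to-multiple: pick by image height
        return entry[image_height]
    return entry                       # one-to-one: same pair everywhere

print(lookup(TABLE_ONE_TO_ONE, "lens-A"))                     # ('152-4', '152-5')
print(lookup(TABLE_ONE_TO_MANY, "lens-B", image_height=100))  # ('152-2', '152-3')
```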
  • It is also possible to set the phase difference detection pixels 152 according to the image height alone, regardless of the lens group 21.
  • By setting the phase difference detection pixels to be used for each lens group 21, phase difference detection using pixels appropriate to each lens can be performed, and autofocus can be executed with high accuracy.
  • By setting the phase difference detection pixels to be used for each image height, phase difference detection using pixels appropriate to each image height can be performed, and autofocus can be executed with high accuracy.
  • By setting the phase difference detection pixels to be used for each lens group 21 and for each image height, phase difference detection using pixels appropriate to both the lens and the image height can be performed, and autofocus can be executed with high accuracy.
  • In the embodiment described above, the upper substrate 111A and the lower substrate 111B are provided, and the phase difference detection pixels are provided on the lower substrate 111B.
  • An example has been described in which the upper substrate 111A is provided in the first layer, the lower substrate 111B in the second layer, and the substrate 171 provided with circuits and the like in the third layer.
  • However, the application of the present technology is not limited to such a configuration.
  • For example, the present technology can also be applied to a configuration as described below.
  • FIG. 19 is a diagram showing another configuration of the image sensor.
  • In the image sensor shown in FIG. 19, the upper substrate 111A is disposed in the first layer, the substrate 171 is disposed in the second layer, and the lower substrate 111B is disposed in the third layer.
  • In this case, the light transmitted through the predetermined pixel of the first-layer upper substrate 111A further passes through the second-layer substrate 171 and is received by the third-layer phase difference detection pixel 152.
  • The predetermined pixel of the first-layer upper substrate 111A, through which the light received by the third-layer phase difference detection pixel 152 passes, can be provided with a red on-chip color filter 132.
  • Red light has a lower rate of absorption by the second-layer substrate 171 than blue or green light, so characteristic deterioration due to absorption by the second-layer substrate 171 can be minimized.
  • When the second-layer substrate 171 is made of silicon, silicon has the characteristic that light on the longer-wavelength side passes through more easily.
  • Since the red on-chip color filter 132 transmits a long-wavelength region such as red light, the light transmitted through the red on-chip color filter 132 has a long wavelength. For this reason, by setting the on-chip color filter 132 of the predetermined pixel on the first-layer upper substrate 111A to red and configuring the light transmitted through that pixel to be received by the phase difference detection pixel 152 arranged on the third-layer lower substrate 111B, absorption of the light by the second-layer silicon substrate 171 and the resulting deterioration of characteristics can be suppressed.
  • A hole 172 may also be provided in the region of the substrate 171 through which the light passes.
  • By providing the hole 172 in the substrate 171, the light transmitted through the first-layer upper substrate 111A is prevented from being absorbed by the second-layer substrate 171 and can be received by the phase difference detection pixel 152 provided in the third layer.
  • The hole 172 may be physically provided in the substrate 171.
  • Alternatively, a part of the silicon substrate 171 may be replaced with a material that has almost no absorption band in the visible light region, such as a silicon oxide film or a silicon nitride film, and that part may serve as the hole 172.
  • The color of the on-chip color filter 132 may be a color other than red; for example, as shown in FIG. 21, it may be white (clear).
  • the on-chip color filter 132 shown in FIG. 21 is a filter in which white (W) is arranged in addition to red (R), green (G), and blue (B).
  • A white pixel functions as a pixel whose spectral sensitivity covers all colors, while the red, green, and blue pixels function as pixels with the spectral sensitivities of their respective colors.
  • The present technology can also be applied to an image sensor in which these four types of spectral sensitivity pixels, including the all-color white pixel, are arranged on the imaging surface.
  • Light transmitted through a chromatic (colored) pixel may also be received by the phase difference detection pixel 152.
  • FIG. 22 is a diagram illustrating another configuration of the image sensor 101.
  • The image sensor 101 shown in FIG. 22 has basically the same configuration as the image sensor 101 shown in FIG. 20, but predetermined pixels (indicated by hatching in the figure) of the first-layer upper substrate 111A are made of a material different from that of the other pixels, for example a material having almost no absorption band in the visible light region, such as a silicon oxide film or a silicon nitride film.
  • Light transmitted through this portion made of a material having almost no absorption band in the visible light region is received by the phase difference detection pixel 152.
  • In this case, the color of the on-chip color filter above such a pixel can be green (G).
  • Silicon absorbs most green light, so if the first layer were made of silicon, almost no green light would reach the lower layers. However, by providing, as described above, a first-layer portion made of a material having almost no absorption band in the visible light region and receiving the light transmitted through that portion with the third-layer phase difference detection pixel 152, even green light can reach the phase difference detection pixel 152 in the lower layer.
  • In the on-chip color filter 132 arranged on the upper substrate 111A, the color above the pixel that transmits the light received by the phase difference detection pixel 152 may instead be blue.
  • It may also be red, in which case the sensitivity can be further improved.
  • In the example illustrated, the hole 172 is provided in the second-layer substrate 171; however, because the amount of light absorbed in the first layer is reduced, light can reach the third-layer phase difference detection pixel 152 even without it, so the second-layer hole 172 may be omitted.
  • The color above the pixel that transmits the light received by the phase difference detection pixel 152 may also be white (transparent), which makes it possible to further improve the sensitivity.
  • By making the on-chip color filter 132 of the pixel that transmits light to the lower layers white (transparent), light over the entire visible band passes through the on-chip color filter 132. Even when such full-band visible light is partially absorbed by silicon, a relatively large amount of the long-wavelength component still passes through. Therefore, as shown in FIG. 24, a hole 172 may be provided in the substrate 171, but a configuration without such a hole 172 is also possible.
  • The configurations described above are examples and are not limiting; configurations combining the structures described above are also possible.
  • In the description above, the pixels used for phase difference detection receive only light transmitted through a single on-chip lens 131; it is also possible to perform phase difference detection using a combination with pixels that receive light transmitted through different on-chip lenses 131. The sketch below shows one way a phase difference could be computed from two such pixel signal sequences.
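  • As a minimal sketch of one common approach (an assumption for illustration, not the patent's stated algorithm), the shift between two phase difference pixel signal sequences can be estimated by minimizing the mean absolute difference over the overlapping samples:

        def estimate_phase_difference(left, right, max_shift=4):
            """Return the integer shift of `right` that best aligns it with `left`."""
            n = len(left)
            best_shift, best_score = 0, float("inf")
            # max_shift should stay small relative to the window so that the
            # overlap at extreme shifts remains meaningful.
            for s in range(-max_shift, max_shift + 1):
                # Compare only the samples that overlap for this candidate shift.
                pairs = [(left[i], right[i + s]) for i in range(n) if 0 <= i + s < n]
                # Mean absolute difference keeps scores comparable across
                # different overlap lengths.
                score = sum(abs(a - b) for a, b in pairs) / len(pairs)
                if score < best_score:
                    best_score, best_shift = score, s
            return best_shift

        # Example: `right` is `left` displaced by two samples, so the estimate
        # is 2; the magnitude and sign of this shift map to defocus for autofocus.
        left = [0, 0, 10, 50, 90, 50, 10, 0, 0, 0]
        right = [0, 0, 0, 0, 10, 50, 90, 50, 10, 0]
        print(estimate_phase_difference(left, right))  # -> 2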
  • An image pickup element using a silicon semiconductor has been described as an example, but an imaging element whose photodiode or photoelectric conversion portion is made of a substance containing a GaAs-based, InGaAs-based, PbS, PbSe, Ge-based, InAs, InSb, or HgCdTe material, or a substance containing a GaN, InGaN, or AlGaN material, may also be used.
  • As described above, the phase difference detection pixel to be used can be changed according to the chief ray angle, making it possible to handle a wide range of chief ray angles.
  • the series of processes described above can be executed by hardware or can be executed by software.
  • When the series of processes is executed by software, a program constituting the software is installed in a computer.
  • Here, the computer includes a computer incorporated in dedicated hardware and, for example, a general-purpose personal computer capable of executing various functions when various programs are installed.
  • FIG. 25 is a block diagram illustrating an example of a hardware configuration of a computer that executes the above-described series of processes using a program.
  • In the computer, a CPU (Central Processing Unit) 301, a ROM (Read Only Memory) 302, and a RAM (Random Access Memory) 303 are connected to one another by a bus 304.
  • An input / output interface 305 is further connected to the bus 304.
  • An input unit 306, an output unit 307, a storage unit 308, a communication unit 309, and a drive 310 are connected to the input / output interface 305.
  • the input unit 306 includes a keyboard, a mouse, a microphone, and the like.
  • the output unit 307 includes a display, a speaker, and the like.
  • the storage unit 308 includes a hard disk, a nonvolatile memory, and the like.
  • the communication unit 309 includes a network interface and the like.
  • the drive 310 drives a removable medium 311 such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory.
  • In the computer configured as described above, the CPU 301 loads the program stored in the storage unit 308 into the RAM 303 via the input/output interface 305 and the bus 304 and executes it, whereby the series of processes described above is performed.
  • The program executed by the computer (CPU 301) can be provided by being recorded on the removable medium 311 as packaged media or the like, for example.
  • the program can be provided via a wired or wireless transmission medium such as a local area network, the Internet, or digital satellite broadcasting.
  • the program can be installed in the storage unit 308 via the input / output interface 305 by attaching the removable medium 311 to the drive 310. Further, the program can be received by the communication unit 309 via a wired or wireless transmission medium and installed in the storage unit 308. In addition, the program can be installed in the ROM 302 or the storage unit 308 in advance.
  • The program executed by the computer may be a program whose processing is performed in time series in the order described in this specification, or a program whose processing is performed in parallel or at a necessary timing, such as when a call is made.
  • FIG. 26 is a diagram illustrating usage examples of the above-described imaging device and of electronic devices including the imaging device.
  • the imaging device described above can be used in various cases for sensing light such as visible light, infrared light, ultraviolet light, and X-rays as follows.
  • Devices for taking images for viewing such as digital cameras and mobile devices with camera functions
  • Devices used for traffic such as in-vehicle sensors that capture the back, surroundings, and interiors of vehicles, surveillance cameras that monitor traveling vehicles and roads, and ranging sensors that measure distances between vehicles, etc.
  • Devices used for home appliances, such as TVs, refrigerators, and air conditioners, to photograph a user's gesture and operate the appliance according to that gesture
  • Devices used for medical and health care, such as endoscopes and devices that perform blood vessel photography by receiving infrared light
  • Devices used for security, such as surveillance cameras for crime prevention and cameras for personal authentication
  • Devices used for beauty care, such as skin measuring instruments for photographing the skin and microscopes for photographing the scalp
  • Devices used for sports, such as action cameras and wearable cameras for sports applications
  • Devices used for agriculture, such as cameras for monitoring the condition of fields and crops
  • In this specification, the term "system" refers to an entire apparatus composed of a plurality of apparatuses.
  • In addition, the present technology can also take the following configurations.
  • (1) An image sensor in which photoelectric conversion layers each having a photoelectric conversion unit divided into pixel units are laminated in two or more layers, the image sensor being configured so that light incident on one pixel of the first photoelectric conversion layer on the side close to an optical lens is received by the photoelectric conversion unit of the second photoelectric conversion layer on the side far from the optical lens, and detecting a phase difference using a signal obtained by the photoelectric conversion unit of the second photoelectric conversion layer.
  • (2) The imaging device according to (1), wherein the second photoelectric conversion layer includes the photoelectric conversion unit corresponding to three or more pixels.
  • (3) The imaging device according to (1) or (2), wherein the second photoelectric conversion layer includes a plurality of phase difference detection pixels for one of the on-chip lenses provided in the first photoelectric conversion layer.
  • (4) The imaging device according to any one of (1) to (3), wherein the second photoelectric conversion layer includes a plurality of phase difference detection pixels, and a phase difference is detected using two of the plurality of phase difference detection pixels.
  • (5) The imaging device according to (4), wherein a phase difference is detected using pixels set according to a chief ray angle among the plurality of phase difference detection pixels.
  • (6) The imaging device according to (4), wherein a phase difference is detected using pixels set according to the optical lens among the plurality of phase difference detection pixels.
  • (7) The imaging device according to (4), wherein a phase difference is detected using pixels set according to an image height of the optical lens among the plurality of phase difference detection pixels.
  • (8) The imaging device according to any one of (1) to (7), wherein a pixel used as a phase difference detection pixel is set with reference to a correspondence table in which a chief ray angle and a pixel to be used are associated with each other.
  • (9) The imaging device according to any one of (1) to (8), wherein a pixel used as a phase difference detection pixel is set with reference to a correspondence table in which information for identifying the optical lens and a pixel to be used are associated with each other.
  • (10) The imaging device according to any one of (1) to (9), wherein a pixel used as a phase difference detection pixel is set with reference to a correspondence table in which an image height of the optical lens and a pixel to be used are associated with each other.
  • (11) The imaging device according to any one of (1) to (10), further including at least one substrate between the first photoelectric conversion layer and the second photoelectric conversion layer.
  • (12) The imaging device according to (11), wherein a region of the substrate through which, of the light incident on the first photoelectric conversion layer, the light received by the pixels of the second photoelectric conversion layer passes is formed of a material that does not have an absorption band in the visible light region.
  • (13) The imaging device according to (12), wherein the material is a silicon oxide film or a silicon nitride film.
  • (14) The imaging device according to any one of (1) to (13), wherein the photoelectric conversion unit of the second photoelectric conversion layer receives light from a red on-chip color filter provided in the first photoelectric conversion layer.
  • (15) The imaging device according to any one of (1) to (14), wherein the photoelectric conversion unit of the second photoelectric conversion layer receives light from a white on-chip color filter provided in the first photoelectric conversion layer.
  • (16) An imaging method of an image sensor in which photoelectric conversion layers each having a photoelectric conversion unit divided into pixel units are laminated in two or more layers, the image sensor being configured so that light incident on one pixel of the first photoelectric conversion layer on the side close to an optical lens is received by the photoelectric conversion unit of the second photoelectric conversion layer on the side far from the optical lens, a plurality of phase difference detection pixels for detecting a phase difference being arranged in the second photoelectric conversion layer, the method including a step of setting, among the plurality of phase difference detection pixels, the phase difference detection pixel to be used according to the optical lens or an image height of the optical lens.
  • (17) A program for controlling imaging of an image sensor in which photoelectric conversion layers each having a photoelectric conversion unit divided into pixel units are laminated in two or more layers, the image sensor being configured so that light incident on one pixel of the first photoelectric conversion layer on the side close to an optical lens is received by the photoelectric conversion unit of the second photoelectric conversion layer on the side far from the optical lens, a plurality of phase difference detection pixels for detecting a phase difference being arranged in the second photoelectric conversion layer, the program causing a computer to execute processing including a step of setting, among the plurality of phase difference detection pixels, the phase difference detection pixel to be used according to the optical lens or an image height of the optical lens.
  • (18) An electronic apparatus including an image sensor in which photoelectric conversion layers each having a photoelectric conversion unit divided into pixel units are laminated in two or more layers, the image sensor being configured so that light incident on one pixel of the first photoelectric conversion layer on the side close to an optical lens is received by the photoelectric conversion unit of the second photoelectric conversion layer on the side far from the optical lens, and detecting a phase difference by the photoelectric conversion unit of the second photoelectric conversion layer.

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Power Engineering (AREA)
  • Signal Processing (AREA)
  • Multimedia (AREA)
  • Electromagnetism (AREA)
  • Condensed Matter Physics & Semiconductors (AREA)
  • Computer Hardware Design (AREA)
  • Microelectronics & Electronic Packaging (AREA)
  • Optics & Photonics (AREA)
  • Studio Devices (AREA)
  • Focusing (AREA)
  • Automatic Focus Adjustment (AREA)

Abstract

The present technology relates to an imaging element, an imaging method, a program, and an electronic device that make it possible to achieve accurate focus control. The imaging element has two or more laminated photoelectric conversion layers, each including a photoelectric conversion unit divided so as to correspond to each pixel. The imaging element is configured so that light incident on one pixel in a first photoelectric conversion layer close to an optical lens is received by the photoelectric conversion unit in a second photoelectric conversion layer far from the optical lens. The imaging element detects a phase difference using a signal acquired from the photoelectric conversion unit in the second photoelectric conversion layer. The second photoelectric conversion layer includes a photoelectric conversion unit corresponding to three or more pixels. The second photoelectric conversion layer includes a plurality of pixels for phase difference detection for one of the on-chip lenses arranged in the first photoelectric conversion layer. The invention can be applied, for example, to an imaging element for photography.
PCT/JP2016/064213 2015-05-29 2016-05-13 Imaging element, imaging method, program, and electronic device WO2016194577A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2015-109338 2015-05-29
JP2015109338 2015-05-29

Publications (1)

Publication Number Publication Date
WO2016194577A1 true WO2016194577A1 (fr) 2016-12-08

Family

ID=57442266

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2016/064213 WO2016194577A1 (fr) 2015-05-29 2016-05-13 Élément d'imagerie, procédé d'imagerie, programme, et dispositif électronique

Country Status (1)

Country Link
WO (1) WO2016194577A1 (fr)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007133087A (ja) * 2005-11-09 2007-05-31 Nikon Corp 焦点検出装置、光学システムおよび焦点検出方法
WO2008132812A1 (fr) * 2007-04-20 2008-11-06 Nikon Corporation Élément d'imagerie à l'état solide et dispositif d'imagerie l'utilisant
JP2011103335A (ja) * 2009-11-10 2011-05-26 Fujifilm Corp 撮像素子及び撮像装置
JP2013187475A (ja) * 2012-03-09 2013-09-19 Olympus Corp 固体撮像装置およびカメラシステム
JP2014130890A (ja) * 2012-12-28 2014-07-10 Canon Inc 光電変換装置
US20150146056A1 (en) * 2013-11-27 2015-05-28 Sony Corporation Solid-state imaging device and electronic apparatus
WO2016051594A1 (fr) * 2014-10-03 2016-04-07 オリンパス株式会社 Dispositif d'imagerie à l'état solide, et dispositif d'imagerie
WO2016111004A1 (fr) * 2015-01-09 2016-07-14 オリンパス株式会社 Dispositif d'imagerie à semi-conducteurs


Similar Documents

Publication Publication Date Title
US10204949B2 (en) Backside illumination image sensor and image-capturing device
KR102523203B1 (ko) 고체 화상 센서 및 그 제조 방법, 및 전자 장치
US10827112B2 (en) Image pickup device and electronic apparatus with an image plane phase detection difference detection pixel
US7483065B2 (en) Multi-lens imaging systems and methods using optical filters having mosaic patterns
US20170034456A1 (en) Sensor assembly with selective infrared filter array
JP2023014112A (ja) 固体撮像素子、および電子機器
US10686004B2 (en) Image capturing element and image capturing device image sensor and image-capturing device
CN104681572A (zh) 固态成像装置和电子设备
WO2017126376A1 (fr) Capteur d'image, procédé de fabrication et dispositif électronique
WO2015159727A1 (fr) Dispositif de détection de mise au point et dispositif électronique
US10319764B2 (en) Image sensor and electronic device
JP2015167219A (ja) 撮像素子、製造装置、電子機器
US10854664B2 (en) Solid-state image pickup device and method for manufacturing the same, and electronic apparatus
WO2016052249A1 (fr) Élément de formation d'image à semi-conducteurs, procédé de production et dispositif électronique
US8654224B2 (en) Composite imaging element and imaging device equipped with same
JP2016127043A (ja) 固体撮像素子及び電子機器
JP6764571B2 (ja) 固体撮像素子、撮像装置、並びに電子機器
WO2017130725A1 (fr) Dispositif de détection de point focal et un dispositif d'imagerie
JP2013057769A (ja) 固体撮像装置、撮像装置、合焦制御方法、及び、プログラム
WO2016194577A1 (fr) Élément d'imagerie, procédé d'imagerie, programme, et dispositif électronique

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16803014

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

NENP Non-entry into the national phase

Ref country code: JP

122 Ep: pct application non-entry in european phase

Ref document number: 16803014

Country of ref document: EP

Kind code of ref document: A1