WO2017169479A1 - Image Sensor and Imaging Device - Google Patents
Image Sensor and Imaging Device
- Publication number
- WO2017169479A1 (PCT/JP2017/007936)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- pixel
- photoelectric conversion
- unit
- light
- imaging
- Prior art date
Classifications
- H—ELECTRICITY
- H01—ELECTRIC ELEMENTS
- H01L—SEMICONDUCTOR DEVICES NOT COVERED BY CLASS H10
- H01L27/00—Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate
- H01L27/14—Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate including semiconductor components sensitive to infrared radiation, light, electromagnetic radiation of shorter wavelength or corpuscular radiation and specially adapted either for the conversion of the energy of such radiation into electrical energy or for the control of electrical energy by such radiation
- H01L27/144—Devices controlled by radiation
- H01L27/146—Imager structures
- H01L27/14601—Structural or functional details thereof
- H01L27/1464—Back illuminated imager structures
- H01L27/14603—Special geometry or disposition of pixel-elements, address-lines or gate-electrodes
- H01L27/14605—Structural or functional details relating to the position of the pixel elements, e.g. smaller pixel elements in the center of the imager compared to pixel elements at the periphery
- H01L27/1462—Coatings
- H01L27/14621—Colour filter arrangements
- H01L27/14625—Optical elements or arrangements associated with the device
- H01L27/14629—Reflectors
- H01L27/14627—Microlenses
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N25/00—Circuitry of solid-state image sensors [SSIS]; Control thereof
- H04N25/70—SSIS architectures; Circuits associated therewith
Definitions
- the present invention relates to an imaging element and an imaging apparatus.
- An imaging device is known in which a reflective layer is provided below a photoelectric conversion unit, and light that has passed through the photoelectric conversion unit is reflected back to the photoelectric conversion unit by the reflective layer (Patent Document 1).
- in that device, however, the reflective layer is provided only at the same position with respect to every photoelectric conversion unit.
- according to one aspect, the imaging element is an imaging element in which a plurality of pixels are arranged in a first direction, comprising: a first pixel having a first photoelectric conversion unit that photoelectrically converts incident light to generate charge, and a first reflection unit that is provided at least partly in a region on the first-direction side of the center of the first photoelectric conversion unit, on a plane intersecting the direction in which light is incident, and that reflects part of the light transmitted through the first photoelectric conversion unit back to the first photoelectric conversion unit; and a second pixel having a second photoelectric conversion unit that photoelectrically converts incident light to generate charge, and a second reflection unit that is provided at least partly in a region on the side opposite to the first direction from the center of the second photoelectric conversion unit, on the plane intersecting the direction in which light is incident, and that reflects part of the light transmitted through the second photoelectric conversion unit back to the second photoelectric conversion unit.
- according to another aspect, the imaging apparatus comprises the imaging element, which captures an image formed by an optical system having a focus lens, and a control unit that controls the position of the focus lens, based on a signal output from the first pixel and a signal output from the second pixel of the imaging element, so that the image formed by the optical system is focused on the imaging element.
- FIG. 1 is a main-part configuration diagram of an imaging apparatus according to a first embodiment.
- FIG. 2 is a diagram showing an example of the pixel arrangement of the image sensor according to the first embodiment.
- FIG. 3 is a diagram for explaining the light beams incident on the image sensor according to the first embodiment.
- FIG. 4 is a diagram illustrating an example of a cross-sectional structure of the image sensor according to the first embodiment.
- FIG. 5 is a diagram illustrating an example of a cross-sectional structure of an image sensor according to Modification 1.
- FIG. 6 is a diagram illustrating an example of a cross-sectional structure of an image sensor according to Modification 2.
- FIG. 7 is a diagram illustrating an example of the pixel arrangement of an image sensor according to Modification 3.
- FIG. 8 is a diagram illustrating an example of a cross-sectional structure of an image sensor according to Modification 3.
- FIG. 9 is a diagram illustrating another example of a cross-sectional structure of an image sensor according to Modification 3.
- FIG. 10 is a diagram illustrating an example of a cross-sectional structure of an image sensor according to Modification 4.
- FIG. 11 is a diagram illustrating another example of a cross-sectional structure of an image sensor according to Modification 4.
- FIG. 1 is a main part configuration diagram of a digital camera 1 (hereinafter referred to as a camera 1) which is an imaging apparatus according to the first embodiment.
- the camera 1 includes a camera body 2 and an interchangeable lens 3.
- the interchangeable lens 3 is attached to the camera body 2 via a mount portion (not shown).
- when the interchangeable lens 3 is attached, the connection portion 202 on the camera body 2 side and the connection portion 302 on the interchangeable lens 3 side are connected, enabling communication between the camera body 2 and the interchangeable lens 3.
- the interchangeable lens 3 includes an imaging optical system 31, a lens control unit 32, and a lens memory 33.
- the imaging optical system 31 includes a plurality of lenses including a focus adjustment lens (focus lens) and an aperture, and forms a subject image on the imaging surface of the imaging element 22 of the camera body 2.
- the lens control unit 32 adjusts the focal position of the imaging optical system 31 by moving the focus adjustment lens forward and backward in the direction of the optical axis L1 based on a signal output from the body control unit 21 of the camera body 2.
- the signal output from the body control unit 21 includes a signal representing the movement amount, movement direction, movement speed, and the like of the focus adjustment lens.
- the lens memory 33 is configured by a non-volatile storage medium or the like, and stores information related to the interchangeable lens 3, for example, lens information such as information related to the position of the exit pupil of the imaging optical system 31.
- the lens information stored in the lens memory 33 is read by the lens control unit 32 and transmitted to the body control unit 21.
- the camera body 2 includes a body control unit 21, an image sensor 22, a memory 23, a display unit 24, and an operation unit 25.
- the image sensor 22 is an image sensor such as a CCD or a CMOS, and a plurality of pixels are arranged two-dimensionally (row direction and column direction) on the image sensor 22.
- the imaging element 22 receives the light beam that has passed through the exit pupil of the imaging optical system 31, generates a signal corresponding to the amount of received light, and outputs the generated signal to the body control unit 21.
- the plurality of pixels of the image sensor 22 have, for example, R (red), G (green), and B (blue) color filters, respectively. Each pixel captures a subject image through a color filter.
- a signal output from the image sensor 22 and RGB color information are input to the body control unit 21.
- the body control unit 21 includes a CPU, a ROM, a RAM, and the like, and controls each unit of the camera 1 based on a control program.
- the body control unit 21 performs various signal processing.
- the body control unit 21 supplies a control signal to the image sensor 22 to control the operation of the image sensor 22.
- the body control unit 21 includes a focus detection unit 21a and an image data generation unit 21b.
- the image data generation unit 21b performs various kinds of image processing on the signal output from the image sensor 22 to generate image data.
- the focus detection unit 21a calculates a defocus amount by a pupil-division phase difference detection method using signals from the image sensor 22, and transmits the defocus amount to the lens control unit 32.
- the defocus amount is a signal for adjusting the focus position of the imaging optical system 31.
- the focus detection unit 21a calculates the amount of deviation between the image-forming surface of the imaging optical system 31 and the imaging surface of the image sensor 22 using the signals output from the image sensor 22.
- the body control unit 21 calculates the movement amount and movement direction of the focus adjustment lens from the deviation amount.
- the body control unit 21 transmits information regarding the calculated amount and direction of movement of the focus adjustment lens to the lens control unit 32 via the connection unit 202 and the connection unit 302.
- the lens control unit 32 drives a motor (not shown) based on the information transmitted from the body control unit 21, and moves the focus adjustment lens to the in-focus position, that is, the position where the image formed by the imaging optical system 31 is formed on the imaging surface of the imaging element 22.
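The body-side computation described above (turning a defocus amount into a lens movement amount and direction) can be sketched as follows. This is only an illustration of the idea: the conversion factor `k`, the sign convention, and the function name are hypothetical assumptions, not values or names from the patent.

```python
def defocus_to_lens_drive(defocus_mm: float, k: float = 2.0):
    """Convert a defocus amount (image-plane deviation, in mm) into a
    lens movement command. k is a hypothetical sensitivity factor
    relating image-plane shift to focus-lens travel."""
    move_amount = abs(defocus_mm) * k
    # Sign convention (assumed for this sketch): positive defocus means
    # back focus, so the focus lens must move toward the subject.
    move_direction = "toward_subject" if defocus_mm > 0 else "toward_image"
    return move_amount, move_direction

amount, direction = defocus_to_lens_drive(0.5)
```

In the camera described here, the resulting amount and direction would be what the body control unit 21 sends to the lens control unit 32 over the connection portions 202 and 302.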
- the memory 23 is a recording medium such as a memory card, and the body control unit 21 writes and reads image data and audio data.
- the display unit 24 displays an image generated from the image data generated by the body control unit 21. Further, the display unit 24 displays information related to shooting such as a shutter speed and an aperture value, a menu screen, and the like.
- the operation unit 25 includes a release button, a recording button, various setting switches, and the like, and outputs an operation signal corresponding to the operation of the operation unit 25 to the body control unit 21.
- FIG. 2 is a diagram illustrating an arrangement example of pixels of the image sensor 22 according to the first embodiment.
- the pixels are arranged two-dimensionally (row direction and column direction).
- Each pixel is provided with one of three color filters having different spectral characteristics of R (red), G (green), and B (blue), for example.
- the R color filter mainly transmits light in the red wavelength region
- the G color filter mainly transmits light in the green wavelength region
- the B color filter mainly transmits light in the blue wavelength region.
- Each pixel has different spectral characteristics depending on the color filter to be arranged.
- a pixel group 402, in which R pixels, G pixels, and B pixels are alternately arranged, is repeated two-dimensionally.
- the R pixel, the G pixel, and the B pixel are arranged according to the Bayer array.
- the imaging element 22 includes the R, G, and B imaging pixels 12 arranged in a Bayer array as described above, and the imaging and focus detection pixels 11 and 13, which are arranged in place of some of the imaging pixels 12.
- the imaging pixel 12 outputs a signal for the image data generation unit 21b to generate image data, that is, an imaging signal.
- the imaging and focus detection pixels 11 and 13 output an imaging signal and a signal for the focus detection unit 21a to calculate a defocus amount, that is, a focus detection signal.
- the imaging and focus detection pixels 11 and 13 may be focus detection pixels that output only a focus detection signal for the focus detection unit 21a to calculate the defocus amount.
- the focus detection unit 21 a calculates the defocus amount based on the focus detection signals output from the focus detection pixels 11 and 13.
- since the imaging and focus detection pixels 11 and 13 are arranged in place of G imaging pixels in the row direction, they have G spectral characteristics. In other words, the imaging and focus detection pixels 11 and 13 each output an imaging signal based on charge generated by photoelectrically converting light in the G wavelength range. The imaging and focus detection pixels 11 and 13 are arranged alternately in the row direction (the X-axis direction in FIG. 2) with an R imaging pixel 12 interposed between them. The signal of the imaging and focus detection pixel 11 and the signal of the imaging and focus detection pixel 13 are used as a pair of focus detection signals for the phase-difference focus detection described later. Note that the pixels 11 and 13 may be arranged at an arbitrary interval. Further, the pixels 11, 12, and 13 may be arranged as a unit at an arbitrary interval in the row direction, the column direction, or both.
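The arrangement above can be sketched as a small map-building exercise: a Bayer color-filter map in which some G sites of one row are replaced by the two phase-detection pixel types, each pair separated by an R imaging pixel. The dimensions, function name, and labels are illustrative assumptions, not from the patent.

```python
# Illustrative sketch (assumed names and sizes): build a Bayer map and
# replace G sites of one R-G row with pixel types 11 and 13, so that
# consecutive 11/13 pixels are separated by an R imaging pixel.
def make_pixel_map(rows: int, cols: int, af_row: int):
    # Bayer pattern: even rows alternate G/R, odd rows alternate B/G.
    bayer = [["G" if (r + c) % 2 == 0 else ("R" if r % 2 == 0 else "B")
              for c in range(cols)] for r in range(rows)]
    # In the chosen AF row (an R-G row), alternate 11 and 13 on G sites.
    toggle = True
    for c in range(cols):
        if af_row % 2 == 0 and bayer[af_row][c] == "G":
            bayer[af_row][c] = "11(G)" if toggle else "13(G)"
            toggle = not toggle
    return bayer

pixel_map = make_pixel_map(4, 8, af_row=0)
```

Printing `pixel_map` row by row shows the 11(G) R 13(G) R 11(G) … pattern in the AF row, matching the alternation with an interposed R pixel described in the text.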
- the imaging pixel 12 includes a microlens 40 and a photoelectric conversion unit 41.
- the imaging and focus detection pixels 11 and 13 include reflection units 42A and 42B, respectively.
- in the light incidence direction, the constituent elements are arranged in the order: microlens 40, photoelectric conversion unit 41, and reflection unit 42A or 42B.
- the imaging and focus detection pixel 11 and the imaging and focus detection pixel 13 differ in the position of the reflecting portion 42A or 42B.
- the imaging and focus detection pixels 11 and 13 may instead be arranged alternately in the row direction with two R imaging pixels and a G imaging pixel 12 interposed between them. In this case, the G imaging pixel 12 sandwiched between the imaging and focus detection pixels 11 and 13 has neither the reflecting portion 42A nor the reflecting portion 42B.
- the reflection part 42A of the imaging and focus detection pixel 11 is arranged to correspond to approximately the left half of the photoelectric conversion part 41.
- the reflection part 42B of the imaging and focus detection pixel 13 is arranged to correspond to approximately the right half of the photoelectric conversion part 41.
- the reflection unit 42A is arranged to correspond to one of the two regions obtained by dividing the photoelectric conversion unit 41 in the direction in which the imaging and focus detection pixels 11 and 13 are arranged (divided with the Y axis shown in FIG. 2 as the dividing axis).
- the reflection part 42B is arranged to correspond to the other of the two regions.
- the imaging and focus detection pixels 11 and 13 are arranged in the row direction (the X-axis direction in FIG. 2), that is, horizontally, but they may instead be arranged in the column direction (the Y-axis direction in FIG. 2), that is, vertically.
- in that case, the reflection part 42A is arranged to correspond to one of the upper and lower halves of the photoelectric conversion part 41,
- and the reflection part 42B is arranged to correspond to the other of the upper and lower halves of the photoelectric conversion part 41.
- that is, the reflection unit 42A corresponds to one of the two regions obtained by dividing the photoelectric conversion unit 41 in the direction crossing the arrangement direction of the imaging and focus detection pixels 11 and 13 (divided with the X axis shown in FIG. 2 as the dividing axis),
- and the reflection part 42B corresponds to the other of the two regions.
- FIG. 3 is a diagram for explaining a light beam incident on the image sensor according to the first embodiment.
- FIG. 3 shows one imaging pixel 12, one imaging / focus detection pixel 11, and one imaging / focus detection pixel 13.
- the imaging pixel 12 includes the microlens 40 and the photoelectric conversion unit 41 that receives the light beam transmitted through the microlens 40.
- the imaging and focus detection pixels 11 and 13 each include the microlens 40, the photoelectric conversion unit 41 on which the light beam transmitted through the microlens 40 is incident, and a reflecting portion 42A or 42B that reflects the light beam transmitted through part of the photoelectric conversion unit 41 back toward the photoelectric conversion unit 41.
- the imaging pixel 12 illustrated in FIG. 3 is the G imaging pixel 12.
- G imaging pixels 12 are arranged around the imaging and focus detection pixel 11 and the imaging and focus detection pixel 13.
- G imaging pixels 12 are also arranged between the imaging and focus detection pixel 11 and the imaging and focus detection pixel 13.
- in the imaging and focus detection pixel 11, the first light beam that has passed through the first pupil region 61 and the second light beam that has passed through the second pupil region 62 are incident on the photoelectric conversion unit 41 via the microlens 40.
- of these, the second light beam passes through the photoelectric conversion unit 41, is reflected by the reflection unit 42A, and re-enters the photoelectric conversion unit 41.
- in the imaging and focus detection pixel 13, the first light flux that has passed through the first pupil region 61 and the second light flux that has passed through the second pupil region 62 are likewise incident on the photoelectric conversion unit 41 via the microlens 40.
- a broken line 65 schematically shows the first light flux that has passed through the first pupil region 61, transmitted through the microlens 40 and the photoelectric conversion unit 41 of the imaging and focus detection pixel 13, and been reflected by the reflection unit 42B.
- in the imaging pixel 12, the light beam that has passed through both the first pupil region 61 and the second pupil region 62 of the exit pupil 60 of the imaging optical system 31 in FIG. 1 enters the photoelectric conversion unit 41 through the microlens 40.
- the imaging pixel 12 outputs a signal S1 related to the light flux that has passed through both the first and second pupil regions 61 and 62. That is, the imaging pixel 12 photoelectrically converts light that has passed through both the first and second pupil regions 61 and 62, and outputs a signal S1 based on the charge generated by the photoelectric conversion.
- the imaging and focus detection pixel 11 outputs a signal (S1 + S2) obtained by adding the signal S1, based on the charge obtained by photoelectrically converting the first and second light fluxes that have passed through the first pupil region 61 and the second pupil region 62, and the signal S2, based on the charge obtained by photoelectrically converting the second light flux reflected by the reflecting unit 42A.
- the imaging and focus detection pixel 13 outputs a signal (S1 + S3) obtained by adding the signal S1, based on the charge obtained by photoelectrically converting the first and second light fluxes that have passed through the first pupil region 61 and the second pupil region 62, and the signal S3, based on the charge obtained by photoelectrically converting the first light flux reflected by the reflecting unit 42B.
- the image data generation unit 21b of the body control unit 21 generates image data related to the subject image based on the signal S1 of the imaging pixel 12 and the signals (S1 + S2) and (S1 + S3) of the imaging and focus detection pixels 11 and 13.
- when generating image data, the gains of the signals (S1 + S2) and (S1 + S3) of the imaging and focus detection pixels 11 and 13 are preferably made smaller than the gain of the signal S1 of the imaging pixel 12.
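That gain adjustment can be sketched as follows. The gain value, pixel-type encoding, and function name are illustrative assumptions for this sketch only; the patent does not specify numeric gains.

```python
# Illustrative sketch: when building image data, apply a smaller gain
# to the focus-detection-capable pixels (types 11 and 13), whose
# outputs include an extra reflected-light component, than to the
# ordinary imaging pixels (type 12). The gain value 0.5 is assumed.
def normalize_for_image(value: float, pixel_type: int, af_gain: float = 0.5):
    """Apply a reduced gain to pixel types 11/13 relative to type 12."""
    if pixel_type in (11, 13):   # imaging and focus detection pixels
        return value * af_gain   # gain < 1 compared with pixel 12
    return value                 # imaging pixel 12: unity gain

pixel_11_out = normalize_for_image(120.0, 11)
pixel_12_out = normalize_for_image(100.0, 12)
```

In practice the gain would be chosen so that the brightness of the 11/13 pixels matches that of surrounding imaging pixels 12 in the generated image.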
- the focus detection unit 21a of the body control unit 21 performs a correlation calculation to obtain the in-focus position, based on the signal S1 of the imaging pixel 12, the signal (S1 + S2) of the imaging and focus detection pixel 11, and the signal (S1 + S3) of the imaging and focus detection pixel 13.
- by this correlation calculation, the focus detection unit 21a calculates the amount of deviation between the image of the first light beam that has passed through the first pupil region 61 and the image of the second light beam that has passed through the second pupil region 62, and calculates a defocus amount based on this image shift amount.
- the focus detection unit 21a obtains the difference between the output of the imaging pixel 12 and the output of the imaging and focus detection pixel 11, and the difference between the output of the imaging pixel 12 and the output of the imaging and focus detection pixel 13.
- from the obtained differences, the focus detection unit 21a calculates the image shift amount between the image of the first light beam that has passed through the first pupil region 61 and the image of the second light beam that has passed through the second pupil region 62.
- a defocus amount is then calculated based on the calculated image shift amount. For example, from the signal S1 and the signal (S1 + S2), the focus detection unit 21a obtains the signal S2 based on the charge obtained by photoelectrically converting the second light beam reflected by the reflecting portion 42A.
- the focus detection unit 21a obtains a signal S3 based on the charge obtained by photoelectrically converting the first light beam reflected by the reflection unit 42B from the signal S1 and the signal (S1 + S3).
- the focus detection unit 21a detects the phase difference between the signal S2 and the signal S3 to obtain the defocus amount.
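The extraction and phase-difference detection described above can be sketched numerically. The pixel data, window size, and the use of a simple SAD (sum of absolute differences) search are illustrative assumptions; the patent does not specify a particular correlation algorithm.

```python
# Illustrative sketch: extract the reflected-light components
# S2 = (S1+S2) - S1 and S3 = (S1+S3) - S1 along a row of pixels,
# then estimate the shift between the two sequences with a SAD search.
def extract_components(s1, s1_plus_s2, s1_plus_s3):
    """Element-wise subtraction recovers S2 and S3 from pixel outputs."""
    s2 = [a - b for a, b in zip(s1_plus_s2, s1)]
    s3 = [a - b for a, b in zip(s1_plus_s3, s1)]
    return s2, s3

def image_shift(s2, s3, max_shift=3):
    """Return the shift of s3 relative to s2 that minimizes the mean SAD."""
    best_shift, best_sad = 0, float("inf")
    n = len(s2)
    for d in range(-max_shift, max_shift + 1):
        pairs = [(s2[i], s3[i + d]) for i in range(n) if 0 <= i + d < n]
        sad = sum(abs(a - b) for a, b in pairs) / len(pairs)
        if sad < best_sad:
            best_shift, best_sad = d, sad
    return best_shift

# Assumed toy data: S2 and S3 carry the same feature displaced by 2 pixels.
s1 = [10, 10, 10, 10, 10, 10]
s1_plus_s2 = [10, 14, 18, 14, 10, 10]
s1_plus_s3 = [10, 10, 10, 14, 18, 14]
s2, s3 = extract_components(s1, s1_plus_s2, s1_plus_s3)
shift = image_shift(s2, s3)
```

The recovered shift plays the role of the image shift amount from which the defocus amount is computed.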
- the imaging / focus detection pixel 11 and the imaging / focus detection pixel 13 are provided with reflecting portions 42A and 42B at positions different from each other in the direction intersecting the light incident direction.
- the reflective portion 42A of the imaging and focus detection pixel 11 reflects light incident through the pupil region 62,
- and the reflective portion 42B of the imaging and focus detection pixel 13 reflects light incident through the pupil region 61.
- as a result, the amount of light received through the pupil region 62 increases in the photoelectric conversion unit 41 of the imaging and focus detection pixel 11,
- and the amount of light received through the pupil region 61 increases in the photoelectric conversion unit 41 of the imaging and focus detection pixel 13.
- consequently, in the signal output from the imaging and focus detection pixel 11, the signal component due to light passing through the pupil region 62 increases,
- and in the signal output from the imaging and focus detection pixel 13, the signal component due to light passing through the pupil region 61 increases.
- the phase difference information of the subject image can be obtained, and the defocus amount can be calculated.
- the phase difference of the subject image is detected using the reflected light, without providing a light-shielding film for phase difference detection on the light incident surface as in the prior art. For this reason, the aperture of each pixel is not reduced.
- the sensitivity (quantum efficiency) of the photoelectric conversion unit 41 of the pixel can be improved.
- FIG. 4 is a diagram illustrating an example of a cross-sectional structure of the image sensor according to the first embodiment.
- the image sensor 22 shown in FIG. 4 is a back-illuminated image sensor.
- the image sensor 22 includes a first substrate (also referred to as a semiconductor layer) 111 and a second substrate 114.
- the first substrate 111 is composed of a semiconductor substrate
- the second substrate 114 is composed of a semiconductor substrate, a glass substrate, or the like.
- the second substrate 114 functions as a support substrate for the first substrate 111.
- the first substrate 111 is stacked on the second substrate 114 via the adhesive layer 113.
- the incident light is incident mainly in the positive direction of the Z-axis indicated by a white arrow.
- toward the X-axis plus direction, an imaging and focus detection pixel 11 (hereinafter also referred to as imaging and focus detection pixel 11A), an imaging pixel 12 (hereinafter also referred to as imaging pixel 12A), an imaging and focus detection pixel 13, an imaging pixel 12 (hereinafter also referred to as imaging pixel 12B), and an imaging and focus detection pixel 11 (hereinafter also referred to as imaging and focus detection pixel 11B) are disposed in this order.
- the imaging and focus detection pixel 11, the imaging pixel 12, and the imaging and focus detection pixel 13 are each provided with a microlens 40, a color filter 43, a light shielding film 44, an antireflection film 45, a p+ layer 46, and a diffusion separation unit 57.
- the microlens 40 condenses incident light on the photoelectric conversion unit 41.
- a G color filter is provided in the imaging / focus detection pixel 11
- an R color filter is provided in the imaging pixel 12
- a G color filter is provided in the imaging / focus detection pixel 13.
- the light shielding film 44 prevents light from leaking to adjacent pixels.
- the p + layer 46 is formed using p-type impurities, and reduces the mixing of dark current into the photoelectric conversion unit 41.
- the diffusion separation unit 57 separates the photoelectric conversion units 41.
- the first substrate 111 has a first surface 105a on which an electrode and an insulating film are provided, and a second surface 105b different from the first surface.
- the second surface 105b is an incident surface on which light is incident.
- a wiring layer 112 is stacked on the first surface 105 a of the first substrate 111.
- the imaging element 22 includes a photoelectric conversion unit 41 and an output unit 70, and a plurality of photoelectric conversion units 41 and output units 70 are arranged in the X-axis direction and the Y-axis direction.
- the photoelectric conversion unit 41 is, for example, a photodiode (PD), and converts incident light into electric charges.
- the output unit 70 generates a signal from the charge photoelectrically converted by the photoelectric conversion unit 41 and outputs the signal.
- the output unit 70 outputs the generated signal to the wiring layer 112.
- the output unit 70 includes a transfer transistor (transfer unit) 50, a transistor such as an amplification transistor, and the like.
- the n + region 47 and the n + region 48 formed in the semiconductor layer 111 are each formed using n-type impurities and function as the source / drain regions of the transfer transistor 50.
- the electrode 49 formed on the wiring layer 112 with an insulating film functions as a gate electrode (transfer gate) of the transfer transistor 50.
- n + region 47 also functions as a part of the photoelectric conversion unit 41.
- the electrode 49 is connected to the wiring 52 of the output unit 70 provided on the metal layer 115 via the contact 51.
- the imaging and focus detection pixels 11 and 13 and the imaging pixel 12 are arranged in the row direction (X-axis direction), and the wirings 52 of the imaging and focus detection pixels 11 and 13 and the imaging pixel 12 are connected to each other and shared.
- the wiring layer 112 is a wiring layer including a conductor film (metal film) and an insulating film, and a plurality of wirings and vias are arranged therein. Copper, aluminum, or the like is used for the conductor film.
- the insulating films include interlayer insulating films between the conductor films, and are configured by an oxide film, a nitride film, or the like.
- the wiring layer 112 is provided with a reflective portion 42A and a reflective portion 42B. The reflective portion 42A and the reflective portion 42B are provided in the metal layer 115.
- the reflecting portion 42A and the reflecting portion 42B are made of a conductor film or the like, and are, for example, aluminum, copper, tungsten, or a multilayer film of these films.
- the reflective portion 42A and the reflective portion 42B are each configured by a conductor film that covers almost half of the photoelectric conversion portion 41 in the metal layer 115.
- a part of the wiring formed in the wiring layer 112, for example, a part of the signal line connected to the output unit 70 can be used as the reflecting unit 42.
- the reflection part 42A and the reflection part 42B are shared by a conductor film for reflecting light and a signal line for transmitting a signal.
- a part of the insulating film or metal layer used for the output unit 70 can be used as the reflection unit 42.
- the reflective portion 42A and the reflective portion 42B are provided at different intervals from the adjacent pixels in the direction intersecting with the light incident direction.
- the reflection part 42A of the imaging / focus detection pixel 11A is provided at a predetermined first interval D1 from the imaging pixel 12A adjacent to the imaging / focus detection pixel 11A in the X-axis direction.
- the reflection part 42B of the imaging / focus detection pixel 13 is provided at a predetermined second interval D2 different from the first interval D1 from the imaging pixel 12B adjacent to the imaging / focus detection pixel 13 in the X-axis direction.
- the distance from the adjacent pixel may be the distance from the diffusion separation unit 57.
- the first interval D1 may be the interval between the diffusion separation unit 57 and the reflection unit 42A of the imaging pixel 12A.
- the second interval D2 may be the interval between the diffusion separation unit 57 and the reflection unit 42B of the imaging pixel 12B. Further, the first interval D1 and the second interval D2 may have no interval (zero).
- the reflection unit 42A and the reflection unit 42B are provided between the output unit 70 of the imaging / focus detection pixel 11 and the output unit 70 of the imaging / focus detection pixel 13.
- the reflection unit 42A of the imaging / focus detection pixel 11B and the reflection unit 42B of the imaging / focus detection pixel 13 are provided between the output unit 70 of the imaging / focus detection pixel 11B and the output unit 70 of the imaging / focus detection pixel 13.
- the output unit 70 of the imaging / focus detection pixel 11 and the output unit 70 of the imaging / focus detection pixel 13 are provided between the reflection unit 42A and the reflection unit 42B.
- the output unit 70 of the imaging / focus detection pixel 11A and the output unit 70 of the imaging / focus detection pixel 13 are provided between the reflection unit 42A of the imaging / focus detection pixel 11A and the reflection unit 42B of the imaging / focus detection pixel 13.
- the reflection part 42A and the reflection part 42B are provided at different intervals from the diffusion separation part 57 between the photoelectric conversion part of the adjacent pixel.
- the reflection unit 42A of the imaging / focus detection pixel 11A is provided at a predetermined third interval from the diffusion separation unit 57 between the photoelectric conversion unit 41 of the imaging / focus detection pixel 11A and the photoelectric conversion unit 41 of the adjacent imaging pixel 12A.
- the reflection unit 42B of the imaging / focus detection pixel 13 is provided at a predetermined fourth interval, different from the third interval, from the diffusion separation unit 57 between the photoelectric conversion unit 41 of the imaging / focus detection pixel 13 and the photoelectric conversion unit 41 of the adjacent imaging pixel 12B.
- the photoelectric conversion unit 41 of the imaging / focus detection pixel 11 is provided between the second surface 105b serving as the light incident surface of the imaging / focus detection pixel 11 and the reflection unit 42A.
- the photoelectric conversion unit 41 is provided between the second surface 105b serving as the light incident surface of the imaging / focus detection pixel 13 and the reflection unit 42B.
- the photoelectric conversion unit 41 and the reflection unit 42A of the imaging / focus detection pixel 11 are provided in this order in the light incident direction (Z-axis plus direction).
- the photoelectric conversion unit 41 and the reflection unit 42B of the imaging / focus detection pixel 13 are sequentially provided in the direction in which light enters.
- the reflection unit 42A and the reflection unit 42B are provided between the wiring 52 of the output unit 70 of the imaging / focus detection pixel 11 and the wiring 52 of the output unit 70 of the imaging / focus detection pixel 13.
- the reflection unit 42A of the imaging / focus detection pixel 11B and the reflection unit 42B of the imaging / focus detection pixel 13 are the wiring 52 of the output unit 70 of the imaging / focus detection pixel 11B and the output of the imaging / focus detection pixel 13. It is provided between the wirings 52 of the unit 70.
- the output units 70 of the imaging pixel 12A and the imaging pixel 12B are provided between the reflecting unit 42A and the reflecting unit 42B, respectively.
- the wiring 52 of the output unit 70 of the imaging pixel 12A and the imaging pixel 12B is provided between the reflecting unit 42A and the reflecting unit 42B, respectively.
- the imaging pixel 12A is provided between the reflection unit 42A of the imaging / focus detection pixel 11A and the reflection unit 42B of the imaging / focus detection pixel 13, and the imaging pixel 12B is provided between the reflection unit 42B of the imaging / focus detection pixel 13 and the reflection unit 42A of the imaging / focus detection pixel 11B.
- the reflective portion 42A is provided at least partly in the left-half region (the X-axis minus direction side) of the regions divided by a line passing through the center of the photoelectric conversion portion 41 and parallel to the Y axis, in the plane (XY plane) intersecting the light incident direction (Z-axis plus direction).
- the reflective portion 42B is provided at least partly in the right-half region (the X-axis plus direction side) of the regions divided by a line passing through the center of the photoelectric conversion portion 41 and parallel to the Y axis, in the plane (XY plane) intersecting the light incident direction (Z-axis plus direction). Note that the reflective portion 42A may instead be provided at least partly in the upper-half region (the Y-axis minus direction side) of the regions divided by a line passing through the center of the photoelectric conversion portion 41 and parallel to the X axis, in the plane (XY plane) intersecting the light incident direction (Z-axis plus direction).
- the reflective portion 42B may then be provided at least partly in the lower-half region (the Y-axis plus direction side) of the regions divided by a line passing through the center of the photoelectric conversion portion 41 and parallel to the X axis, in the plane (XY plane) intersecting the light incident direction (Z-axis plus direction).
- the reflection part 42A and the reflection part 42B reflect the light transmitted through the photoelectric conversion part 41 to the photoelectric conversion part 41 side.
- the photoelectric conversion units 41 of the imaging / focus detection pixel 11 and the imaging / focus detection pixel 13 receive the light incident through the microlens 40 and the light reflected by the reflection unit 42A or the reflection unit 42B, and generate electric charge according to the amount of received light.
- the photoelectric conversion unit 41 of the imaging pixel 12 receives light incident through the microlens 40 and generates a charge corresponding to the amount of received light.
- the output unit 70 outputs a signal based on charges from the photoelectric conversion unit 41 to the wiring layer 112. A signal from each pixel output to the wiring layer 112 is subjected to signal processing such as AD conversion by a peripheral circuit of the pixel and the like, and is output to the body control unit 21 shown in FIG.
- the focus detection unit 21a of the body control unit 21 calculates the defocus amount using signals based on the photoelectrically converted charges output from the imaging element 22. For example, the focus detection unit 21a subtracts the signal S1 of the imaging pixel 12 from the signal (S1 + S2) of the imaging / focus detection pixel 11 to obtain the signal S2. Likewise, the focus detection unit 21a subtracts the signal S1 of the imaging pixel 12 from the signal (S1 + S3) of the imaging / focus detection pixel 13 to obtain the signal S3. By performing a correlation calculation based on the signal S2 and the signal S3, the focus detection unit 21a obtains phase difference information of the images formed by a pair of light beams incident through different pupil regions of the imaging optical system 31, and can calculate the defocus amount by the phase difference detection method. Further, the lens control unit 32 can adjust the focal position of the imaging optical system 31 using the defocus amount output from the body control unit 21.
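The subtraction and correlation calculation described above can be sketched in a few lines. This is an illustrative sketch only: the function names, the absolute-difference correlation search, and the shift-to-defocus conversion factor are assumptions for demonstration, not values taken from this disclosure.

```python
# Illustrative sketch of the focus detection unit 21a's calculation:
# S2 = (S1 + S2) - S1 and S3 = (S1 + S3) - S1, followed by a correlation
# search for the phase difference between the S2 and S3 waveforms.

def extract_reflected(af_signals, imaging_signals):
    """Subtract the imaging-pixel signal S1 from the AF-pixel signal (S1 + Sx)."""
    return [af - s1 for af, s1 in zip(af_signals, imaging_signals)]

def correlate_shift(s2, s3, max_shift=4):
    """Return the shift of s3 relative to s2 that minimizes the mean
    absolute difference (a simple correlation calculation)."""
    best_shift, best_cost = 0, float("inf")
    n = len(s2)
    for shift in range(-max_shift, max_shift + 1):
        pairs = [(s2[i], s3[i + shift]) for i in range(n) if 0 <= i + shift < n]
        cost = sum(abs(a - b) for a, b in pairs) / len(pairs)
        if cost < best_cost:
            best_shift, best_cost = shift, cost
    return best_shift

# Synthetic data: a feature at index 5 in S2 appears at index 7 in S3,
# i.e. a phase difference of 2 pixel pitches.
s1 = [10] * 12                                  # imaging pixel 12 outputs S1
s2_true = [0] * 12; s2_true[5] = 50
s3_true = [0] * 12; s3_true[7] = 50
pixel11 = [a + b for a, b in zip(s1, s2_true)]  # pixel 11 outputs S1 + S2
pixel13 = [a + b for a, b in zip(s1, s3_true)]  # pixel 13 outputs S1 + S3

s2 = extract_reflected(pixel11, s1)
s3 = extract_reflected(pixel13, s1)
shift = correlate_shift(s2, s3)                 # -> 2
defocus = shift * 0.01  # hypothetical shift-to-defocus conversion factor
```

In the actual device the correlation would run over rows of many pixel pairs, and the conversion from image shift to defocus amount would depend on the pupil geometry of the imaging optical system 31; both are simplified here.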
- the image pickup device 22 includes: a first pixel 11 having a first photoelectric conversion unit 41 that photoelectrically converts incident light to generate charge, and a first reflection part 42A, provided at a first interval D1 from the adjacent pixel 12A, that reflects light that has passed through the first photoelectric conversion unit 41; and a second pixel 13 having a second photoelectric conversion unit 41 that photoelectrically converts incident light to generate charge, and a second reflection part 42B, provided at a second interval D2 different from the first interval D1 from the adjacent pixel 12B, that reflects light that has passed through the second photoelectric conversion unit 41.
- in this embodiment, the pixel 11 and the pixel 13, in which the reflecting portions 42 are disposed at mutually different positions, are provided. Therefore, the phase difference information of the subject image can be obtained by using the signals from the pixel 11 and the pixel 13.
- the imaging device 22 includes: a first photoelectric conversion unit 41 that photoelectrically converts incident light to generate charge; a first output unit 70 that outputs a signal based on the charge generated by the first photoelectric conversion unit 41; a second photoelectric conversion unit 41 that photoelectrically converts incident light to generate charge; a second output unit 70 that outputs a signal based on the charge generated by the second photoelectric conversion unit 41; a first reflection unit 42A that is provided between the first output unit 70 and the second output unit 70 and reflects light that has passed through the first photoelectric conversion unit 41; and a second reflection unit 42B that is provided between the first output unit 70 and the second output unit 70 and reflects light that has passed through the second photoelectric conversion unit 41. With this configuration, the phase difference information of the subject image can be obtained by using the signals from the pixel 11 and the pixel 13.
- when a pixel that performs only imaging, adjacent to the first or second pixel in the row direction (X direction) and having a third photoelectric conversion unit 41 that photoelectrically converts incident light to generate charge and a third output unit 70 that outputs a signal based on the charge generated by the third photoelectric conversion unit 41, is taken as a third pixel, the first reflecting unit 42A or the second reflecting unit 42B may be provided between the third output unit 70 and the first output unit 70, or between the third output unit 70 and the second output unit 70.
- alternatively, the first output unit 70 may be provided between the third output unit 70 and the first reflection unit 42A, or the second output unit 70 may be provided between the third output unit 70 and the second reflection unit 42B.
- the first reflection unit 42A is provided at a third interval from the separation unit 57 between the first photoelectric conversion unit 41 and the photoelectric conversion unit 41 of the adjacent pixel, and the second reflection unit 42B is provided at a fourth interval from the separation unit 57 between the second photoelectric conversion unit 41 and the photoelectric conversion unit 41 of the adjacent pixel. With this configuration, phase difference information of the images formed by a pair of light beams incident through different pupil regions can be obtained.
- the focus control apparatus includes the image sensor 22 and a control unit (lens control unit 32) that adjusts the focal position of the optical system (imaging optical system 31) based on the signal output from the first output unit 70 and the signal output from the second output unit 70 of the image sensor 22, which has received light incident via the optical system. With this configuration, the phase difference information of the subject image can be obtained using the reflected light, and the focal position can be adjusted.
- the focus detection apparatus includes the image sensor 22 and a focus detection unit (body control unit 21) that performs focus detection for the optical system (imaging optical system 31) based on a signal from the first photoelectric conversion unit 41 and a signal from the second photoelectric conversion unit 41, which have received light incident through the optical system. With this configuration, the phase difference information of the subject image can be obtained using the reflected light, and focus detection for the optical system can be performed.
- as pixels are miniaturized, the size of the pixel aperture becomes smaller than the wavelength of light, and light may not enter the photoelectric conversion unit of the pixel. Since the amount of light received at the photoelectric conversion unit is reduced, the photoelectrically converted charge is also reduced. It therefore becomes difficult to detect the focus of the optical system from the phase difference information of the signals generated from that charge, and to adjust the focus of the optical system. In a pixel provided with a light-shielding film on the light incident surface for phase difference detection, the amount of received light is reduced even further, making it still more difficult to detect and adjust the focus of the optical system from the phase difference information.
- in the present embodiment, it is possible to detect the focus of the optical system from the phase difference information and to adjust the focus of the optical system without providing a light-shielding film on the light incident surface.
- the pixel 11 and the pixel 13, which function as focus detection pixels in the present embodiment, have the same sensitivity as a pixel that does not have a reflecting portion. For this reason, defect correction and the like for the signals from the pixel 11 and the pixel 13 become easy, and these pixels can be prevented from becoming defective pixels when used as imaging pixels for generating an image.
- FIG. 5 is a diagram illustrating an example of a cross-sectional structure of the image sensor according to the first modification.
- the reflective portion 42A and the reflective portion 42B are provided by being directly stacked on the semiconductor layer 111.
- the reflecting part 42A and the reflecting part 42B are an oxide film, a nitride film, or the like.
- a silicon oxide film, a silicon nitride film, a silicon oxynitride film, or a multilayer film of these films is used.
- the reflective portion may also be stacked on the semiconductor layer 111 with an insulating film interposed therebetween. In this modification, the reflective portion is provided at a position closer to the semiconductor layer 111 than when it is formed using the conductor film of the metal layer 115. This suppresses reflected light from the reflective portion entering adjacent pixels, and as a result, noise can be suppressed from mixing into the signals of adjacent pixels.
- FIG. 6 is a diagram illustrating an example of a cross-sectional structure of an image sensor according to the second modification.
- in the image sensor 22 according to the second modification, as shown in FIGS. 6(a) and 6(b), light shielding portions 55 (light shielding portion 55A to light shielding portion 55D) and 56 (light shielding portion 56A to light shielding portion 56D) are further provided in addition to the reflective portions 42A and 42B.
- the light shielding portion 55 shown in FIG. 6A is made of a conductor film (metal film), polysilicon, or the like.
- the light shielding unit 55A is provided between the photoelectric conversion unit 41 and the reflection unit 42A of the imaging / focus detection pixel 11, and the light shielding unit 55D is provided between the photoelectric conversion unit 41 and the reflection unit 42B of the imaging / focus detection pixel 13.
- the light shielding portion 56 shown in FIG. 6B is configured by DTI (Deep Trench Isolation). That is, in the example shown in FIG. 6B, a groove is formed between the pixels, and an oxide film, a nitride film, polysilicon, or the like is embedded in the groove.
- the light shielding unit 56 is provided between the adjacent photoelectric conversion units 41.
- the light shielding unit 56B is provided between the photoelectric conversion unit 41 of the imaging and focus detection pixel 11 and the photoelectric conversion unit 41 of the imaging pixel 12, and the light shielding unit 56C is provided between the photoelectric conversion unit 41 of the imaging and focus detection pixel 13 and the photoelectric conversion unit 41 of the imaging pixel 12.
- since the light-shielding part 55 and the light-shielding part 56 are arranged, reflected light from the reflective part 42A and the reflective part 42B can be suppressed from entering adjacent pixels, and crosstalk between the pixels can be suppressed.
- the sensitivity of the photoelectric conversion unit 41 can be improved, and the accuracy of focus detection can be improved.
- the light shielding part 55 (light shielding part 55A to light shielding part 55D) and the light shielding part 56 (light shielding part 56A to light shielding part 56D) may also serve as reflection parts 55 (reflection part 55A to reflection part 55D) and reflection parts 56 (reflection part 56A to reflection part 56D).
- FIG. 7 is a diagram illustrating an arrangement example of pixels of the image sensor according to the third modification
- FIG. 8 is a diagram illustrating an example of a cross-sectional structure of the image sensor according to the third modification.
- in the embodiment described above, the imaging and focus detection pixels 11 and 13 are alternately arranged in the row direction with an R imaging pixel 12 interposed between them.
- however, as shown in FIGS. 7 and 8, the imaging and focus detection pixels 11 and 13 may be alternately arranged in the row direction (X-axis direction) with two R imaging pixels 12 and a G imaging pixel 12 interposed between them.
- FIGS. 8 and 9 are diagrams showing examples of cross-sectional structures of the image sensor according to the third modification.
- the imaging / focus detection pixel 11 and the imaging / focus detection pixel 13 are arranged so that the positions of the photoelectric conversion unit 41 and the output unit 70 are symmetric (left-right symmetric) with respect to the optical axis.
- the imaging / focus detection pixel 11 and the imaging / focus detection pixel 13 may have the same configuration except that the positions of the reflection portions 42A and 42B are different.
- the signal based on the charge obtained by photoelectrically converting the light reflected by the reflector may be calculated by subtracting the average value of the signals of two imaging pixels from the signal of the imaging / focus detection pixel.
- for example, the signals from the G imaging pixels 12A, 12B, and 12C arranged as shown in FIG. 7 and the signals from the imaging and focus detection pixels 11 and 13 are used.
- the focus detection unit 21a subtracts the average value of the signals of the G imaging pixels 12A and 12B from the signal of the imaging / focus detection pixel 11, thereby calculating the signal S2 based on the charge obtained by photoelectrically converting the second light flux reflected by the reflection unit 42A. Similarly, the focus detection unit 21a subtracts the average value of the signals of the G imaging pixels 12B and 12C from the signal of the imaging / focus detection pixel 13, thereby calculating the signal S3 based on the charge obtained by photoelectrically converting the first light flux reflected by the reflection unit 42B. The focus detection unit 21a can calculate the defocus amount by performing a correlation calculation using the signal S2 and the signal S3.
- the signal based on the charge obtained by photoelectrically converting the light reflected by the reflection unit may also be calculated by subtracting, from the signal of the imaging / focus detection pixel, the average value of the signals of a plurality of imaging pixels arranged around that imaging / focus detection pixel.
- for example, the focus detection unit 21a subtracts the average value of the signals of the imaging pixels 12D, 12E, 12F, and 12G from the signal of the imaging / focus detection pixel 11, thereby calculating the signal S2 based on the charge obtained by photoelectrically converting the second light flux reflected by the reflection unit 42A.
- similarly, the focus detection unit 21a subtracts the average value of the signals of the imaging pixels 12H, 12I, 12J, and 12K from the signal of the imaging / focus detection pixel 13, thereby calculating the signal S3 based on the charge obtained by photoelectrically converting the first light flux reflected by the reflection unit 42B.
- the focus detection unit 21a can calculate the defocus amount by performing correlation calculation using the signal S2 and the signal S3.
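A minimal sketch of this averaging variant is shown below; the helper name and the signal values are made-up illustrative assumptions, not data from the disclosure.

```python
# Sketch of the third modification's estimate: the reflected-light component is
# the AF-pixel signal minus the average of the surrounding imaging pixels.

def reflected_component(af_signal, neighbor_signals):
    """S2 (or S3) = AF-pixel signal minus the mean of surrounding imaging pixels."""
    return af_signal - sum(neighbor_signals) / len(neighbor_signals)

# Imaging / focus detection pixel 11 outputs S1 + S2; pixels 12D-12G output ~S1.
s2 = reflected_component(72.0, [40.0, 41.0, 39.0, 40.0])  # -> 32.0
# Imaging / focus detection pixel 13 outputs S1 + S3; pixels 12H-12K output ~S1.
s3 = reflected_component(65.0, [40.0, 40.0, 41.0, 39.0])  # -> 25.0
```

Averaging several surrounding imaging pixels gives a steadier estimate of S1 than a single neighbor when pixel outputs vary, which is the point of this variant.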
- FIG. 10 is a diagram illustrating an example of a cross-sectional structure of an image sensor according to the fourth modification.
- the imaging / focus detection pixel 11 and the imaging / focus detection pixel 13 are arranged so that the positions of the photoelectric conversion unit 41 and the output unit 70 are symmetric (left-right symmetric) with respect to the optical axis.
- the imaging / focus detection pixel 11 and the imaging / focus detection pixel 13 may have the same configuration except that the positions of the reflection portions 42A and 42B are different.
- the reflection portions 42A and 42B may be provided in a plurality of layers. In the example shown in FIG. 10(b), the reflection portion 42B is formed in two layers.
- further, as shown in FIG. 11, the imaging / focus detection pixel 11 and the imaging / focus detection pixel 13 may have the same configuration except for the positions of the reflection part 42A and the reflection part 42B, and, as in the first modification, the reflecting portion 42A and the reflecting portion 42B may be formed using an oxide film, a nitride film, or the like. In this case, the reflecting portions 42A and 42B may be formed using a plurality of insulating films.
- in the imaging / focus detection pixel 13 shown in FIG. 11, the reflecting portion 42B is formed using a plurality of insulating films, such as the gate insulating film used for the output unit 70 and the insulating film between wirings.
- the light that has passed through the photoelectric conversion unit 41 is reflected by the reflection unit 42A and the reflection unit 42B, and focus detection is performed using the signal component of the reflected light.
- in the example described above, a G color filter is provided in the imaging / focus detection pixel 11 and the imaging / focus detection pixel 13; however, an R color filter, for example, may be provided instead.
- this makes it possible to use, for focus detection, a signal of light in a long wavelength region that easily passes through the photoelectric conversion unit 41, for example infrared light or near-infrared light.
- since using light in a long wavelength region that easily passes through the photoelectric conversion unit 41 is more suitable, the present invention can also be applied to industrial cameras and medical cameras in which infrared or near-infrared images are used.
- the image sensor 22 has been described as an example of a backside illumination type configuration.
- the imaging element 22 may have a surface irradiation type configuration in which the wiring layer 112 is provided on the incident surface side on which light is incident.
- the image sensor may be applied to a stacked sensor (stacked image sensor) configured by stacking a plurality of substrates (for example, a plurality of semiconductor substrates). For example, the image sensor may have a configuration in which a first substrate on which pixels having photoelectric conversion units are arranged, a second substrate on which an AD conversion unit that converts the signals from the pixels into digital signals is arranged, and a third substrate on which an image processing unit that performs various kinds of image processing on the digital signals output from the AD conversion unit is arranged, are stacked.
- the first substrate, the second substrate, and the third substrate are provided from the light incident side.
- the image sensor 22 described in the above-described embodiment may be applied to a camera, a smartphone, a tablet, a camera built into a PC, a vehicle-mounted camera, or the like.
Description
According to a second aspect of the present invention, an imaging apparatus includes the above image sensor, which captures an image formed by an optical system having a focus lens, and a control unit that controls the position of the focus lens, based on a signal output from the first pixel and a signal output from the second pixel of the image sensor, so that the image formed by the optical system is focused on the image sensor.
FIG. 1 is a diagram of the main configuration of a digital camera 1 (hereinafter referred to as camera 1), which is the imaging apparatus according to the first embodiment. The camera 1 includes a camera body 2 and an interchangeable lens 3. The interchangeable lens 3 is attached to the camera body 2 via a mount portion (not shown). When the interchangeable lens 3 is attached to the camera body 2, a connection portion 202 on the camera body 2 side and a connection portion 302 on the interchangeable lens 3 side are connected, enabling communication between the camera body 2 and the interchangeable lens 3.
When the imaging / focus detection pixels 11 and 13 and the imaging pixels 12 are arranged as shown in FIG. 7 or FIG. 8 described later, the G imaging pixel 12 used is the G imaging pixel 12 arranged between the imaging / focus detection pixel 11 and the imaging / focus detection pixel 13. Alternatively, a G imaging pixel 12 arranged around the imaging / focus detection pixel 11 or 13 may be used.
…can also be combined with the above-described embodiment.
In the embodiment described above, an example was explained in which the reflective portion 42A and the reflective portion 42B are formed using the conductor film of the metal layer 115. However, the reflective portion may be provided at a position different from the metal layer 115, or may be formed using a material other than a conductor (metal). For example, the reflective portion may be formed using polysilicon.
In the embodiment described above, an example was described in which the focusing lens is controlled based on the calculated defocus amount, but the present invention is not limited to this. The operation of a zoom lens, an aperture, or the like may also be controlled based on the defocus amount.
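The defocus-amount computation behind this kind of control can be sketched as follows. This is an illustrative Python sketch, not the publication's implementation: the function name, the simple sum-of-absolute-differences search, and the shift-to-defocus factor `k` are all assumptions. It isolates the reflected-light components as differences between the two focus detection pixel signals and an ordinary imaging pixel signal of the same color (in the manner of claim 22), then searches for the relative image shift between the two difference signals.

```python
import numpy as np

def estimate_defocus(s1, s2, s3, k=1.0, max_shift=8):
    """Estimate a defocus amount from one row of pixel signals.

    s1, s2 -- signals from the two focus detection pixel groups (each contains
              an extra component reflected back by its reflection portion)
    s3     -- signal from ordinary imaging pixels of the same color
    k      -- factor converting image shift to defocus (assumed; in practice
              it depends on the optical system's exit-pupil geometry)
    """
    # Reflected-light components, taken as differences from the imaging signal.
    a = np.asarray(s1, dtype=float) - np.asarray(s3, dtype=float)
    b = np.asarray(s2, dtype=float) - np.asarray(s3, dtype=float)
    # Search for the relative shift minimizing the mean absolute difference.
    n = len(a)
    best_shift, best_err = 0, np.inf
    for d in range(-max_shift, max_shift + 1):
        lo, hi = max(0, d), min(n, n + d)
        err = np.mean(np.abs(a[lo:hi] - b[lo - d:hi - d]))
        if err < best_err:
            best_shift, best_err = d, err
    return k * best_shift  # defocus amount used to drive the focusing lens
```

The sign of the returned value indicates the direction in which the focusing lens would be driven; the correlation search here is deliberately minimal, and a real implementation would interpolate for sub-pixel shifts.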
In the embodiment described above, the image sensor 22 has a back-side illuminated structure. However, the image sensor 22 may instead have a front-side illuminated structure in which the wiring layer 112 is provided on the incident-surface side on which light is incident.
In the embodiment described above, an example was described in which G color filters are provided on the imaging and focus detection pixels 11 and 13, but R or B color filters, for example, may be provided instead. A filter that transmits the entire wavelength range of the light incident on the imaging and focus detection pixels 11 and 13 (a white filter) may also be provided. Providing a white filter increases the amount of light incident on the focus detection pixels and improves the sensitivity of the photoelectric conversion portion 41. A pixel provided with a white filter may also be used as a pixel dedicated to focus detection.
The image sensors described in the embodiment and modifications above may be applied to a stacked sensor (a stacked image sensor) formed by stacking a plurality of substrates (for example, a plurality of semiconductor substrates). For example, the image sensor may be formed by stacking a first substrate on which pixels having photoelectric conversion portions are arranged, a second substrate on which an AD conversion unit that converts the pixel signals into digital signals is arranged, and a third substrate on which an image processing unit that performs various kinds of image processing on the digital signals output from the AD conversion unit is arranged. In this case, for example, the first, second, and third substrates are arranged in that order from the side on which light is incident.
The image sensor 22 described in the above embodiment may be applied to cameras, smartphones, tablets, cameras built into PCs, in-vehicle cameras, and the like.
Japanese Patent Application No. 2016-070959 (filed March 31, 2016)
Claims (23)
- 1. An image sensor in which a plurality of pixels are arrayed in a first direction, comprising: a first pixel having a first photoelectric conversion portion that photoelectrically converts incident light to generate charge, and a first reflection portion that is provided at least partly in a region on the first-direction side of the center of the first photoelectric conversion portion in a plane intersecting the direction in which the light is incident, and that reflects part of the light having passed through the first photoelectric conversion portion back toward the first photoelectric conversion portion; and a second pixel having a second photoelectric conversion portion that photoelectrically converts incident light to generate charge, and a second reflection portion that is provided at least partly in a region on the side opposite to the first direction from the center of the second photoelectric conversion portion in a plane intersecting the direction in which the light is incident, and that reflects part of the light having passed through the second photoelectric conversion portion back toward the second photoelectric conversion portion.
- 2. The image sensor according to claim 1, wherein the first pixel has a first output portion that outputs a signal based on the charge generated by the first photoelectric conversion portion, the second pixel has a second output portion that outputs a signal based on the charge generated by the second photoelectric conversion portion, and either the first reflection portion and the second reflection portion are provided between the first output portion and the second output portion, or the first output portion and the second output portion are provided between the first reflection portion and the second reflection portion.
- 3. The image sensor according to claim 2, wherein the first output portion has a first wiring, the second output portion has a second wiring, and either the first reflection portion and the second reflection portion are provided between the first wiring and the second wiring, or the first wiring and the second wiring are provided between the first reflection portion and the second reflection portion.
- 4. The image sensor according to any one of claims 1 to 3, wherein the first reflection portion is provided at a first distance in the first direction from a separation portion between the first pixel and a pixel adjacent to it, and the second reflection portion is provided at a second distance, different from the first distance, in the first direction from a separation portion between the second pixel and a pixel adjacent to it.
- 5. The image sensor according to any one of claims 1 to 4, further comprising a third pixel having a third photoelectric conversion portion that photoelectrically converts incident light to generate charge, wherein the first pixel, the second pixel, and the third pixel each have a first filter having a first spectral characteristic.
- 6. The image sensor according to claim 5, further comprising a fourth pixel having a fourth photoelectric conversion portion that photoelectrically converts incident light to generate charge, wherein the fourth pixel has a second filter having a second spectral characteristic with a higher transmittance for shorter-wavelength light than the first spectral characteristic.
- 7. The image sensor according to claim 6, wherein the first filter transmits light of a first wavelength, and the second filter transmits light of a wavelength shorter than the first wavelength.
- 8. The image sensor according to any one of claims 1 to 4, further comprising a third pixel having a third photoelectric conversion portion that photoelectrically converts incident light to generate charge, wherein the first pixel and the second pixel each have a first filter having a first spectral characteristic, and the third pixel has a second filter having a second spectral characteristic with a higher transmittance for shorter-wavelength light than the first spectral characteristic.
- 9. The image sensor according to claim 8, wherein the first filter transmits light of a first wavelength, and the second filter transmits light of a wavelength shorter than the first wavelength.
- 10. The image sensor according to any one of claims 6 to 9, wherein the first filter transmits red light, and the second filter transmits green light.
- 11. The image sensor according to any one of claims 6 to 9, wherein the first filter transmits green light, and the second filter transmits blue light.
- 12. The image sensor according to any one of claims 1 to 4, further comprising: a third pixel having a third photoelectric conversion portion that photoelectrically converts incident light to generate charge; and a fourth pixel having a fourth photoelectric conversion portion that photoelectrically converts incident light to generate charge, wherein the first pixel and the second pixel each have a first filter having a first spectral characteristic, the third pixel has a second filter having a second spectral characteristic with a higher transmittance for shorter-wavelength light than the first spectral characteristic, and the fourth pixel has a third filter having a third spectral characteristic with a higher transmittance for shorter-wavelength light than the second spectral characteristic.
- 13. The image sensor according to claim 12, wherein the first filter transmits light of a first wavelength, the second filter transmits light of a second wavelength shorter than the first wavelength, and the third filter transmits light of a wavelength shorter than the second wavelength.
- 14. The image sensor according to claim 13, wherein the first filter transmits red light, the second filter transmits green light, and the third filter transmits blue light.
- 15. The image sensor according to any one of claims 1 to 14, wherein the first pixel is provided adjacent to the second pixel, and a light-shielding portion or a reflection portion is provided between the first photoelectric conversion portion and the second photoelectric conversion portion.
- 16. The image sensor according to any one of claims 5 to 14, wherein the third pixel is provided between the first pixel and the second pixel, and a light-shielding portion or a reflection portion is provided in at least one of a position between the first photoelectric conversion portion and the third photoelectric conversion portion and a position between the second photoelectric conversion portion and the third photoelectric conversion portion.
- 17. The image sensor according to claim 16, wherein a plurality of the third pixels are provided between the first pixel and the second pixel, and a light-shielding portion or a reflection portion is provided in at least one of a position between the first photoelectric conversion portion and the third photoelectric conversion portion of at least one of the plurality of third pixels and a position between the second photoelectric conversion portion and the third photoelectric conversion portion of at least one of the plurality of third pixels.
- 18. The image sensor according to any one of claims 1 to 17, wherein the first pixel has a light-shielding portion or a reflection portion between the first photoelectric conversion portion and the first reflection portion, and the second pixel has a light-shielding portion or a reflection portion between the second photoelectric conversion portion and the second reflection portion.
- 19. The image sensor according to any one of claims 1 to 18, wherein the first reflection portion and the second reflection portion are metal films or insulating films.
- 20. An image-capturing device comprising: the image sensor according to any one of claims 1 to 19; and a control unit that controls the position of a focusing lens of an optical system having the focusing lens, based on a signal output from the first pixel and a signal output from the second pixel of the image sensor capturing an image formed by the optical system, so that the image formed by the optical system is focused on the image sensor.
- 21. An image-capturing device comprising: the image sensor according to claim 5; and a control unit that controls the position of a focusing lens of an optical system having the focusing lens, based on a signal output from the first pixel, a signal output from the second pixel, and a signal output from the third pixel of the image sensor capturing an image formed by the optical system, so that the image formed by the optical system is focused on the image sensor.
- 22. The image-capturing device according to claim 21, wherein the control unit controls the position of the focusing lens based on a difference between the signal output from the first pixel and the signal output from the third pixel, and a difference between the signal output from the second pixel and the signal output from the third pixel.
- 23. The image-capturing device according to any one of claims 20 to 22, further comprising an image generation unit that generates image data based on the signal output from the first pixel and the signal output from the second pixel of the image sensor.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US16/085,168 US10686004B2 (en) | 2016-03-31 | 2017-02-28 | Image sensor and image-capturing device |
EP17774024.8A EP3439038A4 (en) | 2016-03-31 | 2017-02-28 | IMAGE CAPTURE ELEMENT AND IMAGE CAPTURE DEVICE |
CN201780029752.2A CN109155323A (zh) | 2016-03-31 | 2017-02-28 | 拍摄元件以及拍摄装置 |
JP2018508832A JP6791243B2 (ja) | 2016-03-31 | 2017-02-28 | 撮像素子、及び、撮像装置 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2016070959 | 2016-03-31 | ||
JP2016-070959 | 2016-03-31 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2017169479A1 true WO2017169479A1 (ja) | 2017-10-05 |
Family
ID=59964020
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2017/007936 WO2017169479A1 (ja) | 2016-03-31 | 2017-02-28 | 撮像素子、及び、撮像装置 |
Country Status (5)
Country | Link |
---|---|
US (1) | US10686004B2 (ja) |
EP (1) | EP3439038A4 (ja) |
JP (1) | JP6791243B2 (ja) |
CN (1) | CN109155323A (ja) |
WO (1) | WO2017169479A1 (ja) |
Families Citing this family (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2018056518A (ja) * | 2016-09-30 | 2018-04-05 | Nikon Corporation | Image sensor and focus adjustment device |
CN117577654A (zh) | 2017-05-29 | 2024-02-20 | Sony Semiconductor Solutions Corporation | Solid-state imaging device and electronic apparatus |
JP7316764B2 (ja) | 2017-05-29 | 2023-07-28 | Sony Semiconductor Solutions Corporation | Solid-state imaging device and electronic apparatus |
EP3410486B1 (en) * | 2017-06-02 | 2022-08-24 | ams AG | Resonant cavity enhanced image sensor |
WO2019065291A1 (ja) * | 2017-09-28 | 2019-04-04 | Sony Semiconductor Solutions Corporation | Image sensor and imaging device |
CN108965704B (zh) * | 2018-07-19 | 2020-01-31 | Vivo Mobile Communication Co., Ltd. | Image sensor, mobile terminal, and image capturing method |
EP3671837B1 (en) * | 2018-12-21 | 2023-11-29 | ams Sensors Belgium BVBA | Pixel of a semiconductor image sensor and method of manufacturing a pixel |
TWI734294B (zh) * | 2019-12-11 | 2021-07-21 | 香港商京鷹科技股份有限公司 | Image sensor |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20070045513A1 (en) * | 2005-08-30 | 2007-03-01 | Lee Ji S | Image sensors with optical trench |
JP2013055159A (ja) * | 2011-09-01 | 2013-03-21 | Canon Inc | Solid-state imaging device |
WO2013147199A1 (ja) * | 2012-03-30 | 2013-10-03 | Nikon Corporation | Image sensor, image capturing method, and imaging device |
WO2014156933A1 (ja) * | 2013-03-29 | 2014-10-02 | Sony Corporation | Image sensor and imaging device |
WO2016063727A1 (ja) * | 2014-10-20 | 2016-04-28 | Sony Corporation | Solid-state image sensor and electronic apparatus |
JP2016127043A (ja) * | 2014-12-26 | 2016-07-11 | Sony Corporation | Solid-state image sensor and electronic apparatus |
Family Cites Families (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP5092685B2 (ja) * | 2007-10-23 | 2012-12-05 | Nikon Corporation | Image sensor and imaging device |
JP4798270B2 (ja) * | 2009-02-13 | 2011-10-19 | Nikon Corporation | Image sensor, imaging device, and method of manufacturing an image sensor |
JP5131309B2 (ja) | 2010-04-16 | 2013-01-30 | Sony Corporation | Solid-state imaging device, method of manufacturing the same, and imaging device |
JP5856376B2 (ja) * | 2011-01-27 | 2016-02-09 | Canon Inc. | Imaging device and control method therefor |
JP2013098503A (ja) * | 2011-11-07 | 2013-05-20 | Toshiba Corp | Solid-state image sensor |
US9093345B2 (en) * | 2012-10-26 | 2015-07-28 | Canon Kabushiki Kaisha | Solid-state imaging apparatus and imaging system |
JP6161258B2 (ja) | 2012-11-12 | 2017-07-12 | Canon Inc. | Solid-state imaging device, method of manufacturing the same, and camera |
JP2014213767A (ja) | 2013-04-26 | 2014-11-17 | Nihon Plast Co., Ltd. | Assembly structure for a movable member |
JP6458343B2 (ja) * | 2014-02-27 | 2019-01-30 | Nikon Corporation | Imaging device |
-
2017
- 2017-02-28 JP JP2018508832A patent/JP6791243B2/ja active Active
- 2017-02-28 EP EP17774024.8A patent/EP3439038A4/en not_active Withdrawn
- 2017-02-28 CN CN201780029752.2A patent/CN109155323A/zh active Pending
- 2017-02-28 US US16/085,168 patent/US10686004B2/en active Active
- 2017-02-28 WO PCT/JP2017/007936 patent/WO2017169479A1/ja active Application Filing
Non-Patent Citations (1)
Title |
---|
See also references of EP3439038A4 * |
Cited By (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2019134145A (ja) * | 2018-02-02 | 2019-08-08 | Nikon Corporation | Image sensor and image-capturing device |
JP7383876B2 (ja) | 2018-02-02 | 2023-11-21 | Nikon Corporation | Image sensor and image-capturing device |
US11538942B2 (en) | 2018-07-18 | 2022-12-27 | Sony Semiconductor Solutions Corporation | Light receiving element and ranging module having light receiving regions and an isolation portion between adjacent light receiving regions |
US11538845B2 (en) | 2018-07-18 | 2022-12-27 | Sony Semiconductor Solutions Corporation | Light receiving element, ranging module, and electronic apparatus |
KR20210033935A (ko) * | 2018-07-18 | 2021-03-29 | Sony Semiconductor Solutions Corporation | Light receiving element and ranging module |
JPWO2020017341A1 (ja) * | 2018-07-18 | 2021-07-15 | Sony Semiconductor Solutions Corporation | Light receiving element and ranging module |
EP3644366A4 (en) * | 2018-07-18 | 2021-08-04 | Sony Semiconductor Solutions Corporation | LIGHT RECEIVING ELEMENT AND DISTANCE MEASURING MODULE |
KR102675996B1 (ko) * | 2018-07-18 | 2024-06-18 | Sony Semiconductor Solutions Corporation | Light receiving element and ranging module |
JP2020013909A (ja) * | 2018-07-18 | 2020-01-23 | Sony Semiconductor Solutions Corporation | Light receiving element, ranging module, and electronic apparatus |
JP7451395B2 (ja) | 2018-07-18 | 2024-03-18 | Sony Semiconductor Solutions Corporation | Light receiving element and ranging module |
US11652175B2 (en) | 2018-07-18 | 2023-05-16 | Sony Semiconductor Solutions Corporation | Light reception device and distance measurement module |
US11764246B2 (en) | 2018-07-18 | 2023-09-19 | Sony Semiconductor Solutions Corporation | Light receiving element, ranging module, and electronic apparatus |
JP7362198B2 (ja) | 2018-07-18 | 2023-10-17 | Sony Semiconductor Solutions Corporation | Light receiving element, ranging module, and electronic apparatus |
WO2020017341A1 (ja) * | 2018-07-18 | 2020-01-23 | Sony Semiconductor Solutions Corporation | Light receiving element and ranging module |
US11916154B2 (en) | 2018-07-18 | 2024-02-27 | Sony Semiconductor Solutions Corporation | Light receiving element and ranging module having a plurality of pixels that each includes voltage application units and charge detection units |
US11863892B2 (en) | 2019-03-25 | 2024-01-02 | Sony Semiconductor Solutions Corporation | Imaging unit and electronic apparatus with first and second light blocking films |
WO2020195825A1 (ja) * | 2019-03-25 | 2020-10-01 | Sony Semiconductor Solutions Corporation | Imaging device and electronic apparatus |
WO2021166672A1 (ja) * | 2020-02-20 | 2021-08-26 | Sony Semiconductor Solutions Corporation | Imaging device and electronic apparatus |
Also Published As
Publication number | Publication date |
---|---|
EP3439038A4 (en) | 2020-03-04 |
JPWO2017169479A1 (ja) | 2019-01-31 |
US10686004B2 (en) | 2020-06-16 |
EP3439038A1 (en) | 2019-02-06 |
US20190081094A1 (en) | 2019-03-14 |
CN109155323A (zh) | 2019-01-04 |
JP6791243B2 (ja) | 2020-11-25 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6791243B2 (ja) | Image sensor and image-capturing device | |
KR102523203B1 (ko) | Solid-state image sensor, method of manufacturing the same, and electronic device | |
JP6458343B2 (ja) | Imaging device | |
JP2015128131A (ja) | Solid-state image sensor and electronic apparatus | |
JP4983271B2 (ja) | Imaging device | |
JP2013157442A (ja) | Image sensor and focus detection device | |
JP5067148B2 (ja) | Image sensor, focus detection device, and imaging device | |
WO2018181590A1 (ja) | Image sensor and imaging device | |
JP6693567B2 (ja) | Image sensor, focus detection device, and electronic camera | |
WO2012066846A1 (ja) | Solid-state image sensor and imaging device | |
WO2018061978A1 (ja) | Image sensor and focus adjustment device | |
JP2017183992A (ja) | Image sensor, focus detection device, and imaging device | |
JP2021100253A (ja) | Image sensor and imaging device | |
WO2018181585A1 (ja) | Image sensor and imaging device | |
JP6693568B2 (ja) | Image sensor, focus detection device, and imaging device | |
WO2018061729A1 (ja) | Image sensor and focus adjustment device | |
WO2018061941A1 (ja) | Image sensor and imaging device | |
WO2018181591A1 (ja) | Image sensor and imaging device | |
WO2018061728A1 (ja) | Image sensor and focus adjustment device | |
WO2018061940A1 (ja) | Image sensor and focus adjustment device | |
JP7419975B2 (ja) | Image sensor and image-capturing device | |
JP7383876B2 (ja) | Image sensor and image-capturing device | |
JP2020109779A (ja) | Image sensor and image-capturing device | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 2018508832 Country of ref document: JP |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2017774024 Country of ref document: EP |
|
ENP | Entry into the national phase |
Ref document number: 2017774024 Country of ref document: EP Effective date: 20181031 |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 17774024 Country of ref document: EP Kind code of ref document: A1 |