US20220311944A1 - Auto-focus image sensor and digital image processing device including the same
- Publication number: US20220311944A1
- Application number: US 17/840,750
- Authority: US (United States)
- Prior art keywords: pixels, pixel, image sensor, auto-focus
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
- H—ELECTRICITY
- H01—ELECTRIC ELEMENTS
- H01L—SEMICONDUCTOR DEVICES NOT COVERED BY CLASS H10
- H01L27/00—Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate
- H01L27/14—Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate including semiconductor components sensitive to infrared radiation, light, electromagnetic radiation of shorter wavelength or corpuscular radiation and specially adapted either for the conversion of the energy of such radiation into electrical energy or for the control of electrical energy by such radiation
- H01L27/144—Devices controlled by radiation
- H01L27/146—Imager structures
- H01L27/14601—Structural or functional details thereof
- H01L27/14603—Special geometry or disposition of pixel-elements, address-lines or gate-electrodes
- H01L27/14607—Geometry of the photosensitive area
- H01L27/14609—Pixel-elements with integrated switching, control, storage or amplification elements
- H01L27/14612—Pixel-elements with integrated switching, control, storage or amplification elements involving a transistor
- H01L27/14614—Pixel-elements with integrated switching, control, storage or amplification elements involving a transistor having a special gate structure
- H01L27/1462—Coatings
- H01L27/14621—Colour filter arrangements
- H01L27/14623—Optical shielding
- H01L27/14625—Optical elements or arrangements associated with the device
- H01L27/14627—Microlenses
- H01L27/1463—Pixel isolation structures
- H01L27/1464—Back illuminated imager structures
- H01L27/14683—Processes or apparatus peculiar to the manufacture or treatment of these devices or parts thereof
- H01L27/14685—Process for coatings or optical elements
- H01L27/14689—MOS based technologies
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/67—Focus control based on electronic image sensor signals
- H04N23/672—Focus control based on electronic image sensor signals based on the phase difference signals
- H04N23/673—Focus control based on electronic image sensor signals based on contrast or high frequency components of image signals, e.g. hill climbing method
- H04N25/00—Circuitry of solid-state image sensors [SSIS]; Control thereof
- H04N25/10—Circuitry of solid-state image sensors [SSIS]; Control thereof for transforming different wavelengths into image signals
- H04N25/11—Arrangement of colour filter arrays [CFA]; Filter mosaics
- H04N25/13—Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements
- H04N25/133—Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements including elements passing panchromatic light, e.g. filters passing white light
- H04N25/135—Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements based on four or more different wavelength filter elements
- H04N25/70—SSIS architectures; Circuits associated therewith
- H04N25/703—SSIS architectures incorporating pixels for producing signals other than image signals
- H04N25/704—Pixels specially adapted for focusing, e.g. phase difference pixel sets
- H04N5/232122
- H04N5/232123
- H04N5/36961
- H04N9/04555
- H04N9/04559
Definitions
- the inventive concepts relate to an auto-focus image sensor and a digital image processing device including the same.
- a conventional digital image processing device includes an additional focus detecting device separate and/or different from an image sensor.
- the focus detecting device and/or an additional optical lens may increase costs, and the focus detecting device may increase the overall size of the digital image processing device.
- an auto-focus image sensor using a method of detecting a phase difference has been developed.
- Example embodiments of the inventive concepts may provide an auto-focus image sensor capable of realizing a clearer image.
- Example embodiments of the inventive concepts may also provide a digital image processing device capable of realizing a clearer image.
- an auto-focus image sensor may include: a substrate comprising at least one first pixel used for detecting a phase difference and at least one second pixel used for detecting an image; a deep device isolation portion disposed in the substrate to isolate the first pixel from the second pixel; and a light shielding pattern disposed on the substrate of at least the first pixel.
- due to the light shielding pattern, the amount of light incident on the first pixel may be smaller than the amount of light incident on the second pixel.
- the substrate may include: a first surface on which a gate electrode is disposed; and a second surface opposite to the first surface.
- the deep device isolation portion may be adjacent to at least the second surface.
- the auto-focus image sensor may further include: an interconnection disposed on the first surface in the second pixel. Charge generated from the second pixel may be transmitted through the interconnection, and the light shielding pattern and the interconnection may be disposed at the same height from the first surface.
- the light shielding pattern may have a width greater than that of the interconnection.
- light may be incident through the second surface, and the light shielding pattern may be disposed on the second surface.
- the deep device isolation portion may have a mesh structure, and the light shielding pattern may have a mesh structure that overlaps with the deep device isolation portion when viewed from a plan view.
- an area of the light shielding pattern in the first pixel may be greater than that of the light shielding pattern in the second pixel.
- a ground voltage or a reference voltage may be applied to the light shielding pattern.
- the deep device isolation portion may penetrate the substrate so as to be exposed at the first and second surfaces.
- the deep device isolation portion may include: a filling insulation layer; and a poly-silicon pattern disposed within the filling insulation layer.
- the deep device isolation portion may include: a filling insulation layer; and a fixed charge layer disposed between the filling insulation layer and the substrate.
- the fixed charge layer and the filling insulation layer may extend onto the second surface, and the fixed charge layer may be in contact with the second surface.
- the fixed charge layer may be formed of a metal oxide or metal fluoride that includes at least one selected from a group consisting of hafnium (Hf), zirconium (Zr), aluminum (Al), tantalum (Ta), titanium (Ti), yttrium (Y), and a lanthanoid.
- the deep device isolation portion may further include: a gap-fill assistant layer spaced apart from the fixed charge layer with the filling insulation layer therebetween.
- the deep device isolation portion may include: a fixed charge layer in contact with a sidewall of the substrate; and an air gap region exposing the fixed charge layer.
- the deep device isolation portion may include: a poly-silicon pattern disposed in a first trench extending from the first surface toward the second surface; a first filling insulation layer being in contact with both sidewalls of the poly-silicon pattern in the first trench; a fixed charge layer disposed in a second trench extending from the second surface toward the first surface, the second trench overlapping with the first trench, the fixed charge layer being in contact with both the first filling insulation layer and the poly-silicon pattern, and the fixed charge layer covering an inner sidewall of the second trench; and a second filling insulation layer filling the second trench.
- the deep device isolation portion may include: a first deep device isolation portion adjacent to the first surface; and a second deep device isolation portion adjacent to the second surface.
- the second deep device isolation portion may be in contact with the first deep device isolation portion.
- the auto-focus image sensor may further include: a fixed charge layer disposed on the second surface.
- the auto-focus image sensor may further include: a shallow device isolation portion disposed in the substrate to define an active region, the shallow device isolation portion adjacent to the first surface; and a color filter and a micro-lens disposed on the first surface or the second surface.
- a color filter disposed on the first pixel may not include a pigment.
- the auto-focus image sensor may further include: a first ground region disposed in the substrate of the first pixel, the first ground region adjacent to the first surface in the first pixel, and a ground voltage applied to the substrate of the first pixel through the first ground region; and a second ground region disposed in the substrate of the second pixel, the second ground region adjacent to the first surface in the second pixel, and the ground voltage applied to the substrate of the second pixel through the second ground region.
- a digital image processing device may include: the auto-focus image sensor; an optical system inputting light into the auto-focus image sensor; and a focus controller controlling a focus of the optical system using the phase difference detected from the first pixel.
- an auto-focus image sensor may include: a substrate comprising: a first auto-focus (AF) pixel and a second AF pixel that are used for detecting a phase difference and are adjacent to each other; and at least one image pixel used for detecting an image; a deep device isolation portion isolating the first AF pixel, the second AF pixel, and the image pixel from each other; and a light shielding pattern disposed on at least the first and second AF pixels, the light shielding pattern having a first opening and a second opening that partially expose the first AF pixel and the second AF pixel, respectively.
- the first opening and the second opening may be disposed to be symmetric.
- an auto-focus image sensor may include: a substrate including at least one first pixel configured to detect a phase difference and at least one second pixel configured to detect an image; an isolation portion configured to isolate the at least one first pixel from the at least one second pixel; and a light shield on the substrate and between the at least one first pixel and incident light.
- in the auto-focus image sensor, an amount of light incident on the at least one first pixel may be smaller than an amount of light incident on the at least one second pixel.
- the deep device isolation portion of the auto-focus image sensor may have a mesh structure and may surround the at least one first pixel and the at least one second pixel in at least two directions.
- the substrate of the auto-focus image sensor may include a first surface on which a gate electrode is disposed and a second surface opposite to the first surface and wherein the deep device isolation portion is adjacent to at least the second surface, the auto-focus image sensor further comprising: a shallow device isolation portion in the substrate to define an active region, the shallow device isolation portion adjacent to the first surface; and a color filter and a micro-lens on the first surface or the second surface.
- a digital image processing device may comprise an auto-focus image sensor, an optical system configured to input light into the auto-focus image sensor; and a focus controller configured to control a focus of the optical system using the phase difference detected from the at least one first pixel.
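The focus controller described above operates on the phase difference detected from the AF pixels. As a purely illustrative sketch (Python/NumPy; the helper name `estimate_phase_difference` and the sum-of-absolute-differences matching are assumptions, not the patent's circuitry), the signals read from the two AF pixel groups behind symmetric openings can be compared at candidate shifts, and the best-matching shift measures defocus:

```python
import numpy as np

def estimate_phase_difference(left, right, max_shift=8):
    # Slide one AF-pixel signal across the other and pick the shift
    # that minimizes the sum of absolute differences (SAD). A nonzero
    # best shift indicates defocus; its sign tells the focus controller
    # which way to drive the photographing lens.
    left = np.asarray(left, dtype=float)
    right = np.asarray(right, dtype=float)
    n = len(left)
    best_shift, best_sad = 0, float("inf")
    for s in range(-max_shift, max_shift + 1):
        if s >= 0:
            a, b = left[s:], right[:n - s]
        else:
            a, b = left[:n + s], right[-s:]
        sad = float(np.mean(np.abs(a - b)))
        if sad < best_sad:
            best_sad, best_shift = sad, s
    return best_shift

# A synthetic scene feature seen through the two symmetric openings:
# when the lens is out of focus the two signals are displaced
# (cf. FIG. 3A); when it is in focus they coincide (cf. FIG. 3B).
x = np.arange(64)
profile = np.exp(-((x - 32) ** 2) / 18.0)
shift_defocused = estimate_phase_difference(np.roll(profile, 3), np.roll(profile, -3))
shift_focused = estimate_phase_difference(profile, profile)
print(shift_defocused, shift_focused)  # prints: 6 0
```

In a real device the controller would map the recovered shift through a lens-specific calibration to a drive distance; the toy signals here only show that the estimator returns a nonzero shift when defocused and zero at focus.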
- FIG. 1 is a schematic block diagram illustrating a digital image processing device according to some example embodiments of the inventive concepts
- FIG. 2 is a diagram illustrating a principle of a phase-difference auto-focus (AF) using an auto-focus image sensor of FIG. 1 ;
- FIG. 3A is a graph illustrating phases of output values of auto-focus (AF) pixels when a photographing lens is out of focus;
- FIG. 3B is a graph illustrating phases of output values of auto-focus (AF) pixels when a photographing lens is in focus;
- FIG. 4 is a circuit diagram of an auto-focus image sensor according to some example embodiments of the inventive concepts.
- FIG. 5A is a layout illustrating a portion of a pixel region of an auto-focus image sensor according to some example embodiments of the inventive concepts
- FIGS. 5B and 5C are layouts illustrating a portion of a pixel region of an auto-focus image sensor according to other example embodiments of the inventive concepts
- FIG. 6 is an upper layout of an auto-focus image sensor according to some example embodiments of the inventive concepts.
- FIG. 7 is a lower layout of the auto-focus image sensor of FIG. 6 ;
- FIG. 8 is a cross-sectional view taken along lines A-A′ and B-B′ of FIG. 6 or 7 ;
- FIGS. 9A, 10A, 11A, 12A, and 13A are plan views illustrating a method of fabricating an auto-focus image sensor having the upper layout of FIG. 6 ;
- FIGS. 14A and 15A are plan views illustrating a method of fabricating an auto-focus image sensor having the lower layout of FIG. 7 ;
- FIGS. 9B, 10B, 11B, 12B, 13B, 14B, and 15B are cross-sectional views illustrating a method of fabricating an auto-focus image sensor having the cross-sectional view of FIG. 8 ;
- FIG. 16 is a cross-sectional view taken along a line A-A′ of FIG. 6 or 7 to illustrate an auto-focus image sensor according to other example embodiments of the inventive concepts;
- FIG. 17 is a cross-sectional view taken along a line A-A′ of FIG. 6 or 7 to illustrate an auto-focus image sensor according to still other example embodiments of the inventive concepts;
- FIG. 18A is a cross-sectional view taken along a line A-A′ of FIG. 6 or 7 to illustrate an auto-focus image sensor according to yet other example embodiments of the inventive concepts;
- FIG. 18B is a cross-sectional view illustrating a method of fabricating the auto-focus image sensor of FIG. 18A ;
- FIGS. 19A and 19B are cross-sectional views taken along a line A-A′ of FIG. 6 or 7 to illustrate an auto-focus image sensor according to yet still other example embodiments of the inventive concepts;
- FIG. 20 is a cross-sectional view taken along a line A-A′ of FIG. 6 or 7 to illustrate an auto-focus image sensor according to yet still other example embodiments of the inventive concepts;
- FIG. 21 is a cross-sectional view taken along a line A-A′ of FIG. 6 or 7 to illustrate an auto-focus image sensor according to yet still other example embodiments of the inventive concepts;
- FIGS. 22 to 24 are cross-sectional views illustrating a method of fabricating the auto-focus image sensor of FIG. 21 ;
- FIG. 25 is a lower layout of an auto-focus image sensor according to yet still other example embodiments of the inventive concepts.
- FIG. 26 is a cross-sectional view taken along lines A-A′ and B-B′ of FIG. 25 ;
- FIG. 27 is a lower layout of an auto-focus image sensor according to yet still other example embodiments of the inventive concepts.
- FIG. 28 is a cross-sectional view taken along lines A-A′ and B-B′ of FIG. 27 ;
- FIG. 29 is a layout of a first layer first signal line and a first layer third signal line in a first focus detecting region
- FIG. 30 is a layout of a second layer first signal line and a second layer second signal line in a second focus detecting region
- FIGS. 31 to 35 illustrate embodiments of a digital image processing device including an auto-focus image sensor according to example embodiments of the inventive concepts.
- FIG. 36 is a schematic block diagram illustrating an interface and an electronic system including an auto-focus image sensor according to example embodiments of the inventive concepts.
- inventive concepts will now be described more fully hereinafter with reference to the accompanying drawings, in which example embodiments of the inventive concepts are shown.
- inventive concepts are not limited to the following example embodiments, and may be implemented in various forms. Accordingly, the example embodiments are provided only to disclose the inventive concepts and let those skilled in the art know the category of the inventive concepts.
- embodiments of the inventive concepts are not limited to the specific examples provided herein and are exaggerated for clarity.
- example embodiments are described herein with reference to cross-sectional illustrations and/or plane illustrations that are idealized example illustrations. Accordingly, variations from the shapes of the illustrations as a result, for example, of manufacturing techniques and/or tolerances, are to be expected. Thus, example embodiments should not be construed as limited to the shapes of regions illustrated herein but are to include deviations in shapes that result, for example, from manufacturing. For example, an etching region illustrated as a rectangle will, typically, have rounded or curved features. Thus, the regions illustrated in the figures are schematic in nature and their shapes are not intended to illustrate the actual shape of a region of a device and are not intended to limit the scope of example embodiments.
- devices and methods of forming devices according to various embodiments described herein may be embodied in microelectronic devices such as integrated circuits, wherein a plurality of devices according to various embodiments described herein are integrated in the same microelectronic device. Accordingly, the cross-sectional view(s) illustrated herein may be replicated in two different directions, which need not be orthogonal, in the microelectronic device.
- a plan view of the microelectronic device that embodies devices according to various embodiments described herein may include a plurality of the devices in an array and/or in a two-dimensional pattern that is based on the functionality of the microelectronic device.
- microelectronic devices according to various embodiments described herein may be interspersed among other devices depending on the functionality of the microelectronic device. Moreover, microelectronic devices according to various embodiments described herein may be replicated in a third direction that may be orthogonal to the two different directions, to provide three-dimensional integrated circuits.
- the cross-sectional view(s) illustrated herein provide support for a plurality of devices according to various embodiments described herein that extend along two different directions in a plan view and/or in three different directions in a perspective view.
- the device/structure may include a plurality of active regions and transistor structures (or memory cell structures, gate structures, etc., as appropriate to the case) thereon, as would be illustrated by a plan view of the device/structure.
- FIG. 1 is a schematic block diagram illustrating a digital image processing device according to some example embodiments of the inventive concepts.
- a digital image processing device 100 may be separable from a lens.
- the inventive concepts are not limited thereto.
- An auto-focus image sensor 108 according to some example embodiments and the lens may constitute one integrated body.
- the digital image processing device 100 can perform a phase difference auto-focus (AF) process and a contrast AF process.
- the digital image processing device 100 includes a photographing lens 101 including a focus lens 102 .
- the digital image processing device 100 may have a focus detecting function and may drive the focus lens 102 .
- the photographing lens 101 further includes a lens driver 103 driving the focus lens 102 , a lens position detector 104 detecting a position of the focus lens 102 , and a lens controller 105 controlling the focus lens 102 .
- the lens controller 105 exchanges data relative to focus detection with a central processing unit (CPU) 106 of the digital image processing device 100 .
- the digital image processing device 100 includes the auto-focus image sensor 108 .
- the auto-focus image sensor 108 may photograph subject light inputted through the photographing lens 101 to generate an image signal.
- the auto-focus image sensor 108 may include a plurality of photoelectric converters (not shown) arranged in a matrix form and transmission lines (not shown) through which charge moves from the photoelectric converters to output the image signal.
- a sensor controller 107 generates a timing signal, so the auto-focus image sensor 108 is controlled to photograph an image. In addition, the sensor controller 107 sequentially outputs image signals when charge accumulation is completed in each scanning line.
- the outputted signals pass through an analog signal processing part 109 and are then converted into digital signals in an analog/digital (A/D) converter 110 .
- the digital signals are inputted into an image input controller 111 and are then processed.
- An auto-white balance (AWB) operation, an auto-exposure (AE) operation, and an AF operation are performed on a digital image signal inputted to the image input controller 111 in an AWB detecting part 116 , an AE detecting part 117 , and an AF detecting part 118 , respectively.
- the AF detecting part 118 outputs a detecting value with respect to a contrast value during the contrast AF process and outputs pixel information to the CPU 106 during the phase difference AF process, so the CPU 106 performs a phase difference operation.
- the phase difference operation of the CPU 106 may be obtained by performing a correlation operation of a plurality of pixel column signals. A position or a direction of a focus may be obtained by the result of the phase difference operation.
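- The correlation operation on pixel column signals can be sketched as follows (a minimal example; the function name, the mean-absolute-difference cost, and the search range are illustrative assumptions, not the disclosed algorithm):

```python
import numpy as np

def phase_shift(r_out, l_out, max_shift=8):
    # Slide the R-pixel column signal against the L-pixel column
    # signal and pick the shift with the smallest mean absolute
    # difference (one simple form of a correlation operation).
    best_shift, best_cost = 0, float("inf")
    for s in range(-max_shift, max_shift + 1):
        if s >= 0:
            a, b = r_out[s:], l_out[:len(l_out) - s]
        else:
            a, b = r_out[:len(r_out) + s], l_out[-s:]
        cost = np.abs(a - b).mean()
        if cost < best_cost:
            best_cost, best_shift = cost, s
    return best_shift
```

The resulting shift (in pixels) between the R and L outputs is the quantity from which a position or a direction of a focus may then be obtained.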
- An image signal is stored in a synchronous dynamic random access memory (SDRAM) 119 that is a temporary memory.
- a digital signal processor 112 performs one or more image signal processes (e.g., gamma correction) to create a displayable live view or a capture image.
- a compressor-expander 113 may compress the image signal in a compressed form (e.g., JPEG or H.264) or may expand the image signal when it is reproduced.
- An image file including the image signal compressed in the compressor-expander 113 is transmitted through a media controller 121 to be stored in a memory card 122 .
- a nonvolatile memory may be embodied to include a three dimensional (3D) memory array.
- the 3D memory array may be monolithically formed on a substrate (e.g., semiconductor substrate such as silicon, or semiconductor-on-insulator substrate).
- the 3D memory array may include two or more physical levels of memory cells having an active area disposed above the substrate and circuitry associated with the operation of those memory cells, whether such associated circuitry is above or within such substrate.
- the layers of each level of the array may be directly deposited on the layers of each underlying level of the array.
- the 3D memory array may include vertical NAND strings that are vertically oriented such that at least one memory cell is located over another memory cell.
- the at least one memory cell may comprise a charge trap layer.
- Display image information is stored in a video random access memory (VRAM) 120 , and the image is displayed on a liquid crystal display (LCD) 115 through a video encoder 114 .
- the CPU 106 used as a controller may control operations of each part.
- An electrically erasable programmable read-only memory (EEPROM) 123 may store and maintain information for correcting pixel defects of the auto-focus image sensor 108 or adjustment information.
- An operating interface 124 receives various commands from a user to operate the digital image processing device 100 .
- the operating interface 124 may include various buttons such as a shutter-release button (not shown), a main button (not shown), a mode dial (not shown), and/or a menu button (not shown).
- Such existing hardware may include one or more Central Processing Units (CPUs), digital signal processors (DSPs), application-specific-integrated-circuits (ASICs), field programmable gate arrays (FPGAs), computers, or the like configured as special purpose machines to perform the functions of the module.
- CPUs, DSPs, ASICs and FPGAs may generally be referred to as processing devices.
- when a structure is or includes a processor executing software, the processor is configured as a special purpose machine to execute the software, stored in a storage medium, to perform the functions of the structure.
- FIG. 2 is a diagram illustrating a principle of a phase-difference auto-focus (AF) using an auto-focus image sensor of FIG. 1 .
- light (or incident light) of an object that has passed through the photographing lens 101 passes through a micro-lens array 14 so as to be introduced to a first AF pixel R and a second AF pixel L.
- Masks or openings 17 and 18 that limit light inputted from pupils 12 and 13 of the photographing lens 101 may be adjacent to portions of the first and second AF pixels R and L.
- the light inputted from the pupil 12 disposed above a light axis of the photographing lens 101 is induced to the second AF pixel L, and the light inputted from the pupil 13 disposed under the light axis of the photographing lens 101 is induced to the first AF pixel R.
- “Pupil segmentation” means that the first AF pixel R and the second AF pixel L receive light that is reversely projected at positions of the pupils 12 and 13 by the micro-lens array 14 , through the masks or openings 17 and 18 .
- Continuous pupil-segmented pixel outputs of the first and second AF pixels R and L according to positions of the first and second AF pixels R and L are illustrated in FIGS. 3A and 3B .
- a horizontal axis represents a position of each of the first and second AF pixels R and L
- a vertical axis represents an output value of each of the first and second AF pixels R and L.
- a shape of the continuous output value of the first AF pixel R is the same as that of the second AF pixel L.
- positions (e.g., phases) of the output values of the first and second AF pixels R and L may be different from each other.
- a front-focusing state means that the photographing lens 101 focuses in front of the object.
- a back-focusing state means that the photographing lens 101 focuses behind the object.
- the phase of the output value of the first AF pixel R is right-shifted from the phase of the focused state and the phase of the output value of the second AF pixel L is left-shifted from the phase of the focused state.
- a shift amount between the phases of the output values of the first and second AF pixels R and L may be converted into a deviation amount between focuses.
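- The conversion of the phase shift between the R and L outputs into a focus deviation and a drive direction can be illustrated with a toy linear model (the conversion factor `um_per_px`, the function name, and the sign convention are illustrative assumptions, not values from this disclosure):

```python
def focus_state(shift_px, um_per_px=2.5):
    # Convert a phase shift (in pixels) between the R and L pixel
    # outputs into a defocus deviation and a lens drive direction.
    # um_per_px and the sign convention are illustrative assumptions.
    deviation_um = abs(shift_px) * um_per_px
    if shift_px == 0:
        state = "in focus"
    elif shift_px > 0:
        state = "front-focused"
    else:
        state = "back-focused"
    return deviation_um, state
```

The magnitude tells the lens driver how far to move the focus lens, and the sign tells it in which direction.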
- FIG. 4 is a circuit diagram of an auto-focus image sensor, for example, auto-focus image sensor 108 , according to some example embodiments of the inventive concepts.
- each of unit pixels UP 1 , UP 2 , UP 3 , and UP 4 of the auto-focus image sensor may include a photoelectric converter region PD, a transfer transistor Tx, a source follower transistor Sx, a reset transistor Rx, and a selection transistor Ax.
- four unit pixels adjacent to each other will be described as an example for the purpose of ease and convenience in explanation. However, the inventive concepts are not limited to the number of the unit pixels.
- the auto-focus image sensor 108 may include five or more unit pixels.
- At least two unit pixels adjacent to each other of the unit pixels UP 1 , UP 2 , UP 3 , and UP 4 may be AF pixels that are used to detect a phase difference, and others of the unit pixels UP 1 , UP 2 , UP 3 , and UP 4 may be image pixels that are used to detect an image.
- the transfer transistor Tx, the source follower transistor Sx, the reset transistor Rx, and the selection transistor Ax may include a transfer gate TG, a source follower gate SF, a reset gate RG, and a selection gate SEL, respectively.
- a photoelectric converter is provided in the photoelectric converter region PD.
- the photoelectric converter PD may be a photodiode including an N-type dopant region and a P-type dopant region.
- a drain of the transfer transistor Tx may be a floating diffusion region FD.
- the floating diffusion region FD may also be a source of the reset transistor Rx.
- the floating diffusion region FD may be electrically connected to the source follower gate SF of the source follower transistor Sx.
- the source follower transistor Sx is connected to the selection transistor Ax.
- the transfer gates TG of first and second unit pixels UP 1 and UP 2 adjacent to each other in a first direction D 1 may be electrically connected to a first transfer gate line TGL 1 .
- the transfer gates TG of third and fourth unit pixels UP 3 and UP 4 adjacent to each other in the first direction D 1 may be electrically connected to a second transfer gate line TGL 2 .
- the reset gates RG of the first and second unit pixels UP 1 and UP 2 may be electrically connected to a first reset gate line RGL 1
- the reset gates RG of the third and fourth unit pixels UP 3 and UP 4 may be electrically connected to a second reset gate line RGL 2 .
- the selection gates SEL of the first and second unit pixels UP 1 and UP 2 may be electrically connected to a first selection gate line SELL 1
- the selection gates SEL of the third and fourth unit pixels UP 3 and UP 4 may be electrically connected to a second selection gate line SELL 2 .
- the reset transistor Rx, the source follower transistor Sx, and the selection transistor Ax may be shared by neighboring pixels, thereby improving an integration degree of the auto-focus image sensor 108 .
- a power voltage Vdd is applied to the drains of the reset transistors Rx and the drains of the source follower transistors Sx of the first and second unit pixels UP 1 and UP 2 in a dark state, thereby discharging charge remaining in the floating diffusion regions FD.
- the reset transistors Rx are turned-off and light is inputted from an external system to the photoelectric converter regions PD to generate electron-hole pairs in the photoelectric converter regions PD. Holes are moved into and then accumulated in the P-type dopant regions, and electrons are moved into and then accumulated in the N-type dopant regions.
- when the transfer transistors Tx are turned-on, the electrons are transferred into and then accumulated in the floating diffusion regions FD. Gate biases of the source follower transistors Sx are changed in proportion to the amounts of the electrons accumulated in the floating diffusion regions FD, so source potentials of the source follower transistors Sx are changed. At this time, if the selection transistors Ax are turned-on, signals generated by the electrons are read through signal sensing lines Vout. Next, the processes described above may be performed on the third and fourth unit pixels UP 3 and UP 4 .
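- The reset, charge integration, transfer, and readout sequence described above can be sketched as a toy simulation (all names, the quantum efficiency, and the conversion gain are illustrative assumptions, not parameters of the disclosed sensor):

```python
class UnitPixel:
    # Toy model of the 4-transistor unit pixel readout sequence:
    # reset, photoelectric charge integration, transfer to the
    # floating diffusion region FD, then source-follower readout.
    def __init__(self, conversion_gain=1.0):
        self.pd_electrons = 0   # charge in the photoelectric converter PD
        self.fd_electrons = 0   # charge in the floating diffusion FD
        self.gain = conversion_gain

    def reset(self):
        # Rx on: discharge any charge remaining on the floating diffusion
        self.fd_electrons = 0

    def integrate(self, photons, qe=0.6):
        # incident light generates electron-hole pairs; electrons
        # accumulate in the N-type dopant region of the photodiode
        self.pd_electrons += int(photons * qe)

    def transfer(self):
        # Tx on: electrons move from the photodiode to FD
        self.fd_electrons += self.pd_electrons
        self.pd_electrons = 0

    def read(self):
        # Ax on: the source-follower output, proportional to the
        # FD charge, is read out on the signal sensing line Vout
        return self.fd_electrons * self.gain
```

Calling `reset()`, `integrate()`, `transfer()`, and `read()` in order mirrors the sequence applied first to UP 1 and UP 2 and then to UP 3 and UP 4.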
- first and second unit pixels UP 1 and UP 2 are the AF pixels and the third and fourth unit pixels UP 3 and UP 4 are the image pixels
- output values like those of FIG. 3A are obtained from the AF pixels corresponding to the first and second unit pixels UP 1 and UP 2 , and the photographing lens 101 of FIG. 1 is then focused using the obtained output values. Whether the photographing lens 101 is in focus may be confirmed, for example, by checking whether output values like those of FIG. 3B are outputted from the AF pixels.
- the digital image processing device 100 of FIG. 1 is a digital camera, a shutter may be pressed after the photographing lens 101 focuses, thereby obtaining an image from output values received from the image pixels such as the third and fourth unit pixels UP 3 and UP 4 . As a result, a clean image may be obtained.
- FIG. 5A is a layout illustrating a portion of a pixel region of an auto-focus image sensor according to some example embodiments of the inventive concepts.
- an auto-focus image sensor may include first and second focus detecting regions 32 and 33 and image detecting regions 30 .
- the first focus detecting region 32 may extend in a first direction D 1
- the second focus detecting region 33 may extend in a second direction D 2 intersecting the first direction D 1 .
- the first focus detecting region 32 may include a first AF pixel 20 R and a second AF pixel 20 L that are adjacent to each other and are used to detect a phase difference.
- the first focus detecting region 32 may include a plurality of first AF pixels 20 R and a plurality of second AF pixels 20 L that are alternately arranged along the first direction D 1 .
- the second focus detecting region 33 may include a third AF pixel 20 D and a fourth AF pixel 20 U that are adjacent to each other and are used to detect a phase difference.
- the second focus detecting region 33 may include a plurality of third AF pixels 20 D and a plurality of fourth AF pixels 20 U that are alternately arranged along the second direction D 2 .
- the image detecting region 30 may include image pixels 21 .
- the first and second focus detecting regions 32 and 33 may intercross to constitute a cross shape.
- FIG. 5A illustrates a portion of a pixel region, so a cross point of the first and second focus detecting regions 32 and 33 is illustrated to be one-sided. However, the cross point of the first and second focus detecting regions 32 and 33 may be disposed at a center of an entire portion of the pixel region.
- a color filter array may be disposed on the first and second focus detecting regions 32 and 33 and the image detecting regions 30 .
- the color filter array may be a Bayer pattern array consisting of red (R), green (G), and blue (B) or may adopt a complementary color system (e.g., a system using magenta, green, cyan, and yellow).
- Color filters disposed on the AF pixels 20 R, 20 L, 20 D, and 20 U may not be used to realize colors. However, color filters may also be formed on the AF pixels 20 R, 20 L, 20 D, and 20 U for the purpose of convenience in a process of forming the color filter array.
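- The Bayer pattern array mentioned above tiles the pixel region with 2×2 cells of green, red, and blue filters. One common phase of the pattern (assumed here for illustration; the disclosure does not fix the phase) can be sketched as:

```python
def bayer_color(row, col):
    # Return the color filter at (row, col) of a Bayer pattern
    # array, assuming a GR/BG tiling phase (an illustrative choice).
    if row % 2 == 0:
        return "G" if col % 2 == 0 else "R"
    return "B" if col % 2 == 0 else "G"
```

Every 2×2 cell thus contains two green, one red, and one blue filter, matching the red (R), green (G), and blue (B) array described above.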
- a micro-lens array 35 is disposed on the color filter array.
- a light shielding pattern that controls light-receiving amounts of at least the AF pixels 20 R, 20 L, 20 D, and 20 U may be disposed under the color filter array.
- the light shielding pattern of the AF pixels 20 R, 20 L, 20 D, and 20 U may include one or more first openings 332 .
- the light shielding pattern may further include second openings 330 disposed on the image pixels 21 .
- An area of each of the first openings 332 may be smaller than that of each of the second openings 330 .
- the area of the first opening 332 may be about 50% of the area of the second opening 330 .
- the first opening 332 may be disposed to be one-sided from a light axis along which light is inputted.
- the first openings 332 of the first and second AF pixels 20 R and 20 L adjacent to each other may be disposed to be symmetric.
- the first openings 332 of the third and fourth AF pixels 20 D and 20 U adjacent to each other may be disposed to be symmetric.
- the first opening 332 of the light shielding pattern may reduce the amount of light incident on each of the AF pixels 20 R, 20 L, 20 D, and 20 U in comparison with the amount of light incident on the image pixel 21 . In other words, the amount of the light incident on each AF pixel 20 R, 20 L, 20 D, or 20 U may be smaller than the amount of the light incident on the image pixel 21 due to the light shielding pattern.
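- The effect of the smaller first opening on the light-receiving amount can be illustrated with a toy linear model (the function name and the assumption that received light scales linearly with opening area are illustrative):

```python
def af_pixel_signal(image_pixel_signal, opening_ratio=0.5):
    # With a first opening 332 whose area is about 50% of the
    # second opening 330, an AF pixel receives roughly half the
    # light of an image pixel (linearity is an assumption).
    return image_pixel_signal * opening_ratio
```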
- FIGS. 5B and 5C are layouts illustrating a portion of a pixel region of an auto-focus image sensor according to other example embodiments of the inventive concepts.
- only green color filters G may be disposed on the first and second focus detecting regions 32 and 33 .
- color filters W disposed on the first and second focus detecting regions 32 and 33 may be white color filters or transparent filters.
- a pigment for showing a color such as red, green, and/or blue is not added to the color filters W.
- light of all wavelengths may be inputted into the AF pixels 20 R, 20 L, 20 D, and 20 U of the first and second focus detecting regions 32 and 33 , so the light receiving amounts of the AF pixels 20 R, 20 L, 20 D, and 20 U may increase to improve photosensitivity of the AF pixels 20 R, 20 L, 20 D, and 20 U.
- color filters may not exist on the AF pixels 20 R, 20 L, 20 D, and 20 U of the first and second focus detecting regions 32 and 33 .
- FIG. 6 is an upper layout of an auto-focus image sensor according to some example embodiments of the inventive concepts.
- FIG. 7 is a lower layout of the auto-focus image sensor of FIG. 6 .
- FIG. 8 is a cross-sectional view taken along lines A-A′ and B-B′ of FIG. 6 or 7 .
- An auto-focus image sensor according to some example embodiments may be a backside-illuminated auto-focus image sensor.
- the auto-focus image sensor includes a substrate 51 that has a first surface 51 a and a second surface 51 b opposite to each other.
- a deep device isolation layer (or a deep device isolation portion or isolation portion) 53 is disposed in the substrate 51 to separate AF pixels 20 and image pixels 21 from each other.
- the AF pixels 20 may be disposed in first and second focus detecting regions 32 and 33 and the image pixels 21 may be disposed in image detecting regions 30 .
- the deep device isolation layer 53 may penetrate the substrate 51 so as to be exposed at the first and second surfaces 51 a and 51 b.
- a shallow device isolation layer (or a shallow device isolation portion) 55 adjacent to the first surface 51 a may be disposed to define first to third active regions AR 1 , AR 2 , and AR 3 that are spaced apart from each other.
- the shallow device isolation layer 55 is spaced apart from the second surface 51 b.
- a photoelectric converter PD may be disposed in each of the pixels 20 and 21 .
- the photoelectric converter PD may include a first dopant region 59 adjacent to the first surface 51 a and a second dopant region 57 adjacent to the second surface 51 b.
- the first dopant region 59 may be doped with P-type dopants
- the second dopant region 57 may be doped with N-type dopants.
- a transfer gate TG may be disposed on the first surface 51 a of the first active region AR 1 with a gate insulating layer 61 therebetween.
- a reset gate RG, a source follower gate SF, and a selection gate SEL which are spaced apart from each other may be disposed on the first surface 51 a of the second active region AR 2 .
- a floating diffusion region FD is disposed in the active region AR 1 .
- the floating diffusion region FD is adjacent to the first surface 51 a which does not overlap with the transfer gate TG.
- the floating diffusion region FD is spaced apart from the second dopant region 57 .
- a ground region 63 may be disposed in the third active region AR 3 and may be adjacent to the first surface 51 a.
- the floating diffusion region FD may be doped with dopants of the same conductivity type as the dopants in the second dopant region 57 .
- the floating diffusion region FD may be doped with N-type dopants.
- the ground region 63 may be doped with dopants of the same conductivity type as the dopants in the first dopant region 59 .
- the ground region 63 may be doped with P-type dopants.
- a dopant concentration of the ground region 63 may be higher than that of the first dopant region 59 .
- the first surface 51 a of the substrate 51 is covered with a first interlayer insulating layer 65 .
- the first layer first contact C 11 contacts the transfer gate TG.
- the first layer second contact C 12 contacts the floating diffusion region FD.
- the first layer third contact C 13 contacts the source follower gate SF.
- the first layer fourth contact C 14 contacts a source region (of a reset transistor) disposed at a side of the reset gate RG.
- the first layer fifth contact C 15 contacts the reset gate RG.
- the first layer sixth contact C 16 contacts the selection gate SEL.
- the first layer seventh contact C 17 contacts a dopant region between the reset gate RG and the source follower gate SF.
- the dopant region between the reset gate RG and the source follower gate SF corresponds to the drain of the reset transistor Rx and the drain of the source follower transistor Sx.
- First layer first to first layer fifth signal lines L 11 to L 15 are disposed on the first interlayer insulating layer 65 .
- the signal lines L 11 to L 15 may correspond to interconnections.
- the first layer first signal line L 11 contacts the first layer first contact C 11 , so a voltage may be applied to the transfer gate TG through the first layer first signal line L 11 .
- the first layer second signal line L 12 contacts the first layer second to first layer fourth contacts C 12 to C 14 at the same time so as to electrically connect the floating diffusion region FD, the source region of the reset transistor, and the source follower gate SF to each other.
- the first layer third signal line L 13 contacts the first layer fifth contact C 15 , so a voltage may be applied to the reset gate RG through the first layer third signal line L 13 .
- the first layer fourth signal line L 14 contacts the first layer sixth contact C 16 , so a voltage may be applied to the selection gate SEL through the first layer fourth signal line L 14 .
- the first layer fifth signal line L 15 contacts the first layer seventh contact C 17 , so the power voltage Vdd may be applied to the drains of the reset transistor and the source follower transistor through the first layer fifth signal line L 15 .
- a second interlayer insulating layer 67 covers the first interlayer insulating layer 65 and the first layer first to first layer fifth signal lines L 11 to L 15 .
- Second layer first and second layer second contacts C 21 and C 22 penetrate the second and first interlayer insulating layers 67 and 65 .
- the second layer first contact C 21 contacts the ground region 63 .
- the second layer second contact C 22 contacts a source (of the selection transistor) that is disposed at a side of the selection gate SEL.
- a second layer first signal line L 21 and a second layer second signal line L 22 are disposed on the second interlayer insulating layer 67 .
- the signal lines L 21 and L 22 may correspond to interconnections.
- the second layer first signal line L 21 contacts the second layer first contact C 21 so as to apply a ground voltage to the ground region 63 .
- the second layer second signal line L 22 contacts the second layer second contact C 22 .
- the second layer second signal line L 22 may correspond to the signal sensing line Vout of FIG. 4 .
- a third interlayer insulating layer 69 may cover the second interlayer insulating layer 67 and the second layer first and second signal lines L 21 and L 22 .
- the third interlayer insulating layer 69 may be covered with a first passivation layer 71 .
- a fixed charge layer 73 may be disposed on the second surface 51 b of the substrate 51 .
- the fixed charge layer 73 may be formed of a metal oxide or metal fluoride in which a content ratio of oxygen or fluorine is lower than its stoichiometric ratio.
- the fixed charge layer 73 may have negative fixed charge.
- the fixed charge layer 73 may be formed of a metal oxide or metal fluoride that includes at least one selected from a group consisting of hafnium (Hf), zirconium (Zr), aluminum (Al), tantalum (Ta), titanium (Ti), yttrium (Y), and a lanthanoid.
- the fixed charge layer 73 may be a hafnium oxide layer or an aluminum fluoride layer. Holes may be accumulated around the second surface 51 b due to the fixed charge layer 73 , so occurrence of a dark current and white spots may be effectively reduced.
- a first insulating layer 75 and a second insulating layer 77 may be sequentially stacked on the fixed charge layer 73 .
- the first insulating layer 75 may be, for example, a silicon oxide layer.
- the second insulating layer 77 may be, for example, a silicon nitride layer.
- a light shielding pattern (or light shield) 79 may be disposed on the second insulating layer 77 .
- the light shielding pattern 79 may be formed of, for example, an opaque metal.
- the light shielding pattern 79 may be disposed in only the first and second focus detecting regions 32 and 33 .
- the first openings 332 may be disposed in the light shielding pattern 79 .
- a second passivation layer 83 may be conformally stacked on the light shielding pattern 79 .
- a planarization layer 85 is disposed on the second passivation layer 83 .
- a color filter array 87 may be disposed on the planarization layer 85 , and a micro-lens array 35 may be disposed on the color filter array 87 .
- the auto-focus image sensor includes the deep device isolation layer 53 , crosstalk between the pixels may be reduced or prevented.
- the auto-focus image sensor is a backside-illuminated type, light (or incident light) is inputted through the second surface 51 b of the substrate 51 .
- the signal lines L 11 to L 15 , L 21 , and L 22 adjacent to the first surface 51 a may not be limited to their positions.
- the signal lines L 11 to L 15 , L 21 , and L 22 may overlap with the photoelectric converter PD.
- FIGS. 9A to 13A are plan views illustrating a method of fabricating an auto-focus image sensor having the upper layout of FIG. 6 .
- FIGS. 14A and 15A are plan views illustrating a method of fabricating an auto-focus image sensor having the lower layout of FIG. 7 .
- FIGS. 9 B to 15 B are cross-sectional views illustrating a method of fabricating an auto-focus image sensor having the cross-sectional view of FIG. 8 .
- a deep device isolation layer 53 is formed in a substrate 51 having first and second surfaces 51 a and 51 b opposite to each other to isolate pixels from each other. At this time, a bottom surface of the deep device isolation layer 53 may be spaced apart from the second surface 51 b.
- the deep device isolation layer 53 may be formed of an insulating material such as silicon oxide.
- the deep device isolation layer 53 may be formed to have a mesh shape when viewed from a plan view.
- ion implantation processes may be performed to form a first dopant region 59 and a second dopant region 57 in the substrate 51 of each of the pixels isolated by the deep device isolation layer 53 .
- a photoelectric converter PD is formed in each pixel.
- a shallow device isolation layer 55 that is adjacent to the first surface 51 a may be formed in the substrate 51 to define active regions AR 1 , AR 2 , and AR 3 .
- a portion of the substrate 51 around the deep device isolation layer 53 may be removed to form a shallow trench, and the shallow trench may be filled with a filling insulating layer to form the shallow device isolation layer 55 .
- a transfer gate TG may be formed to intersect the first active region AR 1
- a reset gate RG, a source follower gate SF, and a selection gate SEL may be formed to intersect the second active region AR 2 .
- Ion implantation processes may be performed to form a floating diffusion region FD and a ground region 63 .
- dopant regions that are used as source/drain regions of reset, source follower, and selection transistors may be formed in the second active region AR 2 .
- a first interlayer insulating layer 65 may be formed to cover the first surface 51 a.
- first layer first to first layer seventh contacts C 11 to C 17 are formed to penetrate the first interlayer insulating layer 65 .
- First layer first to first layer fifth signal lines L 11 to L 15 electrically connected to the contacts C 11 to C 17 are formed on the first interlayer insulating layer 65 .
- a second interlayer insulating layer 67 is formed on the first interlayer insulating layer 65 .
- second layer first and second layer second contacts C 21 and C 22 are formed to penetrate the second and first interlayer insulating layers 67 and 65 .
- Second layer first and second layer second signal lines L 21 and L 22 are formed on the second interlayer insulating layer 67 .
- a third interlayer insulating layer 69 and a first passivation layer 71 are sequentially formed on the second interlayer insulating layer 67 .
- the substrate 51 is turned over such that the second surface 51 b faces upward.
- a back grinding process may be performed on the second surface 51 b, so a portion, which is adjacent to the second surface 51 b, of the substrate 51 is removed to expose the deep device isolation layer 53 .
- a fixed charge layer 73 is formed on an entire portion of the second surface 51 b.
- First and second insulating layers 75 and 77 are sequentially formed on the fixed charge layer 73 .
- a light shielding pattern 79 is formed on the second insulating layer 77 .
- an opaque metal layer may be stacked on an entire top surface of the second insulating layer 77 , and the stacked opaque metal layer may be etched to form the light shielding pattern 79 .
- the light shielding pattern 79 may be formed by a damascene process using a process of forming a mask pattern (not shown), an electroplating process, and a planarization etching process.
- the second passivation layer 83 , the planarization layer 85 , the color filter array 87 , and the micro-lens array 35 may be sequentially formed on the light shielding pattern 79 .
- Materials of the layers may be the same as described with reference to FIGS. 6 to 8 .
- the deep device isolation layer 53 is first formed.
- the inventive concepts are not limited thereto.
- the order of the processes described above may be changed.
- the shallow device isolation layer 55 may be first formed to be adjacent to the first surface 51 a, and then, the transistors and the signal lines may be formed. Subsequently, the back grinding process may be performed on the second surface 51 b. Next, a portion of the substrate 51 may be etched from the ground second surface 51 b to form a deep trench, and the deep trench may be filled with an insulating layer to form the deep device isolation layer 53 .
- FIG. 16 is a cross-sectional view taken along a line A-A′ of FIG. 6 or 7 to illustrate an auto-focus image sensor according to other example embodiments of the inventive concepts.
- a deep device isolation layer 53 may include a filling insulation layer 53 a and a poly-silicon pattern 53 b disposed within the filling insulation layer 53 a. Since the poly-silicon pattern 53 b has substantially the same thermal expansion coefficient as the substrate 51 formed of silicon, it is possible to reduce physical stress caused by a difference between thermal expansion coefficients of materials.
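The stress reduction described above can be illustrated with a first-order mismatch estimate, sigma ~ E * |delta-alpha| * delta-T. The sketch below uses rounded, textbook-style material constants and an arbitrary temperature excursion; none of these numbers come from the disclosure:

```python
# First-order thermal mismatch stress between a trench fill and the
# silicon substrate: sigma ~ E * |alpha_fill - alpha_si| * delta_T.
# All constants are rounded illustrative values.

E_SI = 170e9        # Young's modulus of silicon (Pa), approximate
CTE_SI = 2.6e-6     # thermal expansion coefficient of silicon (1/K)
CTE_POLY = 2.7e-6   # poly-silicon: nearly identical to the substrate
CTE_OXIDE = 0.5e-6  # silicon dioxide: far smaller than silicon
DELTA_T = 400.0     # example cool-down after a processing step (K)

def mismatch_stress(cte_fill):
    """First-order stress from the fill/substrate CTE difference."""
    return E_SI * abs(cte_fill - CTE_SI) * DELTA_T

stress_oxide = mismatch_stress(CTE_OXIDE)  # oxide-only deep isolation
stress_poly = mismatch_stress(CTE_POLY)    # poly-silicon pattern 53b
# The CTE-matched poly fill shows over an order of magnitude less
# mismatch stress in this simplified model.
```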
- Other elements of the auto-focus image sensor according to some example embodiments may be similar to or the same as corresponding elements of the auto-focus image sensor described with reference to FIGS. 6 to 8 .
- FIG. 17 is a cross-sectional view taken along a line A-A′ of FIG. 6 or 7 to illustrate an auto-focus image sensor according to still other example embodiments of the inventive concepts.
- a deep device isolation layer 53 is spaced apart from the first surface 51 a.
- the deep device isolation layer 53 contacts a top surface of the shallow device isolation layer 55 .
- a method of fabricating the auto-focus image sensor according to some example embodiments will be described. After the processes described with reference to FIG. 9B , a portion of the deep device isolation layer 53 and the substrate 51 adjacent thereto may be etched at the same time to form a shallow trench. The shallow trench may be filled with a filling insulating layer to form the shallow device isolation layer 55 .
- Other elements of the auto-focus image sensor according to some example embodiments may be similar to or the same as corresponding elements of the auto-focus image sensor described with reference to FIGS. 6 to 8 .
- FIG. 18A is a cross-sectional view taken along a line A-A′ of FIG. 6 or 7 to illustrate an auto-focus image sensor according to yet other example embodiments of the inventive concepts.
- a deep device isolation layer 53 i may include a fixed charge layer 73 and a first insulating layer 75 .
- the fixed charge layer 73 may include a hafnium oxide layer.
- the first insulating layer 75 may include a silicon oxide layer or a silicon nitride layer.
- the fixed charge layer 73 is disposed on the second surface 51 b and surrounds sidewalls of the photoelectric converter PD, thereby further reducing dark current.
- Other elements of the auto-focus image sensor according to some example embodiments may be similar to or the same as corresponding elements of the auto-focus image sensor described with reference to FIGS. 6 to 8 .
- FIG. 18B is a cross-sectional view illustrating a method of fabricating the auto-focus image sensor of FIG. 18A .
- the deep device isolation layer 53 of the structure of FIG. 14B may be selectively removed to form a deep trench T 1 .
- a fixed charge layer 73 and a first insulating layer 75 are conformally formed on an entire portion of the second surface 51 b to fill the deep trench T 1 .
- the deep device isolation layer 53 of FIG. 14B may be used as a sacrificial pattern.
- thus, an additional etching mask for forming the deep trench T 1 is not required, and a misalignment problem may be avoided.
- Other fabricating processes of example embodiments may be similar to or the same as corresponding processes of example embodiments described with reference to FIGS. 9B to 15B .
- FIGS. 19A and 19B are cross-sectional views taken along a line A-A′ of FIG. 6 or 7 to illustrate an auto-focus image sensor according to yet still other example embodiments of the inventive concepts.
- a deep device isolation layer 53 m may include a fixed charge layer 73 and an air gap region AG.
- the first insulating layer 75 may be formed by a deposition method having a poor step coverage characteristic (e.g., a physical vapor deposition (PVD) method) to form the air gap region AG.
- Other elements of the auto-focus image sensor according to some example embodiments may be similar to or the same as corresponding elements of the auto-focus image sensor described with reference to FIG. 18A .
- a deep device isolation layer 53 n may include a fixed charge layer 73 , a first insulating layer 75 , and a gap-fill assistant layer 76 in an auto-focus image sensor according to some example embodiments.
- the fixed charge layer 73 may be, for example, a hafnium oxide layer.
- the first insulating layer 75 may be, for example, a silicon oxide layer or a silicon nitride layer.
- the gap-fill assistant layer 76 may be, for example, a hafnium oxide layer.
- FIG. 20 is a cross-sectional view taken along a line A-A′ of FIG. 6 or 7 to illustrate an auto-focus image sensor according to yet still other example embodiments of the inventive concepts.
- a deep device isolation layer 53 j may include a filling insulation layer 53 a, a poly-silicon pattern 53 b, a fixed charge layer 73 , and a first insulating layer 75 in an auto-focus image sensor according to some example embodiments.
- the fixed charge layer 73 may be in contact with both the filling insulation layer 53 a and the poly-silicon pattern 53 b.
- the auto-focus image sensor of FIG. 20 may be fabricated by combining the method of fabricating the auto-focus image sensor of FIG. 16 with the method of fabricating the auto-focus image sensor of FIG. 18A .
- an initial deep device isolation layer 53 may be formed to include the filling insulation layer 53 a and the poly-silicon pattern 53 b , and a portion of the initial deep device isolation layer 53 may remain when the deep trench T 1 is formed.
- the fixed charge layer 73 and the first insulating layer 75 may be formed in the deep trench T 1 , thereby forming the deep device isolation layer 53 j.
- Other fabricating processes of some example embodiments may be similar to or the same as corresponding processes of example embodiments described with reference to FIGS. 9B to 15B .
- FIG. 21 is a cross-sectional view taken along a line A-A′ of FIG. 6 or 7 to illustrate an auto-focus image sensor according to yet still other embodiments of the inventive concepts.
- a deep device isolation layer 53 k may include a first sub-deep device isolation layer 53 c and a second sub-deep device isolation layer 53 d in an auto-focus image sensor according to some example embodiments.
- the sub-deep device isolation layers 53 c and 53 d may include at least one of a silicon oxide layer, a poly-silicon layer, or a fixed charge layer.
- FIGS. 22 to 24 are cross-sectional views illustrating a method of fabricating the auto-focus image sensor of FIG. 21 .
- a first sub-deep device isolation layer 53 c is formed in a substrate 51 .
- the first sub-deep device isolation layer 53 c is adjacent to the first surface 51 a and is spaced apart from the second surface 51 b.
- a shallow device isolation layer 55 is formed in the substrate 51 . At this time, the shallow device isolation layer 55 may be formed to be shallower than the first sub-deep device isolation layer 53 c.
- transistors, lines L 11 to L 15 , L 21 , and L 22 , interlayer insulating layers 65 , 67 , and 69 , and a first passivation layer 71 may be formed on the first surface 51 a, and the substrate 51 may then be overturned. Subsequently, a back grinding process may be performed on the second surface 51 b. At this time, the first sub-deep device isolation layer 53 c is not exposed.
- a portion, adjacent to the second surface 51 b, of the substrate 51 may be etched to form a deep trench T 2 exposing the first sub-deep device isolation layer 53 c.
- the deep trench T 2 may be shallower than the deep trench T 1 of FIG. 18B .
- the second sub-deep device isolation layer 53 d may be formed to fill the deep trench T 2 .
- Other fabricating processes of some example embodiments may be similar to or the same as corresponding processes of example embodiments described with reference to FIGS. 9B to 15B .
- the substrate 51 may be etched from the first surface 51 a by a desired (or alternatively, predetermined) depth and may then be etched from the second surface 51 b by a desired (or alternatively, predetermined) depth to form the deep device isolation layer 53 k.
- an etching depth of each of the etching processes for the formation of the deep device isolation layer 53 k having a desired depth may be reduced, thereby reducing the burden of the etching processes.
- a depth of each of the trenches for the formation of the deep device isolation layer 53 k may be reduced to improve a gap-fill characteristic.
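The gap-fill benefit can be made concrete with simple aspect-ratio arithmetic: etching from both surfaces roughly halves the depth each fill step must handle. The trench dimensions below are hypothetical, chosen only for illustration:

```python
# Aspect ratio (depth / width) comparison between a single through-trench
# (as in FIG. 18B) and the two-sided scheme of FIG. 21. Hypothetical sizes.

substrate_thickness_um = 3.0  # example back-thinned substrate thickness
trench_width_um = 0.2         # example trench width

# Single-sided: one deep trench through the full remaining thickness.
ar_single = substrate_thickness_um / trench_width_um

# Two-sided: first sub-layer 53c from the first surface and second
# sub-layer 53d from the second surface, each about half as deep.
ar_each_side = (substrate_thickness_um / 2) / trench_width_um

# Each fill step now faces half the aspect ratio, easing void-free filling.
```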
- FIG. 25 is a lower layout of an auto-focus image sensor according to yet still other example embodiments of the inventive concepts.
- FIG. 26 is a cross-sectional view taken along lines A-A′ and B-B′ of FIG. 25 .
- a light shielding pattern 79 may extend into an image detection region 30 in an auto-focus image sensor according to some example embodiments.
- the light shielding pattern 79 may overlap with the deep device isolation layer 53 and may have a mesh shape.
- the light shielding pattern 79 may further include second openings 330 exposing the image pixels.
- the light shielding pattern 79 may reduce or prevent crosstalk in the image detecting region 30 .
- the light shielding pattern 79 may be connected to a ground voltage or a reference voltage, so the auto-focus image sensor may operate more stably.
- FIG. 27 is a lower layout of an auto-focus image sensor according to yet still other example embodiments of the inventive concepts.
- FIG. 28 is a cross-sectional view taken along lines A-A′ and B-B′ of FIG. 27 .
- a deep device isolation layer 53 is disposed in a substrate 51 having first and second surfaces 51 a and 51 b opposite to each other to isolate pixels from each other.
- a shallow device isolation layer 55 is disposed from the first surface 51 a into the substrate 51 .
- the shallow device isolation layer 55 defines active regions AR 4 and AR 5 .
- a transfer gate TG, a reset gate RG, a source follower gate SF, and a selection gate SEL are disposed on the first surface 51 a.
- a photoelectric converter PD is disposed at a side of the transfer gate TG, and a floating diffusion region FD is disposed at another side of the transfer gate TG.
- a fixed charge layer 73 , a first insulating layer 75 , and a second insulating layer 77 may be sequentially stacked on the second surface 51 b.
- the second insulating layer 77 may act as a passivation layer.
- the signal lines L 11 to L 15 , L 21 , and L 22 may be arranged, to the extent possible, not to overlap with the photoelectric converter PD in an image detecting region 30 . Thus, the path of light incident on the photoelectric converter PD is not blocked.
- some signal lines L 11 a and L 22 a may perform both a signal transmission function and a light shielding function in focus detecting regions 32 and 33 . To achieve this, shapes of some signal lines L 11 a and L 22 a may be modified to perform the light shielding function in the focus detecting regions 32 and 33 .
- FIG. 29 is a layout of a first layer first signal line and a first layer third signal line in a first focus detecting region.
- a first layer first signal line L 11 a may include a first protrusion L 11 b that protrudes to overlap with the photoelectric converter PD of the AF pixel.
- the first layer first signal line L 11 a and the first layer third signal line L 13 adjacent thereto may provide shapes similar to the first openings 332 of FIGS. 5A to 5C in the first focus detecting region 32 .
- the first layer first signal line L 11 a which also performs the light shielding function may be disposed at the same height as the first layer second to fifth signal lines L 12 to L 15 from the first surface 51 a.
- FIG. 30 is a layout of a second layer first signal line and a second layer second signal line in a second focus detecting region.
- a second layer second signal line L 22 a may include a second protrusion L 22 b that protrudes to overlap with the photoelectric converter PD of the AF pixel.
- the second layer second signal line L 22 a and the second layer first signal line L 21 adjacent thereto may provide shapes similar to the first openings 332 of FIGS. 5A to 5C in the second focus detecting region 33 .
- the second layer second signal line L 22 a which also performs the light shielding function may be disposed at the same height as the second layer first signal line L 21 from the first surface 51 a.
- FIGS. 31 to 35 illustrate example embodiments of a digital image processing device including an auto-focus image sensor according to example embodiments of the inventive concepts.
- the digital image processing device according to example embodiments of the inventive concepts may be applied to a mobile or smart phone 2000 illustrated in FIG. 31 and/or a tablet or smart tablet 3000 illustrated in FIG. 32 .
- the digital image processing device according to example embodiments of the inventive concepts may be applied to a notebook computer 4000 of FIG. 33 and/or a television or smart television 5000 of FIG. 34 .
- the digital image processing device according to example embodiments of the inventive concepts may be applied to a digital camera or digital camcorder 6000 of FIG. 35 .
- FIG. 36 is a schematic block diagram illustrating an interface and an electronic system including an auto-focus image sensor according to example embodiments of the inventive concepts.
- an electronic system 1000 may be realized as a data processing device capable of using or supporting mobile industry processor interface (MIPI), e.g., a portable phone, a personal digital assistant (PDA), a portable multimedia player (PMP), or a smart phone.
- the electronic system 1000 may include an application processor 1010 , an image sensor 1040 , and a display 1050 .
- the image sensor 1040 may be one of the auto-focus image sensors according to example embodiments of the inventive concepts.
- a CSI host 1012 realized in the application processor 1010 may serially communicate with a CSI device 1041 of the image sensor 1040 through a camera serial interface (CSI).
- an optical de-serializer may be realized in the CSI host 1012
- an optical serializer may be realized in the CSI device 1041 .
- a DSI host 1011 realized in the application processor 1010 may serially communicate with a DSI device 1051 of the display 1050 through a display serial interface (DSI).
- an optical serializer may be realized in the DSI host 1011
- an optical de-serializer may be realized in the DSI device 1051 .
- the electronic system 1000 may further include a radio frequency (RF) chip 1060 capable of communicating with the application processor 1010 .
- a PHY 1013 of the electronic system 1000 may exchange data with a PHY 1061 of the RF chip 1060 according to MIPI DigRF.
- the electronic system 1000 may further include a global positioning system (GPS) 1020 , a storage 1070 , a microphone 1080 , a DRAM 1085 , and a speaker 1090 .
- the electronic system 1000 may communicate using WiMAX 1030 , WLAN 1100 , and UWB 1110 .
- the pixels are isolated from each other by the deep device isolation portion, so the crosstalk between neighboring pixels may be reduced or prevented.
- the sensor includes the fixed charge layer being in contact with at least one surface of the substrate. Holes may be accumulated around the fixed charge layer, and thus, the occurrence of the dark current and the white spots may be effectively reduced.
- the poly-silicon pattern may be disposed within the deep device isolation portion. Since the poly-silicon pattern has substantially the same thermal expansion coefficient as the substrate formed of silicon, it is possible to reduce the physical stress caused by the difference between the thermal expansion coefficients of materials.
- example embodiments of the inventive concepts may provide an auto-focus image sensor capable of realizing a clearer image and a digital image processing device including the same.
Abstract
The inventive concepts provide an auto-focus image sensor and a digital image processing device including the same. The auto-focus image sensor includes a substrate including at least one first pixel used for detecting a phase difference and at least one second pixel used for detecting an image, a deep device isolation portion disposed in the substrate to isolate the first pixel from the second pixel, and a light shielding pattern disposed on the substrate of at least the first pixel. Due to the light shielding pattern, the amount of light incident on the first pixel is smaller than the amount of light incident on the second pixel.
Description
- This application is a continuation of U.S. application Ser. No. 17/005,423, filed on Aug. 28, 2020, which is a continuation of U.S. application Ser. No. 16/356,057, filed on Mar. 18, 2019, now granted as U.S. Pat. No. 10,979,621 on Apr. 13, 2021, which is a continuation of U.S. patent application Ser. No. 15/903,727, filed on Feb. 23, 2018, now granted as U.S. Pat. No. 10,382,666, issued on Aug. 13, 2019, which is a continuation under 35 U.S.C. § 120 of U.S. patent application Ser. No. 14/746,302, filed Jun. 22, 2015, and issued as U.S. Pat. No. 9,942,461, on Apr. 10, 2018, which claims priority under 35 U.S.C. § 119 to Korean Patent Application No. 10-2014-0076509, filed on Jun. 23, 2014, in the Korean Intellectual Property Office, the disclosure of each of which is hereby incorporated by reference in its entirety.
- The inventive concepts relate to an auto-focus image sensor and a digital image processing device including the same.
- In a digital image processing device such as a camera, it may be helpful to detect a focus control state of a photographing lens to automatically control a focus of the lens. To achieve this, a conventional digital image processing device includes an additional focus detecting device separate and/or different from an image sensor. In this case, costs of the focus detecting device and/or an additional optical lens may be increased and/or an entire size of the digital image processing device may be increased by the focus detecting device. To solve these problems, an auto-focus image sensor using a method of detecting a phase difference has been developed.
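For background, phase-difference detection compares signals from pixel groups that receive light from opposite halves of the photographing lens's exit pupil; the relative shift between the two signals indicates the direction and amount of defocus. The Python sketch below shows the idea with a sum-of-absolute-differences search over synthetic signal values (not taken from this disclosure):

```python
# Estimate the shift between the two AF pixel signals by minimizing the
# sum of absolute differences (SAD) over candidate shifts.

def phase_difference(left, right, max_shift=4):
    """Return the shift (in pixels) that best aligns the two AF signals."""
    best_shift, best_sad = 0, float("inf")
    for s in range(-max_shift, max_shift + 1):
        pairs = [(left[i], right[i + s])
                 for i in range(len(left))
                 if 0 <= i + s < len(right)]
        if not pairs:
            continue
        sad = sum(abs(a - b) for a, b in pairs) / len(pairs)
        if sad < best_sad:
            best_sad, best_shift = sad, s
    return best_shift

# In focus: both pupil halves yield aligned signals -> zero shift.
in_focus = [10, 20, 80, 20, 10, 10]

# Out of focus: one pupil half sees a displaced copy of the scene.
shifted = in_focus[2:] + [10, 10]
shift = phase_difference(in_focus, shifted)  # nonzero; sign gives direction
```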
- Example embodiments of the inventive concepts may provide an auto-focus image sensor capable of realizing a clearer image.
- Example embodiments of the inventive concepts may also provide a digital image processing device capable of realizing a clearer image.
- In one aspect, an auto-focus image sensor may include: a substrate comprising at least one first pixel used for detecting a phase difference and at least one second pixel used for detecting an image; a deep device isolation portion disposed in the substrate to isolate the first pixel from the second pixel; and a light shielding pattern disposed on the substrate of at least the first pixel. The light shielding pattern may cause the amount of light incident on the first pixel to be smaller than the amount of light incident on the second pixel. The substrate may include: a first surface on which a gate electrode is disposed; and a second surface opposite to the first surface. The deep device isolation portion may be adjacent to at least the second surface.
- In some example embodiments, light may be incident through the first surface, the light shielding pattern may be disposed on the first surface, and charge generated from the first pixel may be transmitted through the light shielding pattern. In this case, the auto-focus image sensor may further include: an interconnection disposed on the first surface in the second pixel. Charge generated from the second pixel may be transmitted through the interconnection, and the light shielding pattern and the interconnection may be disposed at the same height from the first surface. The light shielding pattern may have a width greater than that of the interconnection.
- In some example embodiments, light may be incident through the second surface, and the light shielding pattern may be disposed on the second surface. The deep device isolation portion may have a mesh structure, and the light shielding pattern may have a mesh structure that overlaps with the deep device isolation portion when viewed from a plan view. In this case, an area of the light shielding pattern in the first pixel may be greater than that of the light shielding pattern in the second pixel. In this case, a ground voltage or a reference voltage may be applied to the light shielding pattern.
- In some example embodiments, the deep device isolation portion may penetrate the substrate so as to be exposed at the first and second surfaces.
- In some example embodiments, the deep device isolation portion may include: a filling insulation layer; and a poly-silicon pattern disposed within the filling insulation layer.
- In some example embodiments, the deep device isolation portion may include: a filling insulating layer; and a fixed charge layer disposed between the filling insulation layer and the substrate.
- In some example embodiments, the fixed charge layer and the filling insulation layer may extend onto the second surface, and the fixed charge layer may be in contact with the second surface.
- In some example embodiments, the fixed charge layer may be formed of a metal oxide or metal fluoride that includes at least one selected from a group consisting of hafnium (Hf), zirconium (Zr), aluminum (Al), tantalum (Ta), titanium (Ti), yttrium (Y), and a lanthanoid.
- In some example embodiments, the deep device isolation portion may further include: a gap-fill assistant layer spaced apart from the fixed charge layer with the filling insulation layer therebetween.
- In some example embodiments, the deep device isolation portion may include: a fixed charge layer being in contact with a sidewall of the substrate; and an air gap region exposing the fixed charge layer.
- In some example embodiments, the deep device isolation portion may include: a poly-silicon pattern disposed in a first trench extending from the first surface toward the second surface; a first filling insulation layer being in contact with both sidewalls of the poly-silicon pattern in the first trench; a fixed charge layer disposed in a second trench extending from the second surface toward the first surface, the second trench overlapping with the first trench, the fixed charge layer being in contact with both the first filling insulation layer and the poly-silicon pattern, and the fixed charge layer covering an inner sidewall of the second trench; and a second filling insulation layer filling the second trench.
- In some example embodiments, the deep device isolation portion may include: a first deep device isolation portion adjacent to the first surface; and a second deep device isolation portion adjacent to the second surface. The second deep device isolation portion may be in contact with the first deep device isolation portion.
- In some example embodiments, the auto-focus image sensor may further include: a fixed charge layer disposed on the second surface.
- In some example embodiments, the auto-focus image sensor may further include: a shallow device isolation portion disposed in the substrate to define an active region, the shallow device isolation portion adjacent to the first surface; and a color filter and a micro-lens disposed on the first surface or the second surface.
- In some example embodiments, a color filter disposed on the first pixel may not include a pigment.
- In some example embodiments, the auto-focus image sensor may further include: a first ground region disposed in the substrate of the first pixel, the first ground region adjacent to the first surface in the first pixel, and a ground voltage applied to the substrate of the first pixel through the first ground region; and a second ground region disposed in the substrate of the second pixel, the second ground region adjacent to the first surface in the second pixel, and the ground voltage applied to the substrate of the second pixel through the second ground region.
- In another aspect, a digital image processing device may include: the auto-focus image sensor; an optical system inputting light into the auto-focus image sensor; and a focus controller controlling a focus of the optical system using the phase difference detected from the first pixel.
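Such a focus controller can be sketched as a closed loop that repeatedly measures the phase difference and moves the lens until the difference falls within a tolerance. The gain, tolerance, and toy lens model below are hypothetical assumptions, not taken from the disclosure:

```python
# Closed-loop phase-difference autofocus sketch. The controller and the
# toy optical model are illustrative assumptions only.

def drive_to_focus(measure_phase_difference, move_lens,
                   gain=0.5, tolerance=0.1, max_steps=50):
    """Move the lens until the phase difference is within tolerance;
    returns the number of correction steps used."""
    for step in range(max_steps):
        pd = measure_phase_difference()
        if abs(pd) <= tolerance:
            return step            # in focus
        move_lens(-gain * pd)      # the sign of pd gives the direction
    return max_steps

# Toy optical system: phase difference proportional to the lens error.
state = {"lens": 4.0}              # lens starts 4.0 units from best focus
steps = drive_to_focus(
    measure_phase_difference=lambda: state["lens"],
    move_lens=lambda d: state.__setitem__("lens", state["lens"] + d),
)
# The error halves each step until it is within the tolerance band.
```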
- In still another aspect, an auto-focus image sensor may include: a substrate comprising: a first auto-focus (AF) pixel and a second AF pixel that are used for detecting a phase difference and are adjacent to each other; and at least one image pixel used for detecting an image; a deep device isolation portion isolating the first AF pixel, the second AF pixel, and the image pixel from each other; and a light shielding pattern disposed on at least the first and second AF pixels, the light shielding pattern having a first opening and a second opening that partially expose the first AF pixel and the second AF pixel, respectively. The first opening and the second opening may be disposed to be symmetric.
- In still another aspect, an auto-focus image sensor may include: a substrate including at least one first pixel configured to detect a phase difference and at least one second pixel configured to detect an image; an isolation portion configured to isolate the at least one first pixel from the at least one second pixel; and a light shield on the substrate and between the at least one first pixel and incident light.
- In some example embodiments, an amount of light incident on the at least one first pixel may be smaller than an amount of light incident on the at least one second pixel.
- In some example embodiments, the deep device isolation portion of the auto-focus image sensor may include a mesh structure and may surround the at least one first pixel and the at least one second pixel in at least two directions.
- In some example embodiments, the substrate of the auto-focus image sensor may include a first surface on which a gate electrode is disposed and a second surface opposite to the first surface and wherein the deep device isolation portion is adjacent to at least the second surface, the auto-focus image sensor further comprising: a shallow device isolation portion in the substrate to define an active region, the shallow device isolation portion adjacent to the first surface; and a color filter and a micro-lens on the first surface or the second surface.
- In some example embodiments, a digital image processing device may comprise an auto-focus image sensor, an optical system configured to input light into the auto-focus image sensor; and a focus controller configured to control a focus of the optical system using the phase difference detected from the at least one first pixel.
- The foregoing and other features of inventive concepts will be apparent from the more particular description of non-limiting embodiments of inventive concepts, as illustrated in the accompanying drawings in which like reference characters refer to like parts throughout the different views. The drawings are not necessarily to scale, emphasis instead being placed upon illustrating principles of inventive concepts. In the drawings:
-
FIG. 1 is a schematic block diagram illustrating a digital image processing device according to some example embodiments of the inventive concepts;
FIG. 2 is a diagram illustrating a principle of phase-difference auto-focus (AF) using the auto-focus image sensor of FIG. 1;
FIG. 3A is a graph illustrating phases of output values of auto-focus (AF) pixels when a photographing lens is out of focus;
FIG. 3B is a graph illustrating phases of output values of auto-focus (AF) pixels when a photographing lens is in focus;
FIG. 4 is a circuit diagram of an auto-focus image sensor according to some example embodiments of the inventive concepts;
FIG. 5A is a layout illustrating a portion of a pixel region of an auto-focus image sensor according to some example embodiments of the inventive concepts;
FIGS. 5B and 5C are layouts illustrating a portion of a pixel region of an auto-focus image sensor according to other example embodiments of the inventive concepts;
FIG. 6 is an upper layout of an auto-focus image sensor according to some example embodiments of the inventive concepts;
FIG. 7 is a lower layout of the auto-focus image sensor of FIG. 6;
FIG. 8 is a cross-sectional view taken along lines A-A′ and B-B′ of FIG. 6 or 7;
FIGS. 9A, 10A, 11A, 12A, and 13A are plan views illustrating a method of fabricating an auto-focus image sensor having the upper layout of FIG. 6;
FIGS. 14A and 15A are plan views illustrating a method of fabricating an auto-focus image sensor having the lower layout of FIG. 7;
FIGS. 9B, 10B, 11B, 12B, 13B, 14B, and 15B are cross-sectional views illustrating a method of fabricating an auto-focus image sensor having the cross-sectional view of FIG. 8;
FIG. 16 is a cross-sectional view taken along a line A-A′ of FIG. 6 or 7 to illustrate an auto-focus image sensor according to other example embodiments of the inventive concepts;
FIG. 17 is a cross-sectional view taken along a line A-A′ of FIG. 6 or 7 to illustrate an auto-focus image sensor according to still other example embodiments of the inventive concepts;
FIG. 18A is a cross-sectional view taken along a line A-A′ of FIG. 6 or 7 to illustrate an auto-focus image sensor according to yet other example embodiments of the inventive concepts;
FIG. 18B is a cross-sectional view illustrating a method of fabricating the auto-focus image sensor of FIG. 18A;
FIGS. 19A and 19B are cross-sectional views taken along a line A-A′ of FIG. 6 or 7 to illustrate an auto-focus image sensor according to yet still other example embodiments of the inventive concepts;
FIG. 20 is a cross-sectional view taken along a line A-A′ of FIG. 6 or 7 to illustrate an auto-focus image sensor according to yet still other example embodiments of the inventive concepts;
FIG. 21 is a cross-sectional view taken along a line A-A′ of FIG. 6 or 7 to illustrate an auto-focus image sensor according to yet still other example embodiments of the inventive concepts;
FIGS. 22 to 24 are cross-sectional views illustrating a method of fabricating the auto-focus image sensor of FIG. 21;
FIG. 25 is a lower layout of an auto-focus image sensor according to yet still other example embodiments of the inventive concepts;
FIG. 26 is a cross-sectional view taken along lines A-A′ and B-B′ of FIG. 25;
FIG. 27 is a lower layout of an auto-focus image sensor according to yet still other example embodiments of the inventive concepts;
FIG. 28 is a cross-sectional view taken along lines A-A′ and B-B′ of FIG. 27;
FIG. 29 is a layout of a first layer first signal line and a first layer third signal line in a first focus detecting region;
FIG. 30 is a layout of a second layer first signal line and a second layer second signal line in a second focus detecting region;
FIGS. 31 to 35 illustrate embodiments of a digital image processing device including an auto-focus image sensor according to example embodiments of the inventive concepts; and
FIG. 36 is a schematic block diagram illustrating an interface and an electronic system including an auto-focus image sensor according to example embodiments of the inventive concepts. - The inventive concepts will now be described more fully hereinafter with reference to the accompanying drawings, in which example embodiments of the inventive concepts are shown. The advantages and features of the inventive concepts and methods of achieving them will be apparent from the following example embodiments that will be described in more detail with reference to the accompanying drawings. It should be noted, however, that the inventive concepts are not limited to the following example embodiments, and may be implemented in various forms. Accordingly, the example embodiments are provided only to disclose the inventive concepts and to let those skilled in the art know the category of the inventive concepts. Embodiments of the inventive concepts are not limited to the specific examples provided herein. In the drawings, the thicknesses of layers and regions are exaggerated for clarity.
- The terminology used herein is for the purpose of describing particular embodiments only and is not intended to limit the invention. As used herein, the singular terms “a,” “an” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. As used herein, the term “and/or” includes any and all combinations of one or more of the associated listed items. It will be understood that when an element is referred to as being “connected” or “coupled” to another element, it may be directly connected or coupled to the other element or intervening elements may be present.
- Similarly, it will be understood that when an element such as a layer, region or substrate is referred to as being “on” another element, it can be directly on the other element or intervening elements may be present. In contrast, the term “directly” means that there are no intervening elements. It will be further understood that the terms “comprises,” “comprising,” “includes” and/or “including,” when used herein, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
- Additionally, the embodiments in the detailed description will be described with sectional views as ideal example views of the inventive concepts. Accordingly, shapes of the example views may be modified according to manufacturing techniques and/or allowable errors. Therefore, the embodiments of the inventive concepts are not limited to the specific shapes illustrated in the example views, but may include other shapes that may be created according to manufacturing processes. Areas exemplified in the drawings have general properties and are used to illustrate specific shapes of elements; they should not be construed as limiting the scope of the inventive concepts.
- It will also be understood that although the terms first, second, third, etc. may be used herein to describe various elements, these elements should not be limited by these terms. These terms are only used to distinguish one element from another element. Thus, a first element in some embodiments could be termed a second element in other embodiments without departing from the teachings of the present invention. Exemplary embodiments of aspects of the present inventive concepts explained and illustrated herein include their complementary counterparts. The same reference numerals or the same reference designators denote the same elements throughout the specification.
- Moreover, example embodiments are described herein with reference to cross-sectional illustrations and/or plane illustrations that are idealized example illustrations. Accordingly, variations from the shapes of the illustrations as a result, for example, of manufacturing techniques and/or tolerances, are to be expected. Thus, example embodiments should not be construed as limited to the shapes of regions illustrated herein but are to include deviations in shapes that result, for example, from manufacturing. For example, an etching region illustrated as a rectangle will, typically, have rounded or curved features. Thus, the regions illustrated in the figures are schematic in nature and their shapes are not intended to illustrate the actual shape of a region of a device and are not intended to limit the scope of example embodiments.
- As appreciated by the present inventive entity, devices and methods of forming devices according to various embodiments described herein may be embodied in microelectronic devices such as integrated circuits, wherein a plurality of devices according to various embodiments described herein are integrated in the same microelectronic device. Accordingly, the cross-sectional view(s) illustrated herein may be replicated in two different directions, which need not be orthogonal, in the microelectronic device. Thus, a plan view of the microelectronic device that embodies devices according to various embodiments described herein may include a plurality of the devices in an array and/or in a two-dimensional pattern that is based on the functionality of the microelectronic device.
- The devices according to various embodiments described herein may be interspersed among other devices depending on the functionality of the microelectronic device. Moreover, microelectronic devices according to various embodiments described herein may be replicated in a third direction that may be orthogonal to the two different directions, to provide three-dimensional integrated circuits.
- Accordingly, the cross-sectional view(s) illustrated herein provide support for a plurality of devices according to various embodiments described herein that extend along two different directions in a plan view and/or in three different directions in a perspective view. For example, when a single active region is illustrated in a cross-sectional view of a device/structure, the device/structure may include a plurality of active regions and transistor structures (or memory cell structures, gate structures, etc., as appropriate to the case) thereon, as would be illustrated by a plan view of the device/structure.
FIG. 1 is a schematic block diagram illustrating a digital image processing device according to some example embodiments of the inventive concepts. - Referring to
FIG. 1, a digital image processing device 100 may be separable from a lens. However, the inventive concepts are not limited thereto; an auto-focus image sensor 108 according to some example embodiments and the lens may constitute one integrated body. In addition, since the auto-focus image sensor 108 according to the inventive concepts is used, the digital image processing device 100 can perform both a phase-difference auto-focus (AF) process and a contrast AF process. - The digital
image processing device 100 includes a photographing lens 101 including a focus lens 102. The digital image processing device 100 may have a focus detecting function and may drive the focus lens 102. The photographing lens 101 further includes a lens driver 103 driving the focus lens 102, a lens position detector 104 detecting a position of the focus lens 102, and a lens controller 105 controlling the focus lens 102. The lens controller 105 exchanges data relative to focus detection with a central processing unit (CPU) 106 of the digital image processing device 100. - The digital
image processing device 100 includes the auto-focus image sensor 108. Thus, the digital image processing device 100 may photograph subject light inputted through the photographing lens 101 to generate an image signal. The auto-focus image sensor 108 may include a plurality of photoelectric converters (not shown) arranged in a matrix form and transmission lines (not shown) through which charge moves from the photoelectric converters to output the image signal. - A
sensor controller 107 generates a timing signal, so the auto-focus image sensor 108 is controlled to photograph an image. In addition, the sensor controller 107 sequentially outputs image signals when charge accumulation is completed in each scanning line. - The outputted signals pass through an analog
signal processing part 109 and are then converted into digital signals in an analog/digital (A/D) converter 110. The digital signals are inputted into an image input controller 111 and are then processed. - An auto-white balance (AWB) operation, an auto-exposure (AE) operation, and an AF operation are performed on a digital image signal inputted to the
image input controller 111 in an AWB detecting part 116, an AE detecting part 117, and an AF detecting part 118, respectively. In some example embodiments, the AF detecting part 118 outputs a detection value with respect to a contrast value during the contrast AF process and outputs pixel information to the CPU 106 during the phase difference AF process, so the CPU 106 performs a phase difference operation. The phase difference operation of the CPU 106 may be obtained by performing a correlation operation on a plurality of pixel column signals. A position or a direction of a focus may be obtained from the result of the phase difference operation. - An image signal is stored in a synchronous dynamic random access memory (SDRAM) 119 that is a temporary memory. A
digital signal processor 112 performs one or more image signal processes (e.g., gamma correction) to create a displayable live view or a captured image. A compressor-expander 113 may compress the image signal in a compressed form (e.g., JPEG or H.264) or may expand the image signal when it is reproduced. An image file including the image signal compressed in the compressor-expander 113 is transmitted through a media controller 121 to be stored in a memory card 122. - In example embodiments, a nonvolatile memory may be embodied to include a three-dimensional (3D) memory array. The 3D memory array may be monolithically formed on a substrate (e.g., a semiconductor substrate such as silicon, or a semiconductor-on-insulator substrate). The 3D memory array may include two or more physical levels of memory cells having an active area disposed above the substrate and circuitry associated with the operation of those memory cells, whether such associated circuitry is above or within such substrate. The layers of each level of the array may be directly deposited on the layers of each underlying level of the array.
- In example embodiments, the 3D memory array may include vertical NAND strings that are vertically oriented such that at least one memory cell is located over another memory cell. The at least one memory cell may comprise a charge trap layer.
- The following patent documents, which are hereby incorporated by reference in their entirety, describe suitable configurations for three-dimensional memory arrays, in which the three-dimensional memory array is configured as a plurality of levels, with word lines and/or bit lines shared between levels: U.S. Pat. Nos. 7,679,133; 8,553,466; 8,654,587; 8,559,235; and US Pat. Pub. No. 2011/0233648.
- Display image information is stored in a video random access memory (VRAM) 120, and the image is displayed on a liquid crystal display (LCD) 115 through a
video encoder 114. The CPU 106, used as a controller, may control operations of each part. An electrically erasable programmable read-only memory (EEPROM) 123 may store and maintain information for correcting pixel defects of the auto-focus image sensor 108 or adjustment information. An operating part 124 receives various commands from a user to operate the digital image processing device 100. The operating part 124 may include various buttons such as a shutter-release button (not shown), a main button (not shown), a mode dial (not shown), and/or a menu button (not shown). - When a structure is hardware, such existing hardware may include one or more Central Processing Units (CPUs), digital signal processors (DSPs), application-specific integrated circuits (ASICs), field programmable gate arrays (FPGAs), computers, or the like configured as special-purpose machines to perform the functions of the module. As stated above, CPUs, DSPs, ASICs and FPGAs may generally be referred to as processing devices.
- In the event a structure is or includes a processor executing software, the processor is configured as a special purpose machine to execute the software, stored in a storage medium, to perform the functions of the structure.
FIG. 2 is a diagram illustrating a principle of phase-difference auto-focus (AF) using the auto-focus image sensor of FIG. 1. - Referring to the phase-difference AF principle diagram of
FIG. 2, light (or incident light) of an object that has passed through the photographing lens 101 passes through a micro-lens array 14 so as to be introduced to a first AF pixel R and a second AF pixel L. Masks or openings limiting the light inputted from pupils 12 and 13 of the photographing lens 101 may be adjacent to portions of the first and second AF pixels R and L. The light inputted from the pupil 12 disposed above a light axis of the photographing lens 101 is induced to the second AF pixel L, and the light inputted from the pupil 13 disposed under the light axis of the photographing lens 101 is induced to the first AF pixel R. “Pupil segmentation” means that the first AF pixel R and the second AF pixel L receive light, which is reversely projected at the positions of the pupils 12 and 13, through the micro-lens array 14 and the masks or openings. - Continuous pupil-segmented pixel outputs of the first and second AF pixels R and L according to positions of the first and second AF pixels R and L are illustrated in
FIGS. 3A and 3B. In each of FIGS. 3A and 3B, a horizontal axis represents a position of each of the first and second AF pixels R and L, and a vertical axis represents an output value of each of the first and second AF pixels R and L. A shape of the continuous output value of the first AF pixel R is the same as that of the second AF pixel L. However, positions (e.g., phases) of the output values of the first and second AF pixels R and L may be different from each other. This is because positions of image formation of the light provided from the eccentric pupils 12 and 13 of the photographing lens 101 are different from each other. Thus, if the photographing lens 101 is out of focus, the phases of the output values of the first and second AF pixels are not co-located, as illustrated in FIG. 3A. If the photographing lens 101 is in focus, the image is formed at the same position, as illustrated in FIG. 3B. In addition, a direction of a focus difference may be determined from this. A front-focusing state means that the photographing lens 101 focuses in front of the object. In the front-focusing state, the phase of the output value of the first AF pixel R is left-shifted from a phase of a focused state, and the phase of the output value of the second AF pixel L is right-shifted from the phase of the focused state. On the other hand, a back-focusing state means that the photographing lens 101 focuses behind the object. In the back-focusing state, the phase of the output value of the first AF pixel R is right-shifted from the phase of the focused state, and the phase of the output value of the second AF pixel L is left-shifted from the phase of the focused state. A shift amount between the phases of the output values of the first and second AF pixels R and L may be converted into a deviation amount between focuses. -
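The shift-and-direction logic described above can be sketched as follows. The sum-of-absolute-differences (SAD) matching criterion, the sample values, and the sign convention (position of the R output minus that of the L output) are illustrative assumptions, not the document's prescribed method.

```python
# Sketch of estimating the phase shift between R and L pixel-column outputs
# and reading the defocus direction from its sign. SAD matching and the
# sign convention are assumptions for this sketch.

def relative_shift(ref, sig, max_shift=4):
    """Return s minimizing SAD between sig[i] and ref[i - s]; i.e., sig
    looks like ref displaced to the right by s samples."""
    best_s, best_sad = 0, float("inf")
    n = len(ref)
    for s in range(-max_shift, max_shift + 1):
        pairs = [(sig[i], ref[i - s]) for i in range(n) if 0 <= i - s < n]
        sad = sum(abs(a - b) for a, b in pairs) / len(pairs)  # normalize overlap
        if sad < best_sad:
            best_s, best_sad = s, sad
    return best_s

def defocus_state(r_minus_l):
    """Per the text: R output left-shifted relative to L -> front-focusing."""
    if r_minus_l == 0:
        return "in focus"
    return "front-focusing" if r_minus_l < 0 else "back-focusing"

r_out = [0, 1, 4, 9, 4, 1, 0, 0, 0, 0]   # toy R column output, peak at index 3
l_out = [0, 0, 0, 1, 4, 9, 4, 1, 0, 0]   # toy L column output, peak at index 5
shift = -relative_shift(r_out, l_out)     # R position minus L position
state = defocus_state(shift)
```

In this toy example the R output sits two samples to the left of the L output, so the sketch reports a front-focusing state; the magnitude of the shift could then be converted into a focus deviation amount.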
FIG. 4 is a circuit diagram of an auto-focus image sensor, for example, auto-focus image sensor 108, according to some example embodiments of the inventive concepts. - Referring to
FIG. 4, each of unit pixels UP1, UP2, UP3, and UP4 of the auto-focus image sensor may include a photoelectric converter region PD, a transfer transistor Tx, a source follower transistor Sx, a reset transistor Rx, and a selection transistor Ax. In some example embodiments, four unit pixels adjacent to each other will be described as an example for ease and convenience of explanation. However, the inventive concepts are not limited to this number of unit pixels; the auto-focus image sensor 108 may include five or more unit pixels. At least two unit pixels adjacent to each other among the unit pixels UP1, UP2, UP3, and UP4 may be AF pixels that are used to detect a phase difference, and the others of the unit pixels UP1, UP2, UP3, and UP4 may be image pixels that are used to detect an image. - The transfer transistor Tx, the source follower transistor Sx, the reset transistor Rx, and the selection transistor Ax may include a transfer gate TG, a source follower gate SF, a reset gate RG, and a selection gate SEL, respectively. A photoelectric converter is provided in the photoelectric converter region PD. The photoelectric converter PD may be a photodiode including an N-type dopant region and a P-type dopant region. A drain of the transfer transistor Tx may be a floating diffusion region FD. The floating diffusion region FD may also be a source of the reset transistor Rx. The floating diffusion region FD may be electrically connected to the source follower gate SF of the source follower transistor Sx. The source follower transistor Sx is connected to the selection transistor Ax.
- The transfer gates TG of first and second unit pixels UP1 and UP2 adjacent to each other in a first direction D1 may be electrically connected to a first transfer gate line TGL1. The transfer gates TG of third and fourth unit pixels UP3 and UP4 adjacent to each other in the first direction D1 may be electrically connected to a second transfer gate line TGL2. Likewise, the reset gates RG of the first and second unit pixels UP1 and UP2 may be electrically connected to a first reset gate line RGL1, and the reset gates RG of the third and fourth unit pixels UP3 and UP4 may be electrically connected to a second reset gate line RGL2. The selection gates SEL of the first and second unit pixels UP1 and UP2 may be electrically connected to a first selection gate line SELL1, and the selection gates SEL of the third and fourth unit pixels UP3 and UP4 may be electrically connected to a second selection gate line SELL2.
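The shared row wiring just described can be sketched as a minimal row decoder; the function name and the 0-indexed row argument are assumptions for illustration only.

```python
# Minimal sketch of the shared row wiring described above: pixels in the same
# row share one transfer gate line (TGL), reset gate line (RGL), and selection
# gate line (SELL). Function name and 0-indexed rows are assumptions.

def row_control_lines(row):
    """Return the control-line names shared by all pixels in a given row."""
    n = row + 1  # lines are numbered from 1 in the text (TGL1, TGL2, ...)
    return {"transfer": f"TGL{n}", "reset": f"RGL{n}", "select": f"SELL{n}"}

row0 = row_control_lines(0)  # lines shared by UP1 and UP2
row1 = row_control_lines(1)  # lines shared by UP3 and UP4
```

Because one set of lines drives every pixel in a row, asserting TGL1/RGL1/SELL1 operates UP1 and UP2 together, as the text describes.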
- The reset transistor Rx, the source follower transistor Sx, and the selection transistor Ax may be shared by neighboring pixels, thereby improving an integration degree of the auto-
focus image sensor 108. - A method of operating the auto-focus image sensor will be described with reference to FIG. 4. For example, a power voltage Vdd is applied to the drains of the reset transistors Rx and the drains of the source follower transistors Sx of the first and second unit pixels UP1 and UP2 in a dark state, thereby discharging charge remaining in the floating diffusion regions FD. Thereafter, the reset transistors Rx are turned off and light is inputted from an external system to the photoelectric converter regions PD to generate electron-hole pairs in the photoelectric converter regions PD. Holes are moved into and then accumulated in the P-type dopant regions, and electrons are moved into and then accumulated in the N-type dopant regions. If the transfer transistors Tx are turned on, the electrons are transferred into and then accumulated in the floating diffusion regions FD. Gate biases of the source follower transistors Sx are changed in proportion to the amounts of the electrons accumulated in the floating diffusion regions FD, so source potentials of the source follower transistors Sx are changed. At this time, if the selection transistors Ax are turned on, signals generated by the electrons are read through signal sensing lines Vout. Next, the processes described above may be performed on the third and fourth unit pixels UP3 and UP4. - If the first and second unit pixels UP1 and UP2 are the AF pixels and the third and fourth unit pixels UP3 and UP4 are the image pixels, output values like
FIG. 3A are obtained from the AF pixels corresponding to the first and second unit pixels UP1 and UP2, and the photographing lens 101 of FIG. 1 then focuses using the obtained output values. Whether the photographing lens 101 is in focus may be confirmed by checking whether output values like FIG. 3B are outputted from the AF pixels. If the digital image processing device 100 of FIG. 1 is a digital camera, a shutter may be pressed after the photographing lens 101 focuses, thereby obtaining an image from output values received from the image pixels such as the third and fourth unit pixels UP3 and UP4. As a result, a clean image may be obtained. -
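The reset–integrate–transfer–read sequence described above can be modeled with a toy sketch. The charge quantities and the source-follower gain below are illustrative assumptions, not device values.

```python
# Toy model of the readout sequence of one 4T pixel as described above:
# reset the floating diffusion, integrate charge in the photodiode, transfer
# it through the transfer gate, and read a source-follower output.

class Pixel4T:
    def __init__(self, sf_gain=0.5):
        self.pd = 0            # electrons in the photodiode (PD)
        self.fd = 0            # electrons on the floating diffusion (FD)
        self.sf_gain = sf_gain # assumed conversion gain (arbitrary units)

    def reset(self):
        self.fd = 0            # reset transistor drains residual FD charge

    def integrate(self, electrons):
        self.pd += electrons   # photo-generated electrons accumulate in the PD

    def transfer(self):
        self.fd, self.pd = self.pd, 0  # transfer gate moves charge PD -> FD

    def read(self):
        # Source-follower output follows the FD charge when the row is selected.
        return self.fd * self.sf_gain

px = Pixel4T()
px.reset()
px.integrate(1000)   # illuminate: 1000 toy electrons
px.transfer()
signal = px.read()   # value sensed on the Vout line in this toy model
```

The same sequence would then be repeated for the next row (e.g., UP3 and UP4), mirroring the row-by-row operation in the text.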
FIG. 5A is a layout illustrating a portion of a pixel region of an auto-focus image sensor according to some example embodiments of the inventive concepts. - Referring to
FIG. 5A, an auto-focus image sensor according to some example embodiments may include first and second focus detecting regions 32 and 33 and image detecting regions 30. The first focus detecting region 32 may extend in a first direction D1, and the second focus detecting region 33 may extend in a second direction D2 intersecting the first direction D1. The first focus detecting region 32 may include a first AF pixel 20R and a second AF pixel 20L that are adjacent to each other and are used to detect a phase difference. In some example embodiments, the first focus detecting region 32 may include a plurality of first AF pixels 20R and a plurality of second AF pixels 20L that are alternately arranged along the first direction D1. The second focus detecting region 33 may include a third AF pixel 20D and a fourth AF pixel 20U that are adjacent to each other and are used to detect a phase difference. In some example embodiments, the second focus detecting region 33 may include a plurality of third AF pixels 20D and a plurality of fourth AF pixels 20U that are alternately arranged along the second direction D2. The image detecting region 30 may include image pixels 21. In some example embodiments, the first and second focus detecting regions 32 and 33 may cross each other; because FIG. 5A illustrates only a portion of a pixel region, a cross point of the first and second focus detecting regions 32 and 33 is not shown.
focus detecting regions image detecting regions 30. The color filter array may be a Bayer pattern array consisting of red (R), green (G), and blue (B) or may adopt a complementary color system (e.g., a system using magenta, green, cyan, and yellow). Color filters disposed on theAF pixels AF pixels micro-lens array 35 is disposed on the color filter array. - A light shielding pattern that controls light-receiving amounts of at least the
AF pixels AF pixels first openings 332. The light shielding pattern may further includesecond openings 330 disposed on theimage pixels 21. An area of each of thefirst openings 332 may be smaller than that of each of thesecond openings 330. For example, the area of thefirst opening 332 may be about 50% of the area of thesecond opening 330. Thefirst opening 332 may be disposed to be one-sided from a light axis along which light is inputted. Thefirst openings 332 of the first andsecond AF pixels first openings 332 of the third andfourth AF pixels first opening 332 of the light shielding pattern may reduce the amount of light incident on each of theAF pixels image pixel 21. In other words, the amount of the light incident on eachAF pixel image pixel 21 due to the light shielding pattern. -
FIGS. 5B and 5C are layouts illustrating a portion of a pixel region of an auto-focus image sensor according to other example embodiments of the inventive concepts. - Referring to
FIG. 5B, in an auto-focus image sensor according to some example embodiments, only green color filters G may be disposed on the first and second focus detecting regions 32 and 33. - Referring to
FIG. 5C, in an auto-focus image sensor according to some example embodiments, color filters W disposed on the first and second focus detecting regions 32 and 33 may be white (or transparent) filters, so the AF pixels of the focus detecting regions 32 and 33 may receive more light than AF pixels covered with color filters of specific colors.
AF pixels focus detecting regions -
FIG. 6 is an upper layout of an auto-focus image sensor according to some example embodiments of the inventive concepts. FIG. 7 is a lower layout of the auto-focus image sensor of FIG. 6. FIG. 8 is a cross-sectional view taken along lines A-A′ and B-B′ of FIG. 6 or 7. An auto-focus image sensor according to some example embodiments may be a backside-illuminated auto-focus image sensor. - Referring to
FIGS. 6 to 8, the auto-focus image sensor according to some example embodiments includes a substrate 51 that has a first surface 51a and a second surface 51b opposite to each other. A deep device isolation layer (or a deep device isolation portion or isolation portion) 53 is disposed in the substrate 51 to separate AF pixels 20 and image pixels 21 from each other. As described with reference to FIGS. 5A to 5C, the AF pixels 20 may be disposed in the first and second focus detecting regions 32 and 33, and the image pixels 21 may be disposed in the image detecting regions 30. In some example embodiments, the deep device isolation layer 53 may penetrate the substrate 51 so as to be exposed at the first and second surfaces 51a and 51b, thereby completely separating the pixels 20 and 21 from each other. A shallow device isolation layer 55 adjacent to the first surface 51a may be disposed to define first to third active regions AR1, AR2, and AR3 that are spaced apart from each other. The shallow device isolation layer 55 is spaced apart from the second surface 51b.
pixels first dopant region 59 adjacent to thefirst surface 51 a and asecond dopant region 57 adjacent to thesecond surface 51 b. For example, thefirst dopant region 59 may be doped with P-type dopants, and thesecond dopant region 57 may be doped with N-type dopants. A transfer gate TG may be disposed on thefirst surface 51 a of the first active region AR1 with agate insulating layer 61 therebetween. A reset gate RG, a source follower gate SF, and a selection gate SEL which are spaced apart from each other may be disposed on thefirst surface 51 a of the second active region AR2. A floating diffusion region FD is disposed in the active region AR1. The floating diffusion region FD is adjacent to thefirst surface 51 a which does not overlap with the transfer gate TG. The floating diffusion region FD is spaced apart from thesecond dopant region 57. Aground region 63 may be disposed in the third active region AR3 and may be adjacent to thefirst surface 51 a. For example, the floating diffusion region FD may be doped with dopants of the same conductivity type as the dopants in thesecond dopant region 57. For example, the floating diffusion region FD may be doped with N-type dopants. Theground region 63 may be doped with dopants of the same conductivity type as the dopants in thefirst dopant region 59. For example, theground region 63 may be doped with P-type dopants. Here, a dopant concentration of theground region 63 may be higher than that of thefirst dopant region 59. - The first surface Ma of the
substrate 51 is covered with a firstinterlayer insulating layer 65. First layer first to first layer seventh contacts C11 to C17 penetrate the firstinterlayer insulating layer 65. The first layer first contact C11 contacts the transfer gate TG. The first layer second contact C12 contacts the floating diffusion region FD. The first layer third contact C13 contacts the source follower gate SF. The first layer fourth contact C14 contacts a source region (of a reset transistor) disposed at a side of the reset gate RG. The first layer fifth contact C15 contacts the reset gate RG. The first layer sixth contact C16 contacts the selection gate SEL. The first layer seventh contact C17 contacts a dopant region between the reset gate RG and the source follower gate SF. The dopant region between the reset gate RG and the source follower gate SF corresponds to the drain of the reset transistor Rx and the drain of the source follower transistor Sx. - First layer first to first layer fifth signal lines L11 to L15 are disposed on the first
interlayer insulating layer 65. The signal lines L11 to L15 may correspond to interconnections. The first layer first signal line L11 contacts the first layer first contact C11, so a voltage may be applied to the transfer gate TG through the first layer first signal line L11. The first layer second signal line L12 contacts the first layer second to first layer fourth contacts C12 to C14 at the same time so as to electrically connect the floating diffusion region FD, the source region of the reset transistor, and the source follower gate SF to each other. The first layer third signal line L13 contacts the first layer fifth contact C15, so a voltage may be applied to the reset gate RG through the first layer third signal line L13. The first layer fourth signal line L14 contacts the first layer sixth contact C16, so a voltage may be applied to the selection gate SEL through the first layer fourth signal line L14. The first layer fifth signal line L15 contacts the first layer seventh contact C17, so the power voltage Vdd may be applied to the drains of the reset transistor and the source follower transistor through the first layer fifth signal line L15. - A second
interlayer insulating layer 67 covers the firstinterlayer insulating layer 65 and the first layer first to first layer fifth signal lines L11 to L15. Second layer first and second layer second contacts C21 and C22 penetrate the second and firstinterlayer insulating layers ground region 63. The second layer second contact C22 contacts a source (of the selection transistor) that is disposed at a side of the selection gate SEL. - A second layer first signal line L21 and a second layer second signal line L22 are disposed on the second
interlayer insulating layer 67. The signal lines L21 and L22 may correspond to interconnections. The second layer first signal line L21 contacts the second layer first contact C21 so as to apply a ground voltage to the ground region 63. The second layer second signal line L22 contacts the second layer second contact C22. The second layer second signal line L22 may correspond to the signal sensing line Vout of FIG. 4. - A third
interlayer insulating layer 69 may cover the second interlayer insulating layer 67 and the second layer first and second signal lines L21 and L22. The third interlayer insulating layer 69 may be covered with a first passivation layer 71. - A fixed
charge layer 73 may be disposed on the second surface 51 b of the substrate 51. The fixed charge layer 73 may be formed of a metal oxide or metal fluoride having oxygen or fluorine of which a content ratio is lower than its stoichiometric ratio. Thus, the fixed charge layer 73 may have negative fixed charge. The fixed charge layer 73 may be formed of a metal oxide or metal fluoride that includes at least one selected from a group consisting of hafnium (Hf), zirconium (Zr), aluminum (Al), tantalum (Ta), titanium (Ti), yttrium (Y), and a lanthanoid. For example, the fixed charge layer 73 may be a hafnium oxide layer or an aluminum fluoride layer. Holes may be accumulated around the second surface 51 b due to the fixed charge layer 73, so occurrence of a dark current and white spots may be effectively reduced. - A first insulating
layer 75 and a second insulating layer 77 may be sequentially stacked on the fixed charge layer 73. The first insulating layer 75 may be, for example, a silicon oxide layer. The second insulating layer 77 may be, for example, a silicon nitride layer. - A light shielding pattern (or light shield) 79 may be disposed on the second insulating
layer 77. The light shielding pattern 79 may be formed of, for example, an opaque metal. The light shielding pattern 79 may be disposed in only the first and second focus detecting regions 32 and 33. The first openings 332 may be disposed in the light shielding pattern 79. - A
second passivation layer 83 may be conformally stacked on the light shielding pattern 79. A planarization layer 85 is disposed on the second passivation layer 83. A color filter array 87 may be disposed on the planarization layer 85, and a micro-lens array 35 may be disposed on the color filter array 87. - Since the auto-focus image sensor according to some example embodiments includes the deep
device isolation layer 53, crosstalk between the pixels may be reduced or prevented. - If the auto-focus image sensor is a backside-illuminated type, light (or incident light) is inputted through the
second surface 51 b of the substrate 51. As a result, the signal lines L11 to L15, L21, and L22 adjacent to the first surface 51 a may not be limited to their positions. For example, the signal lines L11 to L15, L21, and L22 may overlap with the photoelectric converter PD. - Next, a method of fabricating the auto-focus image sensor of
FIGS. 6 to 8 will be described. -
FIGS. 9A to 13A are plan views illustrating a method of fabricating an auto-focus image sensor having the upper layout of FIG. 6. FIGS. 14A and 15A are plan views illustrating a method of fabricating an auto-focus image sensor having the lower layout of FIG. 7. FIGS. 9B to 15B are cross-sectional views illustrating a method of fabricating an auto-focus image sensor having the cross-sectional view of FIG. 8. - Referring to
FIGS. 9A and 9B, a deep device isolation layer 53 is formed in a substrate 51 having first and second surfaces 51 a and 51 b opposite to each other. The deep device isolation layer 53 may be spaced apart from the second surface 51 b. In some example embodiments, the deep device isolation layer 53 may be formed of an insulating material such as silicon oxide. The deep device isolation layer 53 may be formed to have a mesh shape when viewed from a plan view. - Referring to
FIGS. 10A and 10B, ion implantation processes may be performed to form a first dopant region 59 and a second dopant region 57 in the substrate 51 of each of the pixels isolated by the deep device isolation layer 53. Thus, a photoelectric converter PD is formed in each pixel. A shallow device isolation layer 55 that is adjacent to the first surface 51 a may be formed in the substrate 51 to define active regions AR1, AR2, and AR3. A portion of the substrate 51 around the deep device isolation layer 53 may be removed to form a shallow trench, and the shallow trench may be filled with a filling insulating layer to form the shallow device isolation layer 55. - Referring to
FIGS. 11A and 11B, a transfer gate TG may be formed to intersect the first active region AR1, and a reset gate RG, a source follower gate SF, and a selection gate SEL may be formed to intersect the second active region AR2. Ion implantation processes may be performed to form a floating diffusion region FD and a ground region 63. At this time, dopant regions that are used as source/drain regions of reset, source follower, and selection transistors may be formed in the second active region AR2. Next, a first interlayer insulating layer 65 may be formed to cover the first surface 51 a. - Referring to
FIGS. 12A and 12B, first layer first to first layer seventh contacts C11 to C17 are formed to penetrate the first interlayer insulating layer 65. First layer first to first layer fifth signal lines L11 to L15 electrically connected to the contacts C11 to C17 are formed on the first interlayer insulating layer 65. A second interlayer insulating layer 67 is formed on the first interlayer insulating layer 65. - Referring to
FIGS. 13A and 13B, second layer first and second layer second contacts C21 and C22 are formed to penetrate the second and first interlayer insulating layers 67 and 65. A second layer first signal line L21 and a second layer second signal line L22 connected to the contacts C21 and C22 are formed on the second interlayer insulating layer 67. A third interlayer insulating layer 69 and a first passivation layer 71 are sequentially formed on the second interlayer insulating layer 67. - Referring to
FIGS. 14A and 14B, the substrate 51 is turned over such that the second surface 51 b faces upward. Next, a back grinding process may be performed on the second surface 51 b, so a portion, which is adjacent to the second surface 51 b, of the substrate 51 is removed to expose the deep device isolation layer 53. - Referring to
FIGS. 15A and 15B, a fixed charge layer 73 is formed on an entire portion of the second surface 51 b. First and second insulating layers 75 and 77 are formed on the fixed charge layer 73. A light shielding pattern 79 is formed on the second insulating layer 77. In some embodiments, an opaque metal layer may be stacked on an entire top surface of the second insulating layer 77, and the stacked opaque metal layer may be etched to form the light shielding pattern 79. Alternatively, the light shielding pattern 79 may be formed by a damascene process using a process of forming a mask pattern (not shown), an electroplating process, and a planarization etching process. - Subsequently, as illustrated in
FIGS. 7 and 8, the second passivation layer 83, the planarization layer 85, the color filter array 87, and the micro-lens array 35 may be sequentially formed on the light shielding pattern 79. Materials of the layers may be the same as described with reference to FIGS. 6 to 8. - In some example embodiments, the deep
device isolation layer 53 is first formed. However, the inventive concepts are not limited thereto. In other example embodiments, the order of the processes described above may be changed. For example, the shallow device isolation layer 55 may be first formed to be adjacent to the first surface 51 a, and then, the transistors and the signal lines may be formed. Subsequently, the back grinding process may be performed on the second surface 51 b. Next, a portion of the substrate 51 may be etched from the grinded second surface 51 b to form a deep trench, and the deep trench may be filled with an insulating layer to form the deep device isolation layer 53. -
FIG. 16 is a cross-sectional view taken along a line A-A′ of FIG. 6 or 7 to illustrate an auto-focus image sensor according to other example embodiments of the inventive concepts. - Referring to
FIG. 16, in an auto-focus image sensor according to some example embodiments, a deep device isolation layer 53 may include a filling insulation layer 53 a and a poly-silicon pattern 53 b disposed within the filling insulation layer 53 a. Since the poly-silicon pattern 53 b has substantially the same thermal expansion coefficient as the substrate 51 formed of silicon, it is possible to reduce a physical stress caused by a difference between thermal expansion coefficients of materials. Other elements of the auto-focus image sensor according to some example embodiments may be similar to or the same as corresponding elements of the auto-focus image sensor described with reference to FIGS. 6 to 8. -
FIG. 17 is a cross-sectional view taken along a line A-A′ of FIG. 6 or 7 to illustrate an auto-focus image sensor according to still other example embodiments of the inventive concepts. - Referring to
FIG. 17, in an auto-focus image sensor according to some example embodiments, a deep device isolation layer 53 is spaced apart from the first surface 51 a. The deep device isolation layer 53 contacts a top surface of the shallow device isolation layer 55. A method of fabricating the auto-focus image sensor according to some example embodiments will be described. After the processes described with reference to FIG. 9B, a portion of the deep device isolation layer 53 and the substrate 51 adjacent thereto may be etched at the same time to form a shallow trench. The shallow trench may be filled with a filling insulating layer to form the shallow device isolation layer 55. Other elements of the auto-focus image sensor according to some example embodiments may be similar to or the same as corresponding elements of the auto-focus image sensor described with reference to FIGS. 6 to 8. -
FIG. 18A is a cross-sectional view taken along a line A-A′ of FIG. 6 or 7 to illustrate an auto-focus image sensor according to yet other example embodiments of the inventive concepts. - Referring to
FIG. 18A, in an auto-focus image sensor according to some example embodiments, a deep device isolation layer 53 i may include a fixed charge layer 73 and a first insulating layer 75. For example, the fixed charge layer 73 may include a hafnium oxide layer. The first insulating layer 75 may include a silicon oxide layer or a silicon nitride layer. The fixed charge layer 73 is disposed on the second surface 51 b and surrounds sidewalls of the photoelectric converter PD, thereby further reducing a dark current. Other elements of the auto-focus image sensor according to some example embodiments may be similar to or the same as corresponding elements of the auto-focus image sensor described with reference to FIGS. 6 to 8. -
FIG. 18B is a cross-sectional view illustrating a method of fabricating the auto-focus image sensor of FIG. 18A. - Referring to
FIG. 18B, the deep device isolation layer 53 of the structure of FIG. 14B may be selectively removed to form a deep trench T1. Next, a fixed charge layer 73 and a first insulating layer 75 are conformally formed on an entire portion of the second surface 51 b to fill the deep trench T1. Here, the deep device isolation layer 53 of FIG. 14B may be used as a sacrificial pattern. Thus, an additional etching mask for forming the deep trench T1 is not required, and a misalignment problem may be avoided. Other fabricating processes of example embodiments may be similar to or the same as corresponding processes of example embodiments described with reference to FIGS. 9B to 15B. -
FIGS. 19A and 19B are cross-sectional views taken along a line A-A′ of FIG. 6 or 7 to illustrate an auto-focus image sensor according to yet still other example embodiments of the inventive concepts. - Referring to
FIG. 19A, in an auto-focus image sensor according to some example embodiments, a deep device isolation layer 53 m may include a fixed charge layer 73 and an air gap region AG. In the method of fabricating the auto-focus image sensor of FIG. 18B, the first insulating layer 75 may be formed by a deposition method having a poor step coverage characteristic (e.g., a physical vapor deposition (PVD) method) to form the air gap region AG. Other elements of the auto-focus image sensor according to some example embodiments may be similar to or the same as corresponding elements of the auto-focus image sensor described with reference to FIG. 18A. - Referring to
FIG. 19B, a deep device isolation layer 53 n may include a fixed charge layer 73, a first insulating layer 75, and a gap-fill assistant layer 76 in an auto-focus image sensor according to some example embodiments. The fixed charge layer 73 may be, for example, a hafnium oxide layer. The first insulating layer 75 may be, for example, a silicon oxide layer or a silicon nitride layer. The gap-fill assistant layer 76 may be, for example, a hafnium oxide layer. -
FIG. 20 is a cross-sectional view taken along a line A-A′ of FIG. 6 or 7 to illustrate an auto-focus image sensor according to yet still other example embodiments of the inventive concepts. - Referring to
FIG. 20, a deep device isolation layer 53 j may include a filling insulation layer 53 a, a poly-silicon pattern 53 b, a fixed charge layer 73, and a first insulating layer 75 in an auto-focus image sensor according to some example embodiments. The fixed charge layer 73 may be in contact with both the filling insulation layer 53 a and the poly-silicon pattern 53 b. - The auto-focus image sensor of
FIG. 20 may be fabricated by combining the method of fabricating the auto-focus image sensor of FIG. 16 with the method of fabricating the auto-focus image sensor of FIG. 18A. In other words, an initial deep device isolation layer 53 may be formed to include the filling insulation layer 53 a and the poly-silicon pattern 53 b, and a portion of the initial deep device isolation layer 53 may remain when the deep trench T1 is formed. Subsequently, the fixed charge layer 73 and the first insulating layer 75 may be formed in the deep trench T1, thereby forming the deep device isolation layer 53 j. Other fabricating processes of some example embodiments may be similar to or the same as corresponding processes of example embodiments described with reference to FIGS. 9B to 15B. -
FIG. 21 is a cross-sectional view taken along a line A-A′ of FIG. 6 or 7 to illustrate an auto-focus image sensor according to yet still other embodiments of the inventive concepts. - Referring to
FIG. 21, a deep device isolation layer 53 k may include a first sub-deep device isolation layer 53 c and a second sub-deep device isolation layer 53 d in an auto-focus image sensor according to some example embodiments. The sub-deep device isolation layers 53 c and 53 d may include at least one of a silicon oxide layer, a poly-silicon layer, or a fixed charge layer. -
FIGS. 22 to 24 are cross-sectional views illustrating a method of fabricating the auto-focus image sensor of FIG. 21. - Referring to
FIG. 22, a first sub-deep device isolation layer 53 c is formed in a substrate 51. The first sub-deep device isolation layer 53 c is adjacent to the first surface 51 a and is spaced apart from the second surface 51 b. A shallow device isolation layer 55 is formed in the substrate 51. At this time, the shallow device isolation layer 55 may be formed to be shallower than the first sub-deep device isolation layer 53 c. - Referring to
FIG. 23, transistors, signal lines L11 to L15, L21, and L22, interlayer insulating layers 65, 67, and 69, and a first passivation layer 71 may be formed on the first surface 51 a, and the substrate 51 may be then overturned. Subsequently, a back grinding process may be performed on the second surface 51 b. At this time, the first sub-deep device isolation layer 53 c is not exposed. - Referring to
FIG. 24, a portion of the substrate 51 adjacent to the second surface 51 b may be etched to form a deep trench T2 exposing the first sub-deep device isolation layer 53 c. At this time, a depth of the deep trench T2 may be shallower than a depth of the deep trench T1 of FIG. 18B. - Referring again to
FIG. 21, the second sub-deep device isolation layer 53 d may be subsequently formed to fill the deep trench T2. Other fabricating processes of some example embodiments may be similar to or the same as corresponding processes of example embodiments described with reference to FIGS. 9B to 15B. - In the fabricating method according to some example embodiments, the
substrate 51 may be etched from the first surface 51 a by a desired (or alternatively, predetermined) depth and may be then etched from the second surface 51 b by a desired (or alternatively, predetermined) depth to form the deep device isolation layer 53 k. Thus, an etching depth of each of the etching processes for the formation of the deep device isolation layer 53 k having a desired depth may be reduced, reducing the burden of the etching processes. In addition, a depth of each of the trenches for the formation of the deep device isolation layer 53 k may be reduced to improve a gap-fill characteristic. -
FIG. 25 is a lower layout of an auto-focus image sensor according to yet still other example embodiments of the inventive concepts. FIG. 26 is a cross-sectional view taken along lines A-A′ and B-B′ of FIG. 25. - Referring to
FIGS. 25 and 26, a light shielding pattern 79 may extend into an image detecting region 30 in an auto-focus image sensor according to some example embodiments. The light shielding pattern 79 may overlap with the deep device isolation layer 53 and may have a mesh shape. The light shielding pattern 79 may further include second openings 330 exposing the image pixels. The light shielding pattern 79 may reduce or prevent crosstalk in the image detecting region 30. - The
light shielding pattern 79 may be connected to a ground voltage or a reference voltage, so that the auto-focus image sensor may operate more stably. -
FIG. 27 is a lower layout of an auto-focus image sensor according to yet still other example embodiments of the inventive concepts. FIG. 28 is a cross-sectional view taken along lines A-A′ and B-B′ of FIG. 27. - Referring to
FIGS. 27 and 28, a deep device isolation layer 53 is disposed in a substrate 51 having first and second surfaces 51 a and 51 b opposite to each other to isolate pixels from each other. A shallow device isolation layer 55 is disposed from the first surface 51 a into the substrate 51. The shallow device isolation layer 55 defines active regions AR4 and AR5. A transfer gate TG, a reset gate RG, a source follower gate SF, and a selection gate SEL are disposed on the first surface 51 a. A photoelectric converter PD is disposed at a side of the transfer gate TG, and a floating diffusion region FD is disposed at another side of the transfer gate TG. Contacts C11 to C17, C21, and C22, signal lines L11 to L15, L21, and L22, interlayer insulating layers 65, 67, and 69, a color filter array 87, and a micro-lens array 35 are disposed on the first surface 51 a. A fixed charge layer 73, a first insulating layer 75, and a second insulating layer 77 may be sequentially stacked on the second surface 51 b. The second insulating layer 77 may act as a passivation layer. - In some example embodiments, the signal lines L11 to L15, L21, and L22 may not overlap with the photoelectric converter PD in an
image detecting region 30 if possible. Thus, a path of light incident on the photoelectric converter PD may not be blocked. However, some signal lines L11 a and L22 a may perform both a signal transmission function and a light shielding function in the focus detecting regions 32 and 33. -
FIG. 29 is a layout of a first layer first signal line and a first layer third signal line in a first focus detecting region. - Referring to
FIGS. 27, 28, and 29, in the first focus detecting region 32, a first layer first signal line L11 a may include a first protrusion L11 b that protrudes to overlap with the photoelectric converter PD of the AF pixel. Thus, the first layer first signal line L11 a and the first layer third signal line L13 adjacent thereto may provide shapes similar to the first openings 332 of FIGS. 5A to 5C in the first focus detecting region 32. The first layer first signal line L11 a, which also performs the light shielding function, may be disposed at the same height as the first layer second to fifth signal lines L12 to L15 from the first surface 51 a. -
FIG. 30 is a layout of a second layer first signal line and a second layer second signal line in a second focus detecting region. - Referring to
FIGS. 27, 28, and 30, in the second focus detecting region 33, a second layer second signal line L22 a may include a second protrusion L22 b that protrudes to overlap with the photoelectric converter PD of the AF pixel. Thus, the second layer second signal line L22 a and the second layer first signal line L21 adjacent thereto may provide shapes similar to the first openings 332 of FIGS. 5A to 5C in the second focus detecting region 33. The second layer second signal line L22 a, which also performs the light shielding function, may be disposed at the same height as the second layer first signal line L21 from the first surface 51 a. - Other elements
FIGS. 6 to 15B. -
FIGS. 31 to 35 illustrate example embodiments of a digital image processing device including an auto-focus image sensor according to example embodiments of the inventive concepts. For example, the digital image processing device according to example embodiments of the inventive concepts may be applied to a mobile or smart phone 2000 illustrated in FIG. 31 and/or a tablet or smart tablet 3000 illustrated in FIG. 32. In addition, the digital image processing device according to example embodiments of the inventive concepts may be applied to a notebook computer 4000 of FIG. 33 and/or a television or smart television 5000 of FIG. 34. Furthermore, the digital image processing device according to example embodiments of the inventive concepts may be applied to a digital camera or digital camcorder 6000 of FIG. 35. -
FIG. 36 is a schematic block diagram of an interface and an electronic system including an auto-focus image sensor according to example embodiments of the inventive concepts. - Referring to
FIG. 36, an electronic system 1000 may be realized as a data processing device capable of using or supporting mobile industry processor interface (MIPI), e.g., a portable phone, a personal digital assistant (PDA), a portable multimedia player (PMP), or a smart phone. - The
electronic system 1000 may include an application processor 1010, an image sensor 1040, and a display 1050. The image sensor 1040 may be one of the auto-focus image sensors according to example embodiments of the inventive concepts. - A
CSI host 1012 realized in the application processor 1010 may serially communicate with a CSI device 1041 of the image sensor 1040 through a camera serial interface (CSI). For example, an optical de-serializer may be realized in the CSI host 1012, and an optical serializer may be realized in the CSI device 1041. - A
DSI host 1011 realized in the application processor 1010 may serially communicate with a DSI device 1051 of the display 1050 through a display serial interface (DSI). For example, an optical serializer may be realized in the DSI host 1011, and an optical de-serializer may be realized in the DSI device 1051. - The
electronic system 1000 may further include a radio frequency (RF) chip 1060 capable of communicating with the application processor 1010. A PHY 1013 of the electronic system 1000 may exchange data with a PHY 1061 of the RF chip 1060 according to MIPI DigRF. - The
electronic system 1000 may further include a global positioning system (GPS) 1020, a storage 1070, a microphone 1080, a DRAM 1085, and a speaker 1090. The electronic system 1000 may communicate using Wimax 1030, WLAN 1100, and UWB 1110. - In the auto-focus image sensor according to example embodiments of the inventive concepts, the pixels are isolated from each other by the deep device isolation portion, so the crosstalk between neighboring pixels may be reduced or prevented. In addition, the sensor includes the fixed charge layer being in contact with at least one surface of the substrate. Holes may be accumulated around the fixed charge layer, and thus, the occurrence of the dark current and the white spots may be effectively reduced.
- Moreover, the poly-silicon pattern may be disposed within the deep device isolation portion. Since the poly-silicon pattern has substantially the same thermal expansion coefficient as the substrate formed of silicon, it is possible to reduce the physical stress caused by the difference between the thermal expansion coefficients of materials.
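The stress argument can be made concrete with a back-of-the-envelope estimate. The sketch below is purely illustrative and not part of the disclosure: the material constants and temperature swing are rough textbook values, and the simplified biaxial film-stress relation (stress proportional to the expansion-coefficient mismatch) stands in for a full mechanical analysis. It shows why a poly-silicon core that matches the silicon substrate contributes essentially no thermal-mismatch stress, while an all-oxide trench fill does.

```python
# Rough thermal-mismatch stress estimate (illustrative values, not from the
# disclosure): biaxial stress ~ [E/(1-nu)] * |d_alpha| * dT for a trench fill
# constrained by the silicon substrate over a temperature swing dT.

ALPHA_SI = 2.6e-6          # /K, silicon (poly-silicon is approximately equal)
ALPHA_SIO2 = 0.5e-6        # /K, thermal silicon oxide
E_OVER_1_MINUS_NU = 85e9   # Pa, rough biaxial modulus assumed for the fill
DELTA_T = 400.0            # K, e.g. cool-down after a deposition step

def mismatch_stress(alpha_fill, alpha_substrate=ALPHA_SI):
    # Stress scales with the expansion-coefficient mismatch against silicon.
    return E_OVER_1_MINUS_NU * abs(alpha_fill - alpha_substrate) * DELTA_T

stress_oxide = mismatch_stress(ALPHA_SIO2)  # oxide-only deep trench fill
stress_poly = mismatch_stress(ALPHA_SI)     # poly-silicon core: zero mismatch
```

Under these assumed numbers the oxide-only fill sees tens of MPa of mismatch stress while the matched poly-silicon core sees essentially none, which is the benefit the embodiment of FIG. 16 describes.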
- As a result, example embodiments of the inventive concepts may provide an auto-focus image sensor capable of realizing a cleaner image and a digital image processing device including the same.
While the inventive concepts have been described with reference to example embodiments, it will be apparent to those skilled in the art that various changes and modifications may be made without departing from the spirit and scope of the inventive concepts. Therefore, it should be understood that the above example embodiments are not limiting, but illustrative. Thus, the scope of the inventive concepts is to be determined by the broadest permissible interpretation of the following claims and their equivalents, and shall not be restricted or limited by the foregoing description.
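The pupil-division principle behind the quadrant openings recited in the claims can be illustrated numerically: AF pixels whose light shields expose opposite halves of the pixel sample the exit pupil from different sides, and the lateral shift between the two resulting signal profiles indicates defocus. The sketch below is not part of the disclosure; the signal data are made up, and a simple sum-of-absolute-differences search stands in for whatever correlation the actual AF logic would use.

```python
# Illustrative phase-detection sketch (not from the patent): estimate the
# lateral shift between signal profiles read from AF pixels with openings on
# opposite halves. A defocused edge shifts one profile relative to the other.

def sad(a, b):
    # Sum of absolute differences between two equal-length sequences.
    return sum(abs(x - y) for x, y in zip(a, b))

def estimate_shift(left, right, max_shift=4):
    # Slide one profile over the other and return the offset with the
    # best match, i.e. the offset s minimizing sum |left[i+s] - right[i]|.
    best_offset, best_score = 0, float("inf")
    n = len(left)
    for offset in range(-max_shift, max_shift + 1):
        lo, hi = max(0, offset), min(n, n + offset)
        score = sad(left[lo:hi], right[lo - offset:hi - offset])
        if score < best_score:
            best_offset, best_score = offset, score
    return best_offset

# Made-up intensity profile of an edge, and the same profile displaced by
# two pixels, mimicking the two half-shielded AF pixel groups out of focus.
left = [0, 0, 1, 5, 9, 9, 5, 1, 0, 0]
right = left[2:] + [0, 0]  # same profile shifted by 2 samples
```

With these inputs `estimate_shift(left, right)` recovers the 2-sample displacement; in the sensor, the sign and magnitude of such a shift between the first/second (or third/fourth) AF pixel groups would indicate the defocus direction along the corresponding axis.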
Claims (20)
1. An image sensor comprising:
a substrate including a plurality of pixels, the plurality of pixels comprising
a first plurality of auto-focus pixels arranged in a first line along a first direction; and
a second plurality of auto-focus pixels arranged in a second line along a second direction perpendicular to the first direction,
wherein the first plurality of auto-focus pixels includes a first AF pixel covered by a first light shield and a second AF pixel covered by a second light shield,
wherein the second plurality of auto-focus pixels includes a third AF pixel covered by a third light shield and a fourth AF pixel covered by a fourth light shield,
wherein the first light shield is disposed on first and second quadrants of the first AF pixel so that a first opening is formed at third and fourth quadrants of the first AF pixel,
wherein the second light shield is disposed on third and fourth quadrants of the second AF pixel so that a second opening is formed at first and second quadrants of the second AF pixel,
wherein the third light shield is disposed on second and third quadrants of the third AF pixel so that a third opening is formed at first and fourth quadrants of the third AF pixel, and
wherein the fourth light shield is disposed on first and fourth quadrants of the fourth AF pixel so that a fourth opening is formed at second and third quadrants of the fourth AF pixel.
2. The image sensor of claim 1 , wherein the plurality of pixels further comprises a plurality of image pixels to generate an image signal, and
wherein each of the plurality of image pixels has a fifth opening bigger than each of the first to the fourth openings.
3. The image sensor of claim 2 , wherein each of the first to fourth AF pixels includes the first to fourth quadrants,
the first and second quadrants are adjacent to each other in the second direction,
the third and fourth quadrants are adjacent to each other in the second direction,
the second and third quadrants are adjacent to each other in the first direction, and
the first and fourth quadrants are adjacent to each other in the first direction.
4. The image sensor of claim 3 , further comprising an isolation structure formed in the substrate and between adjacent ones of the plurality of pixels.
5. The image sensor of claim 4, wherein the substrate has a first surface on which a gate electrode is disposed and a second surface opposite to the first surface, and
wherein the isolation structure extends from the second surface to the first surface.
6. The image sensor of claim 5 , wherein the isolation structure includes a fixed charge layer and an insulation layer.
7. The image sensor of claim 6 , wherein the isolation structure further includes an air gap surrounded by the fixed charge layer.
8. The image sensor of claim 5 , further comprising a green color filter corresponding to each of the first to fourth AF pixels.
9. The image sensor of claim 5 , wherein the first plurality of auto-focus pixels are commonly coupled to a signal sensing line.
10. The image sensor of claim 9, wherein the second plurality of auto-focus pixels are commonly coupled to a selection line.
11. The image sensor of claim 2, wherein a total area of the first to fourth openings of the first to fourth AF pixels is about 50% of a total area of the fifth openings of the plurality of image pixels.
12. The image sensor of claim 5, further comprising a light shielding pattern overlapping the isolation structure and having a mesh shape.
13. The image sensor of claim 12 , wherein the light shielding pattern is connected to the first to the fourth light shields.
14. The image sensor of claim 1 , further comprising at least three consecutive pixels of the plurality of pixels disposed between the first AF pixel and the second AF pixel.
15. The image sensor of claim 14, wherein a middle one of the at least three consecutive pixels is connected to a selection line to which the third AF pixel and the fourth AF pixel are connected.
16. The image sensor of claim 15 , further comprising a green color filter corresponding to at least one of the at least three consecutive pixels.
17. An image sensor comprising:
a substrate including a plurality of pixels, the plurality of pixels comprising
a first plurality of auto-focus pixels arranged in a first line along a first direction;
a second plurality of auto-focus pixels arranged in a second line along a second direction perpendicular to the first direction; and
a light shielding pattern on the substrate of the first and second plurality of auto-focus pixels,
wherein the first plurality of auto-focus pixels includes a first AF pixel and a second AF pixel,
wherein the second plurality of auto-focus pixels includes a third AF pixel and a fourth AF pixel,
wherein each of the first and second AF pixels includes top and bottom parts symmetric to each other in the first direction,
wherein each of the third and fourth AF pixels includes left and right parts symmetric to each other in the second direction, and
wherein the light shielding pattern has a first opening on the bottom part of the first AF pixel, a second opening on the top part of the second AF pixel, a third opening on the right part of the third AF pixel, and a fourth opening on the left part of the fourth AF pixel.
18. The image sensor of claim 17 , wherein the plurality of pixels further comprises a plurality of image pixels to generate an image signal,
wherein the light shielding pattern has a fifth opening corresponding to each of the plurality of image pixels, and
the fifth opening is bigger than each of the first to the fourth openings.
19. The image sensor of claim 18 , wherein an area of each of the first to fourth openings is about 50% of an area of the fifth opening.
20. The image sensor of claim 17 , further comprising a green color filter corresponding to each of the first to fourth AF pixels.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US17/840,750 US20220311944A1 (en) | 2014-06-23 | 2022-06-15 | Auto-focus image sensor and digital image processing device including the same |
Applications Claiming Priority (7)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020140076509A KR102268712B1 (en) | 2014-06-23 | 2014-06-23 | Auto-focus image sensor and digital image processing device having the sensor |
KR10-2014-0076509 | 2014-06-23 | ||
US14/746,302 US9942461B2 (en) | 2014-06-23 | 2015-06-22 | Auto-focus image sensor and digital image processing device including the same |
US15/903,727 US10382666B2 (en) | 2014-06-23 | 2018-02-23 | Auto-focus image sensor and digital image processing device including the same |
US16/356,057 US10979621B2 (en) | 2014-06-23 | 2019-03-18 | Auto-focus image sensor and digital image processing device including the same |
US17/005,423 US11375100B2 (en) | 2014-06-23 | 2020-08-28 | Auto-focus image sensor and digital image processing device including the same |
US17/840,750 US20220311944A1 (en) | 2014-06-23 | 2022-06-15 | Auto-focus image sensor and digital image processing device including the same |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/005,423 Continuation US11375100B2 (en) | 2014-06-23 | 2020-08-28 | Auto-focus image sensor and digital image processing device including the same |
Publications (1)
Publication Number | Publication Date |
---|---|
US20220311944A1 (en) | 2022-09-29 |
Family
ID=54870821
Family Applications (6)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/746,302 Active US9942461B2 (en) | 2014-06-23 | 2015-06-22 | Auto-focus image sensor and digital image processing device including the same |
US15/903,727 Active US10382666B2 (en) | 2014-06-23 | 2018-02-23 | Auto-focus image sensor and digital image processing device including the same |
US16/356,057 Active US10979621B2 (en) | 2014-06-23 | 2019-03-18 | Auto-focus image sensor and digital image processing device including the same |
US17/005,423 Active 2035-09-11 US11375100B2 (en) | 2014-06-23 | 2020-08-28 | Auto-focus image sensor and digital image processing device including the same |
US17/005,426 Abandoned US20200396390A1 (en) | 2014-06-23 | 2020-08-28 | Auto-focus image sensor and digital image processing device including the same |
US17/840,750 Abandoned US20220311944A1 (en) | 2014-06-23 | 2022-06-15 | Auto-focus image sensor and digital image processing device including the same |
Family Applications Before (5)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/746,302 Active US9942461B2 (en) | 2014-06-23 | 2015-06-22 | Auto-focus image sensor and digital image processing device including the same |
US15/903,727 Active US10382666B2 (en) | 2014-06-23 | 2018-02-23 | Auto-focus image sensor and digital image processing device including the same |
US16/356,057 Active US10979621B2 (en) | 2014-06-23 | 2019-03-18 | Auto-focus image sensor and digital image processing device including the same |
US17/005,423 Active 2035-09-11 US11375100B2 (en) | 2014-06-23 | 2020-08-28 | Auto-focus image sensor and digital image processing device including the same |
US17/005,426 Abandoned US20200396390A1 (en) | 2014-06-23 | 2020-08-28 | Auto-focus image sensor and digital image processing device including the same |
Country Status (2)
Country | Link |
---|---|
US (6) | US9942461B2 (en) |
KR (1) | KR102268712B1 (en) |
Families Citing this family (28)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP6519378B2 (en) * | 2015-07-23 | 2019-05-29 | セイコーエプソン株式会社 | Data transfer circuit, imaging circuit device and electronic device |
US10014333B2 (en) * | 2015-08-26 | 2018-07-03 | Semiconductor Components Industries, Llc | Back-side illuminated pixels with interconnect layers |
US9686457B2 (en) * | 2015-09-11 | 2017-06-20 | Semiconductor Components Industries, Llc | High efficiency image sensor pixels with deep trench isolation structures and embedded reflectors |
TWI785618B (en) | 2016-01-27 | 2022-12-01 | 日商新力股份有限公司 | Solid-state imaging device and electronic equipment |
JP6476349B2 (en) * | 2016-06-01 | 2019-02-27 | 富士フイルム株式会社 | Imaging apparatus, focus control method, and focus control program |
KR102577844B1 (en) | 2016-08-09 | 2023-09-15 | 삼성전자주식회사 | Image sensor |
KR20180024604A (en) | 2016-08-30 | 2018-03-08 | 삼성전자주식회사 | Image sensor and driving method thereof |
JP2018046088A (en) * | 2016-09-13 | 2018-03-22 | セイコーエプソン株式会社 | Solid-state image sensor and electronic apparatus |
CN107833896A (en) * | 2016-09-15 | 2018-03-23 | 精工爱普生株式会社 | Solid camera head and electronic equipment |
KR102666073B1 (en) * | 2016-12-28 | 2024-05-17 | 삼성전자주식회사 | Image sensor |
KR102699535B1 (en) | 2016-12-29 | 2024-09-02 | 삼성전자주식회사 | Image sensor |
KR20180080931A (en) * | 2017-01-05 | 2018-07-13 | 삼성전자주식회사 | Image sensor |
US10423049B2 (en) | 2017-06-12 | 2019-09-24 | Qualcomm Incorporated | Systems and methods for enabling transmission of phase detection data |
KR102375989B1 (en) | 2017-08-10 | 2022-03-18 | 삼성전자주식회사 | Image sensor for compensating signal difference between pixels |
US10825853B2 (en) * | 2017-11-09 | 2020-11-03 | Taiwan Semiconductor Manufacturing Company Ltd. | Semiconductor image sensor device with deep trench isolations and method for manufacturing the same |
KR102570048B1 (en) | 2018-03-20 | 2023-08-22 | 에스케이하이닉스 주식회사 | Image sensor |
KR102614851B1 (en) * | 2018-07-23 | 2023-12-19 | 삼성전자주식회사 | Image sensor |
KR102637626B1 (en) | 2019-01-08 | 2024-02-20 | 삼성전자주식회사 | Image sensor |
KR102710378B1 (en) * | 2019-07-25 | 2024-09-26 | 삼성전자주식회사 | Pixel array included in auto-focus image sensor and auto-focus image sensor including the same |
KR20210028808A (en) * | 2019-09-04 | 2021-03-15 | 삼성전자주식회사 | Image sensor and imaging apparatus having the same |
US12021099B2 (en) | 2019-09-30 | 2024-06-25 | Taiwan Semiconductor Manufacturing Company, Ltd. | Embedded light shield structure for CMOS image sensor |
KR20210054092A (en) * | 2019-11-04 | 2021-05-13 | 삼성전자주식회사 | Image sensor including pixels mirror symmetric with each other |
KR102716631B1 (en) | 2019-12-05 | 2024-10-15 | 삼성전자주식회사 | Image sensor |
US11595575B2 (en) * | 2020-05-11 | 2023-02-28 | Samsung Electronics Co., Ltd. | Image sensor |
KR20220047465A (en) * | 2020-10-08 | 2022-04-18 | 삼성전자주식회사 | Image sensor and Method of fabricating the same |
KR20220060297A (en) | 2020-11-04 | 2022-05-11 | 에스케이하이닉스 주식회사 | Image sensing device |
KR20220063830A (en) * | 2020-11-10 | 2022-05-18 | 삼성전자주식회사 | Image sensor |
US11477364B1 (en) * | 2021-04-01 | 2022-10-18 | Visera Technologies Company Limited | Solid-state image sensor |
Family Cites Families (83)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5914523A (en) * | 1998-02-17 | 1999-06-22 | National Semiconductor Corp. | Semiconductor device trench isolation structure with polysilicon bias voltage contact |
JP3592147B2 (en) | 1998-08-20 | 2004-11-24 | キヤノン株式会社 | Solid-state imaging device |
GB2376083B (en) | 2001-05-30 | 2004-01-21 | Bookham Technology Plc | An integrated optical device |
US7154136B2 (en) | 2004-02-20 | 2006-12-26 | Micron Technology, Inc. | Isolation structures for preventing photons and carriers from reaching active areas and methods of formation |
US20060180885A1 (en) | 2005-02-14 | 2006-08-17 | Omnivision Technologies, Inc. | Image sensor using deep trench isolation |
JP4354931B2 (en) | 2005-05-19 | 2009-10-28 | パナソニック株式会社 | Solid-state imaging device and manufacturing method thereof |
US7307327B2 (en) | 2005-08-04 | 2007-12-11 | Micron Technology, Inc. | Reduced crosstalk CMOS image sensors |
US7265328B2 (en) * | 2005-08-22 | 2007-09-04 | Micron Technology, Inc. | Method and apparatus providing an optical guide for an imager pixel having a ring of air-filled spaced slots around a photosensor |
US7544982B2 (en) * | 2006-10-03 | 2009-06-09 | Taiwan Semiconductor Manufacturing Company, Ltd. | Image sensor device suitable for use with logic-embedded CIS chips and methods for making the same |
JP5040458B2 (en) * | 2007-06-16 | 2012-10-03 | 株式会社ニコン | Solid-state imaging device and imaging apparatus using the same |
US8199230B2 (en) * | 2007-06-28 | 2012-06-12 | Fujifilm Corporation | Signal processing apparatus, image pickup apparatus and computer readable medium |
US8063465B2 (en) | 2008-02-08 | 2011-11-22 | Omnivision Technologies, Inc. | Backside illuminated imaging sensor with vertical pixel sensor |
JP2009252983A (en) * | 2008-04-04 | 2009-10-29 | Canon Inc | Imaging sensor, and method of manufacturing imaging sensor |
US8229596B2 (en) * | 2008-05-16 | 2012-07-24 | Hewlett-Packard Development Company, L.P. | Systems and methods to interface diverse climate controllers and cooling devices |
US8158988B2 (en) | 2008-06-05 | 2012-04-17 | International Business Machines Corporation | Interlevel conductive light shield |
JP4661912B2 (en) * | 2008-07-18 | 2011-03-30 | ソニー株式会社 | Solid-state imaging device and camera system |
US8237206B2 (en) | 2008-08-12 | 2012-08-07 | United Microelectronics Corp. | CMOS image sensor, method of making the same, and method of suppressing dark leakage and crosstalk for CMOS image sensor |
KR20100025107A (en) * | 2008-08-27 | 2010-03-09 | 크로스텍 캐피탈, 엘엘씨 | Shallow trench isolation with an air gap, a cmos image sensor using the same, and manufacturing method thereof |
JP2010098219A (en) | 2008-10-20 | 2010-04-30 | Toshiba Corp | Backside-illuminated solid-state image pickup device |
CN102227665B (en) | 2008-11-27 | 2015-02-25 | 佳能株式会社 | Solid-state image sensing element and image sensing apparatus |
US7838956B2 (en) | 2008-12-17 | 2010-11-23 | Eastman Kodak Company | Back illuminated sensor with low crosstalk |
KR101550435B1 (en) * | 2009-01-14 | 2015-09-04 | 삼성전자주식회사 | Backside-illuminated image sensor and method of forming the same |
KR101786069B1 (en) | 2009-02-17 | 2017-10-16 | 가부시키가이샤 니콘 | Backside illumination image sensor, manufacturing method thereof and image-capturing device |
US8269264B2 (en) * | 2009-11-09 | 2012-09-18 | Omnivision Technologies, Inc. | Image sensor having waveguides formed in color filters |
US8357890B2 (en) | 2009-11-10 | 2013-01-22 | United Microelectronics Corp. | Image sensor and method for fabricating the same |
JP5172819B2 (en) | 2009-12-28 | 2013-03-27 | 株式会社東芝 | Solid-state imaging device |
JP2011221253A (en) * | 2010-04-08 | 2011-11-04 | Sony Corp | Imaging apparatus, solid-state image sensor, imaging method and program |
JP2011221254A (en) * | 2010-04-08 | 2011-11-04 | Sony Corp | Imaging device, solid-state image pick-up element, imaging method and program |
WO2011158567A1 (en) * | 2010-06-18 | 2011-12-22 | 富士フイルム株式会社 | Solid-state image capture element and digital camera |
KR101788124B1 (en) | 2010-07-07 | 2017-10-20 | 삼성전자 주식회사 | Backside illuminated image sensor and method for manufacturing the same |
JP2012023207A (en) | 2010-07-14 | 2012-02-02 | Toshiba Corp | Backside-illuminated solid-state imaging device |
US8390089B2 (en) * | 2010-07-27 | 2013-03-05 | Taiwan Semiconductor Manufacturing Company, Ltd. | Image sensor with deep trench isolation structure |
US8692304B2 (en) | 2010-08-03 | 2014-04-08 | Himax Imaging, Inc. | Image sensor |
US8507962B2 (en) | 2010-10-04 | 2013-08-13 | International Business Machines Corporation | Isolation structures for global shutter imager pixel, methods of manufacture and design structures |
JP5581954B2 (en) * | 2010-10-07 | 2014-09-03 | ソニー株式会社 | Solid-state imaging device, method for manufacturing solid-state imaging device, and electronic apparatus |
JP2012084816A (en) * | 2010-10-14 | 2012-04-26 | Fujifilm Corp | Back surface irradiation type imaging device and imaging apparatus |
US9532033B2 (en) * | 2010-11-29 | 2016-12-27 | Nikon Corporation | Image sensor and imaging device |
WO2012073728A1 (en) * | 2010-11-30 | 2012-06-07 | 富士フイルム株式会社 | Imaging device and focal position detection method |
FR2969384A1 (en) | 2010-12-21 | 2012-06-22 | St Microelectronics Sa | IMAGE SENSOR WITH REDUCED INTERMODULATION |
FR2969385A1 (en) | 2010-12-21 | 2012-06-22 | St Microelectronics Crolles 2 | IMAGE SENSOR WITH REDUCED INTERMODULATION RATE |
JP2012164768A (en) * | 2011-02-04 | 2012-08-30 | Toshiba Corp | Solid state image pickup device |
JP5606961B2 (en) * | 2011-02-25 | 2014-10-15 | ルネサスエレクトロニクス株式会社 | Semiconductor device |
JP6299058B2 (en) * | 2011-03-02 | 2018-03-28 | ソニー株式会社 | Solid-state imaging device, method for manufacturing solid-state imaging device, and electronic apparatus |
WO2012128154A1 (en) * | 2011-03-24 | 2012-09-27 | 富士フイルム株式会社 | Color image sensor, imaging device, and control program for imaging device |
US8610229B2 (en) | 2011-04-14 | 2013-12-17 | Taiwan Semiconductor Manufacturing Company, Ltd. | Sidewall for backside illuminated image sensor metal grid and method of manufacturing same |
KR101777351B1 (en) * | 2011-05-16 | 2017-09-11 | 삼성전자주식회사 | Image pickup device, digital photographing apparatus using the device, auto-focusing method, and computer-readable storage medium for performing the method |
WO2013003116A2 (en) * | 2011-06-29 | 2013-01-03 | Synaptics Incorporated | High voltage driver using medium voltage devices |
KR20130038035A (en) * | 2011-10-07 | 2013-04-17 | 삼성전자주식회사 | Image sensor |
KR101931658B1 (en) | 2012-02-27 | 2018-12-21 | 삼성전자주식회사 | Unit pixel of image sensor and image sensor including the same |
KR20130099425A (en) | 2012-02-29 | 2013-09-06 | 삼성전자주식회사 | Image sensor |
JP2013187360A (en) * | 2012-03-08 | 2013-09-19 | Sony Corp | Solid-state image pickup device and electronic apparatus |
WO2013145888A1 (en) * | 2012-03-28 | 2013-10-03 | 富士フイルム株式会社 | Solid-state image capture element, image capture device, and solid-state image capture element drive method |
KR101968197B1 (en) * | 2012-05-18 | 2019-04-12 | 삼성전자주식회사 | Image sensor and method of forming the same |
JP5690977B2 (en) * | 2012-06-07 | 2015-03-25 | 富士フイルム株式会社 | Imaging device and imaging apparatus |
JP2014022448A (en) * | 2012-07-13 | 2014-02-03 | Toshiba Corp | Solid-state imaging device |
TWI636557B (en) * | 2013-03-15 | 2018-09-21 | 新力股份有限公司 | Solid-state imaging device, manufacturing method thereof, and electronic device |
US9450005B2 (en) * | 2013-03-29 | 2016-09-20 | Sony Corporation | Image pickup device and image pickup apparatus |
JP6288075B2 (en) * | 2013-03-29 | 2018-03-07 | ソニー株式会社 | Imaging device and imaging apparatus |
JP6148530B2 (en) * | 2013-05-02 | 2017-06-14 | キヤノン株式会社 | Solid-state imaging device and camera |
JP6121263B2 (en) * | 2013-06-26 | 2017-04-26 | ルネサスエレクトロニクス株式会社 | Semiconductor integrated circuit device |
JP2015012127A (en) * | 2013-06-28 | 2015-01-19 | ソニー株式会社 | Solid state image sensor and electronic apparatus |
JP2015026675A (en) * | 2013-07-25 | 2015-02-05 | ソニー株式会社 | Solid state image sensor, manufacturing method thereof and electronic apparatus |
JP2015037102A (en) * | 2013-08-12 | 2015-02-23 | 株式会社東芝 | Solid state image pickup device |
TW201514599A (en) * | 2013-10-07 | 2015-04-16 | Novatek Microelectronics Corp | Image sensor and image capturing system |
JP2015076569A (en) * | 2013-10-11 | 2015-04-20 | ソニー株式会社 | Imaging device, manufacturing method thereof and electronic apparatus |
KR20150062487A (en) * | 2013-11-29 | 2015-06-08 | 삼성전자주식회사 | Image sensor |
KR102128467B1 (en) * | 2014-01-09 | 2020-07-09 | 삼성전자주식회사 | Image sensor and image photograph apparatus including image sensor |
JP6196911B2 (en) * | 2014-02-05 | 2017-09-13 | オリンパス株式会社 | Solid-state imaging device and imaging device |
KR102209097B1 (en) * | 2014-02-27 | 2021-01-28 | 삼성전자주식회사 | Image sensor and method of fabricating the same |
JP6405243B2 (en) * | 2014-03-26 | 2018-10-17 | キヤノン株式会社 | Focus detection apparatus and control method thereof |
JP2015194736A (en) * | 2014-03-28 | 2015-11-05 | キヤノン株式会社 | Imaging device and method for controlling the same |
KR20150121564A (en) * | 2014-04-21 | 2015-10-29 | 삼성전자주식회사 | Imaging device and photographing apparatus |
KR102242580B1 (en) * | 2014-04-23 | 2021-04-22 | 삼성전자주식회사 | Image sensor and method of forming the same |
US9484376B2 (en) * | 2014-05-30 | 2016-11-01 | Taiwan Semiconductor Manufacturing Company Ltd. | Semiconductor isolation structure and manufacturing method thereof |
JP6463010B2 (en) * | 2014-06-24 | 2019-01-30 | オリンパス株式会社 | Imaging device and imaging apparatus |
US9432568B2 (en) * | 2014-06-30 | 2016-08-30 | Semiconductor Components Industries, Llc | Pixel arrangements for image sensors with phase detection pixels |
JP5655970B1 (en) * | 2014-07-02 | 2015-01-21 | 富士ゼロックス株式会社 | Image processing apparatus and image processing program |
KR20160008385A (en) * | 2014-07-14 | 2016-01-22 | 삼성전자주식회사 | Phase Detection Auto Focus Pixel and Image Sensor therewith |
KR102286109B1 (en) * | 2014-08-05 | 2021-08-04 | 삼성전자주식회사 | An image pixel, an image sensor including the same, and an image processing system including the same |
KR102306670B1 (en) * | 2014-08-29 | 2021-09-29 | 삼성전자주식회사 | image sensor and manufacturing method thereof |
KR102242472B1 (en) * | 2014-12-18 | 2021-04-20 | 엘지이노텍 주식회사 | Image sensor, image pick-up apparatus including the same and portable terminal including the apparatus |
KR102368573B1 (en) * | 2015-01-14 | 2022-03-02 | 삼성전자주식회사 | Image sensor |
KR20170019542A (en) * | 2015-08-11 | 2017-02-22 | 삼성전자주식회사 | Auto-focus image sensor |
- 2014-06-23 KR KR1020140076509A patent/KR102268712B1/en active IP Right Grant
- 2015-06-22 US US14/746,302 patent/US9942461B2/en active Active
- 2018-02-23 US US15/903,727 patent/US10382666B2/en active Active
- 2019-03-18 US US16/356,057 patent/US10979621B2/en active Active
- 2020-08-28 US US17/005,423 patent/US11375100B2/en active Active
- 2020-08-28 US US17/005,426 patent/US20200396390A1/en not_active Abandoned
- 2022-06-15 US US17/840,750 patent/US20220311944A1/en not_active Abandoned
Also Published As
Publication number | Publication date |
---|---|
US20200396389A1 (en) | 2020-12-17 |
KR20160000044A (en) | 2016-01-04 |
KR102268712B1 (en) | 2021-06-28 |
US20190215442A1 (en) | 2019-07-11 |
US11375100B2 (en) | 2022-06-28 |
US20150373255A1 (en) | 2015-12-24 |
US20200396390A1 (en) | 2020-12-17 |
US10979621B2 (en) | 2021-04-13 |
US20180249064A1 (en) | 2018-08-30 |
US9942461B2 (en) | 2018-04-10 |
US10382666B2 (en) | 2019-08-13 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11375100B2 (en) | Auto-focus image sensor and digital image processing device including the same | |
US10396119B2 (en) | Unit pixel of image sensor, image sensor including the same and method of manufacturing image sensor | |
US10700115B2 (en) | Image sensors | |
US10797095B2 (en) | Image sensors and methods of forming the same | |
US20170047363A1 (en) | Auto-focus image sensor | |
US9564463B2 (en) | Methods of fabricating image sensors having deep trenches including negative charge material | |
US9385157B2 (en) | Pixel of an image sensor, and image sensor | |
US9948849B2 (en) | Image signal processor and devices including the same | |
JP2015065270A (en) | Solid state image pickup device and manufacturing method of the same, and electronic apparatus | |
US9508766B2 (en) | Image sensors and methods of fabricating the same | |
JP2014229810A (en) | Solid-state imaging device, and electronic apparatus | |
CN102347339A (en) | Semiconductor device, solid-state imaging device, method for manufacturing semiconductor device, method for manufacturing solid-state imaging device, and electronic apparatus | |
KR20170018206A (en) | Image sensor and image processing device including the same | |
US20200344433A1 (en) | Image sensor | |
US20100258893A1 (en) | Solid-state imaging device manufacturing method, solid-state imaging device, and electronic apparatus | |
US11670660B2 (en) | Pixel array included in auto-focus image sensor and auto-focus image sensor including the same | |
US20150311238A1 (en) | Image sensors including deposited negative fixed charge layers on photoelectric conversion regions and methods of forming the same | |
KR102215822B1 (en) | Unit pixel of image sensor, image sensor including the same and method of manufacturing image sensor | |
TW202415087A (en) | Quad photodiode microlens arrangements, and associated systems and methods |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |