US20170092687A1 - Image sensor and method of fabricating the same - Google Patents
- Publication number
- US20170092687A1 (application US 15/372,999)
- Authority
- US
- United States
- Prior art keywords
- light
- layer
- region
- color filters
- blocking
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
- H—ELECTRICITY
- H01—ELECTRIC ELEMENTS
- H01L—SEMICONDUCTOR DEVICES NOT COVERED BY CLASS H10
- H01L27/00—Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate
- H01L27/14—Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate including semiconductor components sensitive to infrared radiation, light, electromagnetic radiation of shorter wavelength or corpuscular radiation and specially adapted either for the conversion of the energy of such radiation into electrical energy or for the control of electrical energy by such radiation
- H01L27/144—Devices controlled by radiation
- H01L27/146—Imager structures
- H01L27/14601—Structural or functional details thereof
- H01L27/14603—Special geometry or disposition of pixel-elements, address-lines or gate-electrodes
- H01L27/14607—Geometry of the photosensitive area
- H01L27/1462—Coatings
- H01L27/14621—Colour filter arrangements
- H01L27/14623—Optical shielding
- H01L27/14625—Optical elements or arrangements associated with the device
- H01L27/14627—Microlenses
- H01L27/14629—Reflectors
- H01L27/14636—Interconnect structures
- H01L27/1464—Back illuminated imager structures
- H01L27/14641—Electronic components shared by two or more pixel-elements, e.g. one amplifier shared by two pixel elements
- H01L27/14643—Photodiode arrays; MOS imagers
- H01L27/14645—Colour imagers
- H01L27/14683—Processes or apparatus peculiar to the manufacture or treatment of these devices or parts thereof
- H01L27/14685—Process for coatings or optical elements
- H01L27/14687—Wafer level processing
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N25/00—Circuitry of solid-state image sensors [SSIS]; Control thereof
- H04N25/70—SSIS architectures; Circuits associated therewith
- H04N25/71—Charge-coupled device [CCD] sensors; Charge-transfer registers specially adapted for CCD sensors
- H04N25/75—Circuitry for providing, modifying or processing image signals from the pixel array
- H04N25/76—Addressed sensors, e.g. MOS or CMOS sensors
- H04N5/374
- H04N5/378
Definitions
- Some example embodiments of inventive concepts relate to an image sensor, such as a CMOS image sensor (CIS), and a method of fabricating the same.
- An image sensor is a device that converts optical signals into electrical signals.
- Image sensors are used in a variety of applications, such as digital cameras, camcorders, personal communication systems, gaming machines, security cameras, micro-cameras for medical applications, and robots.
- Image sensors may be generally classified into charge-coupled device (CCD) and complementary metal-oxide-semiconductor (CMOS) image sensors.
- CMOS image sensors are configured to have signal processing circuits integrated on a single chip.
- CMOS image sensors consume relatively little power and are therefore well suited to portable electronic devices.
- CMOS image sensors can be fabricated using cost-effective CMOS fabrication techniques and can provide high-resolution images. Accordingly, the use of CMOS image sensors has increased.
- Example embodiments of inventive concepts provide an image sensor that reduces color distortion.
- An image sensor may include a semiconductor layer having a light-receiving region and a light-blocking region, the semiconductor layer including photoelectric conversion devices; a light-blocking layer on a surface of the semiconductor layer, over the light-blocking region; color filters on the semiconductor layer and the light-blocking layer; and micro lenses on the color filters.
- The color filters are absent from an interface region between the light-receiving region and the light-blocking region.
- The light-blocking region surrounds the light-receiving region.
- The light-receiving region may include an image region and a dummy region, the dummy region being between the image region and the light-blocking region.
- The color filters are absent from a portion of the light-blocking region and a portion of the dummy region, and these two portions are adjacent to each other.
- A corresponding one of the color filters and a corresponding one of the micro lenses overlap each of the photoelectric conversion devices.
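The region layout described above can be sketched as a toy model: a light-blocking ring surrounding a dummy ring and an inner image region, with a color filter over every pixel except in the interface region where the blocking and dummy portions meet. The grid size, ring widths, and Bayer-style filter assignment below are illustrative assumptions, not values taken from the patent.

```python
# Toy layout model of the sensor regions described above.
# Grid size, region widths, and the Bayer-style filter pattern are
# illustrative assumptions, not values from the patent.

BLOCK = 2   # width of the light-blocking ring, in pixels (hypothetical)
DUMMY = 1   # width of the dummy ring just inside it (hypothetical)
SIZE = 8    # total array width/height (hypothetical)

def region(r, c):
    """Classify a pixel: light-blocking ring, dummy ring, or image area."""
    edge = min(r, c, SIZE - 1 - r, SIZE - 1 - c)  # distance to nearest edge
    if edge < BLOCK:
        return "blocking"
    if edge < BLOCK + DUMMY:
        return "dummy"
    return "image"

def color_filter(r, c):
    """Assign a Bayer-pattern filter to each pixel, except in the
    interface region: filters are absent from the adjacent portions of
    the light-blocking and dummy regions."""
    edge = min(r, c, SIZE - 1 - r, SIZE - 1 - c)
    if BLOCK - 1 <= edge <= BLOCK:   # innermost blocking + outermost dummy
        return None                  # no color filter here
    return [["G", "R"], ["B", "G"]][r % 2][c % 2]

# One filter entry per photoelectric conversion device, mirroring the
# one-to-one filter/micro-lens correspondence in the text.
layout = {(r, c): (region(r, c), color_filter(r, c))
          for r in range(SIZE) for c in range(SIZE)}
```

A micro lens map would follow the same shape: present wherever `color_filter` is non-`None`, absent over the interface region.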
- The image sensor may further include an upper planarization layer between the color filters and the micro lenses, the upper planarization layer being on the interface region, the light-receiving region, and the light-blocking region.
- The upper planarization layer is thicker on the interface region than on portions of the light-receiving and light-blocking regions.
- The image sensor may further include a lower planarization layer between the light-blocking layer and the color filters.
- The lower planarization layer is exposed on the interface region.
- The image sensor may further include an interconnection layer on an opposite surface of the semiconductor layer.
- The interconnection layer and the light-blocking layer are spaced apart from each other by the semiconductor layer.
- The image sensor may further include an interconnection layer between the light-blocking layer and the semiconductor layer.
- The light-blocking region surrounds the light-receiving region; the light-receiving region includes an image region and a dummy region, and the dummy region may be between the image region and the light-blocking region.
- Removing the portion of the color filters may include forming a photoresist layer on the semiconductor layer and the light-blocking layer; exposing and developing the photoresist layer to form photoresist patterns on respective pixels and to remove the portion of the color filters; and dyeing the photoresist patterns to form the color filters.
- Alternatively, removing the portion of the color filters may include forming a photoresist layer on the semiconductor layer and the light-blocking layer; exposing and developing the photoresist layer to form photoresist patterns on respective pixels; dyeing the photoresist patterns to form the color filters; forming a mask pattern on the color filters; and removing the color filters exposed by the mask pattern.
- The method may further include forming a planarization layer between the light-blocking layer and the color filters; removing the portion of the color filters exposes a top surface of the planarization layer.
- The method may further include forming an interconnection layer on the semiconductor layer, with the semiconductor layer between the interconnection layer and the light-blocking layer.
- The method may further include forming an interconnection layer between the light-blocking layer and the semiconductor layer.
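The second filter-formation flow above (dye the photoresist patterns everywhere, then strip the filters a mask pattern leaves exposed) can be sketched as a minimal simulation. The pixel grid, dye function, and the choice of which pixels the mask exposes are hypothetical, purely to illustrate the sequence of steps.

```python
# Sketch of the mask-based color-filter flow described above:
# (1) dyeing photoresist patterns yields a filter per pixel,
# (2) a mask pattern covers the filters to keep,
# (3) filters left exposed by the mask are removed.
# The data model and the mask choice are illustrative assumptions.

def form_color_filters(pixels, dye):
    """Exposure/development leaves a photoresist pattern per pixel;
    dyeing turns each pattern into a color filter."""
    return {p: dye(p) for p in pixels}

def remove_exposed(filters, mask):
    """Keep only filters covered by the mask pattern; exposed filters
    (e.g. over the interface region) are etched away."""
    return {p: f for p, f in filters.items() if p in mask}

pixels = [(r, c) for r in range(4) for c in range(4)]
filters = form_color_filters(
    pixels, dye=lambda p: "GRBG"[(p[0] % 2) * 2 + p[1] % 2])
mask = {p for p in pixels if p[1] != 0}   # hypothetical: column 0 is exposed
filters = remove_exposed(filters, mask)   # column 0 now has no filters
```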
- At least one example embodiment discloses an image sensor including a plurality of photoelectric conversion elements; a light transmission layer on the photoelectric conversion elements, the light transmission layer including at least a first filter and at least a second filter, with a transition region between the first filter and the second filter and an upper surface of the first filter higher than an upper surface of the second filter; and a plurality of micro lenses on the first and second filters, the plurality of micro lenses being absent from the transition region.
- The light transmission layer includes a third filter in the transition region.
- The third filter has a first portion and a second portion, the thickness of the first portion being greater than the thickness of the second portion.
- The light transmission layer includes a lower planarization layer transitioning from a first height in the light transmission layer to a second height in the light transmission layer, and a light-blocking layer on a portion of the lower planarization layer having the second height.
- The light transmission layer includes an upper planarization layer on the first filter, the second filter, and the lower planarization layer.
- FIG. 1 is a block diagram of an image sensor according to an example embodiment of inventive concepts.
- FIGS. 2A and 2B are circuit diagrams illustrating an active pixel sensor array of an image sensor, according to an example embodiment of inventive concepts.
- FIG. 3A is a plan view of an image sensor according to example embodiments of inventive concepts.
- FIGS. 3B through 3D are sectional views of the image sensor taken along line I-I′ of FIG. 3A.
- FIG. 4 is a sectional view illustrating a method of fabricating an image sensor, according to example embodiments of inventive concepts.
- FIG. 5 is a sectional view illustrating a method of fabricating an image sensor, according to example embodiments of inventive concepts.
- FIG. 6 is a sectional view illustrating a method of fabricating an image sensor, according to example embodiments of inventive concepts.
- FIGS. 7A through 7C are sectional views illustrating a process of forming color filters, according to example embodiments of inventive concepts.
- FIGS. 8A through 8D are sectional views illustrating a process of forming color filters, according to other example embodiments of inventive concepts.
- FIG. 9 is a sectional view illustrating a method of fabricating an image sensor, according to example embodiments of inventive concepts.
- FIG. 10 is a schematic block diagram illustrating a processor-based system including an image sensor, according to an example embodiment of inventive concepts.
- FIG. 11 is a perspective view illustrating an electronic device including an image sensor, according to an example embodiment of inventive concepts.
- Example embodiments of inventive concepts will now be described more fully with reference to the accompanying drawings, in which example embodiments are shown.
- Example embodiments of inventive concepts may, however, be embodied in many different forms and should not be construed as being limited to example embodiments set forth herein; rather, example embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of example embodiments to those of ordinary skill in the art.
- the thicknesses of layers and regions are exaggerated for clarity.
- Like reference numerals in the drawings denote like elements, and thus their description will be omitted.
- Although the terms “first”, “second”, etc. may be used herein to describe various elements, components, regions, layers, and/or sections, these elements, components, regions, layers, and/or sections should not be limited by these terms. These terms are only used to distinguish one element, component, region, layer, or section from another. Thus, a first element, component, region, layer, or section discussed below could be termed a second element, component, region, layer, or section without departing from the teachings of example embodiments.
- Spatially relative terms, such as “beneath,” “below,” “lower,” “above,” “upper,” and the like, may be used herein for ease of description to describe one element or feature's relationship to another element(s) or feature(s) as illustrated in the figures. It will be understood that the spatially relative terms are intended to encompass different orientations of the device in use or operation in addition to the orientation depicted in the figures. For example, if the device in the figures is turned over, elements described as “below” or “beneath” other elements or features would then be oriented “above” the other elements or features. Thus, the exemplary term “below” can encompass both an orientation of above and below. The device may be otherwise oriented (rotated 90 degrees or at other orientations) and the spatially relative descriptors used herein interpreted accordingly.
- Example embodiments of inventive concepts are described herein with reference to cross-sectional illustrations that are schematic illustrations of idealized embodiments (and intermediate structures) of example embodiments. As such, variations from the shapes of the illustrations as a result, for example, of manufacturing techniques and/or tolerances, are to be expected. Thus, example embodiments of inventive concepts should not be construed as limited to the particular shapes of regions illustrated herein but are to include deviations in shapes that result, for example, from manufacturing. For example, an implanted region illustrated as a rectangle may have rounded or curved features and/or a gradient of implant concentration at its edges rather than a binary change from implanted to non-implanted region.
- Likewise, a buried region formed by implantation may result in some implantation in the region between the buried region and the surface through which the implantation takes place.
- the regions illustrated in the figures are schematic in nature and their shapes are not intended to illustrate the actual shape of a region of a device and are not intended to limit the scope of example embodiments.
- FIG. 1 is a block diagram of an image sensor according to an example embodiment of inventive concepts.
- the image sensor may be a CMOS image sensor, but example embodiments of inventive concepts are not limited thereto.
- the image sensor may include an active pixel sensor array 10 , a row decoder 20 , a row driver 30 , a column decoder 40 , a timing generator 50 , a correlated double sampler 60 , an analog-to-digital converter 70 , and an input/output (I/O) buffer 80 .
- the active pixel sensor array 10 may include a plurality of two-dimensionally arranged unit pixels, each of which is configured to convert optical signals to electrical signals.
- the active pixel sensor array 10 may be driven by a plurality of driving signals such as a pixel selection signal, a reset signal, and a charge transmission signal from the row driver 30 .
- the converted electrical signal may be provided to the correlated double sampler 60 .
- the row driver 30 may provide several driving signals for driving several unit pixels of the active pixel sensor array 10 in accordance with a decoded result obtained from the row decoder 20 .
- the driving signals may be supplied to respective rows.
- the timing generator 50 may provide timing and control signals to the row decoder 20 and the column decoder 40 .
- the correlated double sampler 60 may receive the electric signals generated in the active pixel sensor array 10 , and hold and sample the received electric signals.
- the correlated double sampler 60 may perform a double sampling operation to sample a noise level and a signal level of the electric signal and output a difference level corresponding to a difference between the noise and signal levels.
- the analog-to-digital converter 70 may convert analog signals corresponding to the difference level output from the correlated double sampler 60 into digital signals, and then output the converted digital signals.
- the I/O buffer 80 may latch the digital signal and then output the latched digital signals sequentially to an image signal processing unit (not shown) in accordance with the decoding result obtained from the column decoder 40 .
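- The readout chain described above (correlated double sampling followed by analog-to-digital conversion) can be sketched in a few lines. This is a minimal illustrative model of the signal processing, not of the circuit itself; the 10-bit full scale and the level values are assumed parameters:

```python
def correlated_double_sample(reset_level, signal_level):
    """Output the difference level between the noise (reset) level and the
    signal level, cancelling the pixel's offset and reset noise."""
    return signal_level - reset_level

def quantize(difference_level, full_scale=1.0, bits=10):
    """Ideal uniform ADC model: map [0, full_scale] onto 2**bits codes,
    clamping out-of-range inputs to the nearest end of the scale."""
    code = round(difference_level / full_scale * (2**bits - 1))
    return max(0, min(2**bits - 1, code))

# Example: a pixel whose reset level is 0.2 and whose signal level is 0.7
# yields a difference level of 0.5, digitized to a mid-scale code.
diff = correlated_double_sample(0.2, 0.7)
code = quantize(diff)
```

Because the noise level is sampled from the same pixel immediately before the signal level, fixed-pattern offsets cancel in the subtraction, which is the point of the double sampling described above.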
- FIGS. 2A and 2B are circuit diagrams illustrating an active pixel sensor array of an image sensor, according to example embodiments of inventive concepts.
- the active pixel sensor array 10 may include a plurality of unit pixels, which may be arranged in the form of a matrix.
- each of the unit pixels may include at least one photoelectric conversion device ( 110 in FIG. 2A, 110 a and 110 b in FIG. 2B ), which is configured to generate electric charges from light incident thereto and store the generated electric charges, and a reading device, which is configured to read an optical signal generated in the photoelectric conversion device.
- the reading device may include a reset element 140 , an amplification element 150 , and a selection element 160 .
- FIG. 2A illustrates a plurality of unit pixels, each of which includes four N-channel MOS transistors.
- each unit pixel P 1 may be composed of a single photoelectric conversion device 110 and four MOS transistors 130 , 140 , 150 , and 160 .
- the unit pixel P 1 may be composed of three MOS transistors or five MOS transistors.
- the photoelectric conversion device 110 may be configured to generate and store charges corresponding to the incident light.
- the photoelectric conversion device 110 may be realized by a photo diode, a photo transistor, a photo gate, a pinned photo diode (PPD), or any combination thereof.
- the photo diode may be used as the photoelectric conversion device 110 .
- the photoelectric conversion device 110 may be connected to a charge transmission element 130 transmitting the stored charges to a detection area 120 .
- the detection area 120 may be a floating diffusion region FD, which is provided in the semiconductor layer and is doped with N-type impurities.
- the floating diffusion region FD may receive the charges stored in the photoelectric conversion device 110 to accumulate charges therein.
- the detection area 120 (e.g., the floating diffusion region FD) may be electrically connected to the amplification element 150 to control the amplification element 150 .
- the charge transmission element 130 may transmit the charges stored in the photoelectric conversion device 110 to the detection area 120 .
- the charge transmission element 130 may generally be composed of one MOS transistor and may be controlled by a bias applied to a charge transmission signal line TX(i).
- the reset element 140 may periodically reset the detection area 120 and may be composed of one MOS transistor.
- a source of the reset element 140 may be connected to the detection area 120 and a drain of the reset element 140 may be connected to a power supply terminal applied with a power supply voltage V DD .
- the reset element 140 may be driven by a bias applied to a reset signal line RX(i). In the case where the reset element 140 is turned on by the bias applied to the reset signal line RX(i), the power supply voltage V DD may be applied to the detection area 120 . Therefore, the detection area 120 may be reset, when the reset element 140 is turned on.
- the amplification element 150 may serve as a source follower buffer amplifier.
- the amplification element 150 may amplify a variation in electric potential of the detection area 120 , and then, output the amplified signal to an output line V out through the selection element 160 .
- the selection elements 160 may be configured to select each row of the unit pixels P 1 in a reading operation and may each be composed of one MOS transistor.
- the selection elements 160 in each row may be driven by a bias applied to a row select signal line SEL(i). In the case where the selection elements 160 are turned on by the bias applied to the row select signal line SEL(i), the output signals of the amplification elements 150 composed of the MOS transistors may be transmitted to the output lines V out through the selection element 160 .
- the driving signal lines TX(i), RX(i), and SEL(i) may be electrically connected to the charge transmission elements 130 , the reset elements 140 , and the selection elements 160 , respectively.
- the driving signal lines TX(i), RX(i), and SEL(i) may extend in a row direction (horizontal direction) so as to simultaneously drive the unit pixels arrayed in the same row.
- FIG. 2B illustrates an example of a paired pixel, in which two photoelectric conversion devices are configured to share the reading device.
- the active pixel sensor array 10 may include a plurality of paired pixels P 2 , which are arranged in a matrix form.
- Each of the paired pixels P 2 may be configured in such a way that a reading device is shared by a pair of photoelectric conversion devices 110 a and 110 b.
- the pair of the photoelectric conversion devices 110 a and 110 b may share the reset element 140 , the amplification element 150 , and/or the selection element 160 .
- each of the photoelectric conversion devices 110 a and 110 b may be connected to charge transmission elements 130 a and 130 b , respectively, which may be used to transfer accumulated charges to other components (e.g., the reading device).
- a bias applied to the row selection line SEL(i) may allow the selection element 160 to select each row of the paired pixels P 2 in a reading operation. Further, in the case where biases are applied to the charge transmission elements 130 a and 130 b through transmission lines TX(i)a and TX(i)b, charges can be transferred from one of two photoelectric conversion devices 110 a and 110 b to the detection area 120 .
- FIG. 3A is a plan view of an image sensor according to an example embodiment of inventive concepts, and FIG. 3B is a sectional view of the image sensor taken along line I-I′ of FIG. 3A.
- the image sensor may include a light-receiving region PHO and a light-blocking region BLA.
- the light-receiving region PHO may be provided at a central region of the image sensor
- the light-blocking region BLA may be provided around the light-receiving region PHO or at an edge region of the image sensor.
- the light-receiving region PHO may include an effective image region EFF and a dummy region DUM.
- the dummy region DUM may be disposed between the effective image region EFF and the light-blocking region BLA.
- a plurality of active pixels may be provided on the effective image region EFF and/or the dummy region DUM.
- the effective image region EFF may be configured to generate image signals from an incident light, while the dummy region DUM may be provided to process the image signals generated from the effective image region EFF.
- Devices in the light-blocking region BLA may be operated in a black or dark environment.
- signals generated in the light-blocking region BLA may serve as a reference signal having no dependence on the incident light.
- this makes it possible to remove an environmental effect (i.e., noise) from the image signals.
- the light-blocking region BLA, in which the photoelectric conversion effect can be prevented, is provided in the image sensor in addition to the light-receiving region PHO.
- an amount of electric charges generated in a pixel of the light-blocking region BLA is subtracted from an amount of electric charges generated in each active pixel of the light-receiving region PHO.
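- The dark-reference subtraction described here can be sketched as follows. This is a minimal model under the assumption that the reference is the average of the optical-black (light-blocking region) pixel values, with negative results clamped to zero:

```python
def optical_black_correction(active_pixels, dark_pixels):
    """Subtract the average level of the light-blocking (optical black)
    pixels from each active-pixel value; clamp negatives to zero."""
    dark_level = sum(dark_pixels) / len(dark_pixels)
    return [max(0.0, p - dark_level) for p in active_pixels]

# Example: active-pixel charges corrected by a dark reference of level 2.0.
corrected = optical_black_correction([10.0, 5.0, 3.0], [2.0, 2.0, 2.0])
```

Averaging over many dark pixels rather than using a single one reduces the noise contributed by the reference itself; that design choice is an assumption of this sketch, not something the text specifies.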
- a light-blocking layer (e.g., of metal) may be provided on the light-blocking region BLA to prevent light from being incident thereto.
- the pixel in the light-blocking region may be configured to have the same feature as the active pixel.
- the image sensor may include a semiconductor layer 100 , an interconnection layer 200 , and a light-transmission layer 300 .
- the semiconductor layer 100 may have first and second surfaces facing each other.
- the interconnection layer 200 may be provided on the first surface of the semiconductor layer 100
- the light-transmission layer 300 may be provided on the second surface of the semiconductor layer 100 .
- the semiconductor layer 100 may be configured to include a bulk silicon wafer of a first conductivity type (e.g., P-type) and an epitaxial layer 115 of the first conductivity type on the bulk silicon wafer.
- the semiconductor layer 100 may include only the P-type epitaxial layer 115 without the bulk silicon wafer.
- the semiconductor layer 100 may be a bulk semiconductor wafer, in which a well region of the first conductivity type is provided.
- the semiconductor layer 100 may include an N-type epitaxial layer, a bulk silicon wafer, a silicon-on-insulator (SOI) wafer, and so forth.
- a penetration depth of the incident light into the semiconductor layer 100 may vary depending on a wavelength of the incident light.
- a thickness of the semiconductor layer 100 may be determined in consideration of the wavelength of the light incident into the photoelectric conversion devices 110 .
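- The wavelength dependence of the penetration depth follows the Beer-Lambert law. The sketch below uses rough, illustrative absorption coefficients for silicon; the numeric values are assumptions for demonstration only (actual values depend on temperature, doping, and process):

```python
import math

# Rough, illustrative absorption coefficients for silicon (per micrometer);
# assumed values for demonstration, not measured data.
ALPHA_SI_PER_UM = {"blue_450nm": 2.5, "green_550nm": 0.7, "red_650nm": 0.3}

def penetration_depth_um(alpha_per_um):
    """Depth at which intensity falls to 1/e of its value at the surface."""
    return 1.0 / alpha_per_um

def absorbed_fraction(alpha_per_um, thickness_um):
    """Fraction of incident light absorbed within a layer of the given
    thickness (Beer-Lambert law: I(z) = I0 * exp(-alpha * z))."""
    return 1.0 - math.exp(-alpha_per_um * thickness_um)
```

Because red light penetrates several micrometers deeper than blue, the semiconductor layer must be thick enough to absorb the longest target wavelength, which is why its thickness is chosen with the incident wavelengths in mind.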
- a device isolation layer (not shown) may be formed in the semiconductor layer 100 to define active regions.
- the device isolation layer may be formed to define a first active region for the photoelectric conversion device 110 and a second active region for the reading device.
- shapes and disposition of the first and second active regions are not limited thereto.
- the photoelectric conversion devices 110 may be arranged in a matrix shape, when viewed in plan view.
- each of the photoelectric conversion devices 110 may be shaped like a rectangle or tetragon, when viewed in plan view.
- Each of the photoelectric conversion devices 110 may be provided in the form of a photo diode, a photo transistor, a photo gate, or a pinned photo diode (PPD).
- the interconnection layer 200 may include several devices, which may be configured to read out electrical signals generated from the photoelectric conversion devices 110 and to control the unit pixels.
- the interconnection layer 200 may include an interlayered insulating layer 210 , which is provided to have a multi-layered structure, and a plurality of metal lines 220 , which are vertically stacked in the interlayered insulating layer 210 .
- the metal lines 220 may be connected to reading and logic devices through contact plugs (not shown).
- the metal lines 220 may be provided without dependence on the arrangement of the photoelectric conversion devices 110 .
- the metal lines 220 may be provided to cross over the photoelectric conversion devices 110 .
- the interconnection layer 200 may be formed to be interposed between the semiconductor layer 100 and a supporting substrate (not shown).
- the supporting substrate may include at least one of a semiconductor substrate, a glass substrate, and a plastic substrate.
- the supporting substrate may be bonded to the interconnection layer 200 by an adhesive layer. The use of the supporting substrate makes it possible to prevent the semiconductor layer 100 from being bent or curved when the semiconductor layer 100 is thinned.
- the light-transmission layer 300 may be provided on the second surface of the semiconductor layer 100 and include a light-blocking layer 310 , color filters 320 , and micro lenses 330 .
- the light-blocking layer 310 may be formed on the light-blocking region BLA.
- the light-blocking layer 310 may be formed to be overlapped with the light-blocking region BLA, when viewed in plan view.
- the light-blocking layer 310 may be formed of a metal-containing layer (e.g., of copper).
- the color filters 320 may be overlapped with the photoelectric conversion devices 110 , respectively, when viewed in plan view.
- the color filter 320 may be disposed to realize red, green or blue, in accordance with a position or structure of the unit pixel.
- the color filters 320 may be two-dimensionally arranged, similar or identical to the arrangement of the photoelectric conversion devices 110 .
- the color filters 320 may be disposed to form a Bayer-type RGB pixel arrangement. To realize a color image, each of the color filters 320 may be configured in such a way that light having a specific wavelength can be incident into a corresponding one of the unit pixels.
- the color filters 320 may include red, green, and blue color filters, which filter the incident light and allow red, green, and blue lights, respectively, to transmit therethrough.
- the color filters 320 may be configured to realize another color system including cyan, magenta, or yellow.
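- The Bayer-type arrangement mentioned above can be expressed as a simple mapping from pixel coordinates to filter colors. The GRBG phase used here is one common convention and is an assumption of this sketch, since the text does not fix a phase:

```python
def bayer_color(row, col):
    """Color filter at (row, col) in a GRBG Bayer mosaic: green on the
    diagonal, with red and blue on alternating rows."""
    if row % 2 == 0:
        return "G" if col % 2 == 0 else "R"
    return "B" if col % 2 == 0 else "G"

# A 4x4 patch of the mosaic: half the sites are green and a quarter each
# are red and blue, matching the Bayer 2:1:1 ratio.
patch = [[bayer_color(r, c) for c in range(4)] for r in range(4)]
```

The doubled green sampling mimics the eye's higher sensitivity to green; a cyan/magenta/yellow system would use a different mapping with the same coordinate logic.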
- some of the color filters 320 may be removed to avoid technical problems caused by a height difference between the light-blocking and light-receiving regions.
- some of the color filters 320 may be removed from an interface region SCR (hereinafter referred to as a “staircase region”) between the light-blocking and light-receiving regions BLA and PHO.
- the color filters 320 may be absent from the staircase region SCR.
- the staircase region SCR for the removal of the color filters 320 may include at least a portion of the light-blocking region BLA and a portion of the dummy region DUM, as will be described in more detail below.
- the micro lenses 330 may be provided on the color filters 320 to face or overlap the unit pixels, respectively, when viewed in plan view. Each of the micro lenses 330 may have an upward-convex shape with a specific curvature radius.
- the micro lenses 330 may be formed of an optically-transparent resin. Each of the micro lenses 330 makes it possible to focus the incident light on a corresponding one of the photoelectric conversion devices 110 . For example, even when a fraction of the incident light is oriented toward a region beyond the photoelectric conversion device 110 , it can be incident into the photoelectric conversion device 110 , by virtue of the micro lens 330 .
- the micro lenses 330 may be disposed on the light-receiving region PHO, but example embodiments of inventive concepts are not limited thereto.
- the micro lenses 330 may be provided on both of the light-blocking and light-receiving regions BLA and PHO.
- the presence of the light-blocking layer 310 may result in the height difference between the light-blocking and light-receiving regions BLA and PHO.
- the micro lens 330 on the staircase region SCR may have a sloped or curved bottom surface and this makes it difficult to focus the incident light on the corresponding photoelectric conversion device 110 .
- the removal of the color filters 320 on the staircase region SCR makes it possible to prevent technical problems, which may be caused by the micro lens 330 with the curved bottom surface. That is, it is possible to effectively focus the incident light on the photoelectric conversion device 110.
- the image sensor can be configured to have high image quality without color distortion.
- a lower planarization layer 315 may be provided between the semiconductor layer 100 and the color filters 320 .
- an upper planarization layer 325 may be provided between the color filters 320 and the micro lenses 330 .
- the lower and upper planarization layers 315 and 325 may be formed of a material with a refractive index higher than that of silicon oxide, and this makes it possible to improve light sensitivity of the image sensor.
- the lower and upper planarization layers 315 and 325 may be formed of or include a material with a refractive index of about 1.4-4.0.
- the lower and upper planarization layers 315 and 325 may be formed of aluminum oxide (Al 2 O 3 ), cerium fluoride (CeF 3 ), hafnium oxide (HfO 2 ), indium tin oxide (ITO), magnesium oxide (MgO), tantalum pentoxide (Ta 2 O 5 ), titanium dioxide (TiO 2 ), zirconium dioxide (ZrO 2 ), silicon (Si), germanium (Ge), zinc selenide (ZnSe), zinc sulfide (ZnS), or lead fluoride (PbF 2 ).
- the lower and upper planarization layers 315 and 325 may be formed of an organic material with a high refractive index, for example, siloxane resin, benzocyclobutene (BCB), polyimide materials, acrylic materials, parylene C, poly(methyl methacrylate) (PMMA), and polyethylene terephthalate (PET).
- the lower and upper planarization layers 315 and 325 may be formed of or include, for example, strontium titanate (SrTiO 3 ), polycarbonate, glass, bromine, sapphire, cubic zirconia, potassium niobate (KNbO 3 ), silicon carbide (SiC), gallium (III) phosphide (GaP), or gallium (III) arsenide (GaAs).
- FIG. 3C is a sectional view of an image sensor according to another example embodiment of inventive concepts.
- the sectional view of FIG. 3C may illustrate a portion of the image sensor taken along line I-I′ of FIG. 3A .
- the image sensor may include the light-receiving region PHO and the light-blocking region BLA, and the light-receiving region PHO may include the effective image region EFF and the dummy region DUM.
- the image sensor may include the semiconductor layer 100 , the interconnection layer 200 , and the light-transmission layer 300 .
- the semiconductor layer 100 may include first and second surfaces facing each other.
- the interconnection layer 200 may be provided on the first surface of the semiconductor layer 100
- the light-transmission layer 300 may be provided on the second surface of the semiconductor layer 100 .
- the removal of the color filters 320 on the staircase region SCR makes it possible to prevent technical problems, which may be caused by the curved bottom surface of the micro lens 330 on the staircase region SCR.
- the image sensor of FIG. 3C may be configured to have substantially the same features as that of FIG. 3B . Thus, a detailed description of the image sensor of FIG. 3C will be omitted.
- FIG. 3D is a sectional view of an image sensor according to still another example embodiment of inventive concepts.
- the sectional view of FIG. 3D may illustrate a portion of the image sensor taken along line I-I′ of FIG. 3A .
- the image sensor may include the light-receiving region PHO and the light-blocking region BLA, and the light-receiving region PHO may include the effective image region EFF and the dummy region DUM.
- the image sensor may include the semiconductor layer 100 , the interconnection layer 200 , and the light-transmission layer 300 .
- the interconnection layer 200 may include first and second surfaces facing each other.
- the semiconductor layer 100 may be provided on the first surface of the interconnection layer 200
- the light-transmission layer 300 may be provided on the second surface of the interconnection layer 200 .
- the light-transmission layer 300 and the semiconductor layer 100 may be spaced apart from each other with the interconnection layer 200 interposed therebetween.
- light-guiding patterns 230 may be provided in the interconnection layer 200 .
- Each of the light-guiding patterns 230 may be provided to face or overlap a corresponding one of the photoelectric conversion devices 110 , when viewed in plan view, and thereby to guide the light incident from a corresponding one of the color filters 320 to the corresponding one of the photoelectric conversion devices 110 .
- the light-guiding patterns may be formed of or include a material, whose refractive index is higher than that of the interlayered insulating layer 210 in the interconnection layer 200 .
- the light-guiding patterns 230 may be formed of or include silicon oxynitride or silicon oxide.
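- The guiding condition relies on the refractive-index contrast between the light-guiding pattern 230 and the interlayered insulating layer 210: rays striking the interface at more than the critical angle undergo total internal reflection. A minimal sketch, with the index values (about 1.6 for the guide and 1.46 for the oxide cladding) assumed purely for illustration:

```python
import math

def critical_angle_deg(n_guide, n_clad):
    """Critical angle (measured from the interface normal) beyond which
    light is totally internally reflected inside the guide."""
    if n_guide <= n_clad:
        raise ValueError("guiding requires a higher-index core than cladding")
    return math.degrees(math.asin(n_clad / n_guide))

# Assumed illustrative indices: silicon oxynitride guide vs. oxide cladding.
theta_c = critical_angle_deg(1.6, 1.46)
```

The smaller the critical angle, the wider the cone of rays that stays confined in the guide, which is why a larger index contrast with the interlayered insulating layer improves the delivery of light to the photoelectric conversion device.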
- the image sensor of FIG. 3D may be configured to have substantially the same features as that of FIG. 3B , except for the above described differences (e.g., the presence of the light-guiding patterns 230 and the arrangement of the semiconductor layer 100 , the interconnection layer 200 , and the light-transmission layer 300 ). Thus, a detailed description of the image sensor of FIG. 3D will be omitted.
- FIGS. 4 through 9 are sectional views illustrating a method of fabricating an image sensor, according to example embodiments of inventive concepts.
- the semiconductor layer 100 may be provided to include the photoelectric conversion device 110 , and the interconnection layer 200 may be provided on a surface of the semiconductor layer 100 .
- the semiconductor layer 100 may be provided to include a p-type epitaxial layer 115 , which may be formed on a p-type bulk wafer.
- an exposed surface of the p-type epitaxial layer 115 will be referred to as a first surface of the semiconductor layer 100
- an exposed surface of the p-type bulk wafer will be referred to as a second surface of the semiconductor layer 100 .
- the semiconductor layer 100 may include the p-type bulk wafer and the p-type epitaxial layer 115 grown from the p-type bulk wafer, but example embodiments of inventive concepts are not limited thereto.
- the device isolation layer may be formed in the semiconductor layer 100 to define the active regions.
- the photoelectric conversion device 110 may be formed in the active regions of the semiconductor layer 100 , which may be formed adjacent to the first surface.
- Each of the photoelectric conversion devices 110 may be provided in the form of a photo diode, a photo transistor, a photo gate, or a pinned photo diode (PPD).
- the interconnection layer 200 may be formed on the first surface of the semiconductor layer 100 .
- the formation of the interconnection layer 200 may include forming the interlayered insulating layer 210 , depositing a metal layer on the interlayered insulating layer 210 , and then, patterning the metal layer to form the metal lines 220 .
- the metal lines 220 may be connected to each other or to a control device on the semiconductor layer 100 through, for example, contact plugs (not shown).
- the interconnection layer 200 may be formed using a damascene process.
- the formation of the interconnection layer 200 may include patterning the interlayered insulating layer 210 , depositing a metal layer on the patterned interlayered insulating layer 210 , and then, performing a planarization process to form the metal lines 220 and/or the contact plugs.
- the metal line 220 may be formed of, for example, copper (Cu), aluminum (Al), tungsten (W), titanium (Ti), molybdenum (Mo), tantalum (Ta), titanium nitride (TiN), tantalum nitride (TaN), zirconium nitride (ZrN), tungsten nitride (WN), or any alloys thereof.
- the insulating layer 120 may be formed on the first surface of the semiconductor layer 100 .
- the insulating layer 120 may be formed of or include an optically-transparent material (e.g., silicon oxide).
- a supporting substrate (not shown) may be bonded on the interconnection layer 200 .
- the supporting substrate may support the semiconductor layer 100 and prevent devices formed on the semiconductor layer 100 from being deformed.
- a bulk wafer or a plastic substrate may be used as the supporting substrate.
- a thinning process may be performed on the semiconductor layer 100 to reduce a thickness of the semiconductor layer 100 .
- since the image sensor is configured in such a way that light is incident into the semiconductor layer 100 through the second surface thereof, the larger the thickness of the semiconductor layer 100 , the larger the loss of the incident light.
- a propagation length of the light to be incident into the photoelectric conversion devices 110 can be reduced by thinning the semiconductor layer 100 , and this makes it possible to improve light sensitivity of the photoelectric conversion device 110 . Further, since a penetration depth of the incident light in the semiconductor layer 100 varies depending on the wavelength of the incident light, the thinning of the semiconductor layer 100 may be controlled in consideration of the wavelength of the incident light.
- the thinning process of the semiconductor layer 100 may include grinding or polishing the bulk wafer, and then, anisotropically and isotropically etching the bulk wafer.
- a grinder or chemical-mechanical polishing (CMP) apparatus may be used to remove mechanically a portion of the bulk wafer, and then, an anisotropic or isotropic etching process may be performed to adjust precisely a final thickness of the semiconductor layer 100 .
- the etching process of the semiconductor layer 100 may be performed, in a wet etching manner, using a mixture solution containing hydrogen fluoride (HF), nitric acid (HNO 3 ), and acetic acid (CH 3 COOH).
- the light-blocking layer 310 may be formed on the light-blocking region BLA.
- the light-blocking layer 310 may be formed on an insulating layer to be overlapped with the light-blocking region BLA, when viewed in plan view.
- the light-blocking layer 310 may be formed of a metal-containing layer (e.g., of copper).
- the lower planarization layer 315 may be formed on the light-blocking layer 310 and the insulating layer.
- the lower planarization layer 315 may be formed of or include an optically-transparent material (e.g., silicon oxide). Due to the presence of the light-blocking layer 310 , the lower planarization layer 315 may have a stepwise profile on the staircase region SCR between the light-blocking and light-receiving regions BLA and PHO.
- the lower planarization layer 315 may be formed of a material with a refractive index higher than that of silicon oxide, and this makes it possible to improve light sensitivity of the image sensor.
- the lower planarization layer 315 may be formed of or include a material with a refractive index of about 1.4-4.0.
- the lower planarization layer 315 may be formed of Al 2 O 3 , CeF 3 , HfO 2 , ITO, MgO, Ta 2 O 5 , TiO 2 , ZrO 2 , Si, Ge, ZnSe, ZnS or PbF 2 .
- the lower planarization layer 315 may be formed of an organic material with a high refractive index, for example, siloxane resin, benzocyclobutene (BCB), polyimide materials, acrylic materials, parylene C, poly(methyl methacrylate) (PMMA), and polyethylene terephthalate (PET).
- the color filters 320 may be formed on the lower planarization layer 315 , and some of the color filters 320 may be removed from the staircase region SCR between the light-blocking and light-receiving regions BLA and PHO. In an example embodiment, some of the color filters 320 located on or near the staircase region SCR between the light-receiving region PHO and the light-blocking region BLA may be partially or wholly removed to expose a portion of the lower planarization layer 315 .
- Each of the color filters 320 may be formed to face or overlap a corresponding one of the photoelectric conversion devices 110 , when viewed in plan view.
- the color filters 320 may be formed using a dyeing process, a pigment dispersion process, a printing process or the like.
- the respective color filters 320 may be formed of a photoresist layer dyed with a color corresponding to the respective unit pixels.
- each of the color filters 320 may be formed to display any one of red, green, and blue.
- each of the color filters 320 may be formed to display any one of cyan, magenta, or yellow.
- the color filters 320 may be two-dimensionally disposed to have the same or similar arrangement as that of the photoelectric conversion devices 110 .
- the color filters 320 may be disposed to form a Bayer-type RGB pixel arrangement.
- FIGS. 7A through 7C are sectional views illustrating a process of forming color filters, according to example embodiments of inventive concepts.
- A photoresist layer 316 may be formed on the lower planarization layer 315. Due to the light-blocking layer 310 on the light-blocking region BLA, the photoresist layer 316 may be formed to have a stepwise structure on the staircase region SCR. In detail, the lower planarization layer 315 may be formed to have the stepwise structure because of the presence of the light-blocking layer 310 locally disposed on the light-blocking region BLA, and the photoresist layer 316 may in turn be formed to have the stepwise structure due to the stepwise structure of the lower planarization layer 315.
- An exposing and developing process may be performed on the photoresist layer 316 to form photoresist patterns 317, each of which faces a corresponding one of the unit pixels.
- The exposing and developing process may also be performed on a portion of the photoresist layer 316 located on the staircase region SCR, and thus the portion of the photoresist layer 316 located on the staircase region SCR may be removed. In this case, the removal of the color filters 320 from the staircase region SCR may be achieved without any additional etching process.
- A dyeing process may be performed on the photoresist patterns 317 to form the color filters 320.
- Each of the photoresist patterns 317 may be dyed to have a specific color in accordance with the corresponding pixel.
- FIGS. 8A through 8D are sectional views illustrating a process of forming color filters, according to other example embodiments of inventive concepts.
- The photoresist layer 316 may be formed on the lower planarization layer 315.
- An exposing and developing process may be performed on the photoresist layer 316 to form first photoresist patterns 318, each of which faces a corresponding one of the unit pixels.
- The exposing and developing process may not be performed on a portion of the photoresist layer 316 located on the staircase region SCR.
- A mask pattern 410 may be formed on the first photoresist patterns 318, and the first photoresist patterns 318 may be partially removed using the mask pattern 410 to form second photoresist patterns 319.
- The mask pattern 410 may be formed to partially expose the first photoresist patterns 318 positioned on or near the staircase region SCR between the light-receiving region PHO and the light-blocking region BLA.
- The exposed portion of the first photoresist patterns 318 may be wholly or partially etched. After the etching process, the mask pattern 410 may be removed.
- The use of the mask pattern 410 makes it easy to partially remove the exposed portion of the first photoresist patterns 318 and thereby to leave a portion of the first photoresist patterns 318 remaining on the staircase region SCR.
- Each of the second photoresist patterns 319 may be dyed with a specific color to form the color filters 320.
- Alternatively, the dyeing process may be performed after the formation of the first photoresist patterns 318 and before the formation of the mask pattern 410.
- The micro lenses 330 may be formed on the color filters 320 after the etching process of the first photoresist patterns 318.
- Each of the micro lenses 330 may be formed on a corresponding one of the color filters 320.
- The micro lenses 330 may be formed using an optically-transparent photoresist (not shown).
- The formation of the micro lenses 330 may include forming photoresist patterns (not shown) on the photoelectric conversion devices 110, respectively, and then performing a reflow or etching process on the photoresist patterns.
- The micro lenses 330 may be formed to have an upward-convex shape with a specific radius of curvature.
- The micro lenses 330 on the staircase region SCR may have bent or curved bottom surfaces, which makes it difficult to properly and effectively focus the incident light on the photoelectric conversion devices 110.
- Some of the color filters 320 may be partially or wholly removed from the staircase region SCR, and thus the micro lenses 330 may be formed to have flat bottom surfaces. Accordingly, it is possible to properly and effectively focus the incident light on the photoelectric conversion devices 110.
- As a result, the image sensor can be configured to have high image quality.
- A cleaning process may be performed to remove residues from the surfaces of the micro lenses 330.
- A bake process may be performed to improve the structural stability of the micro lenses 330.
- The upper planarization layer 325 may be formed on the color filters 320.
- The upper planarization layer 325 may be formed of an optically-transparent material (e.g., polyimide or polyacrylic materials).
- The process of fabricating the image sensor of FIGS. 3A and 3B has been described above by way of example.
- The image sensor of FIG. 3C may be realized by adjusting an etching condition of the etching process described with reference to FIGS. 7A and 7B to remove some of the color filters.
- The image sensor of FIG. 3D may be realized by changing the positions of the semiconductor layer and the interconnection layer. Nevertheless, example embodiments of inventive concepts are not limited to the afore-described examples of the fabricating process.
- FIG. 10 is a schematic block diagram illustrating a processor-based system including the image sensor according to example embodiments of inventive concepts.
- The processor-based system 1000 is a system that processes output images of an image sensor 1100.
- The system 1000 may include any one of a computer system, a camera system, a scanner, a mechanical clock system, a navigation system, a video phone, a monitoring system, an automatic focus system, a tracking system, an operation monitoring system, and an image stabilizing system.
- The processor-based system 1000, such as a computer system, may include a central processing unit (CPU) 1200, such as a microprocessor, capable of communicating with an I/O device 1300 via a bus 1001.
- The image sensor 1100 may communicate with the CPU 1200 and/or the I/O device 1300 via the bus 1001 or another communication link.
- The processor-based system 1000 may further include a RAM 1400 and/or a port 1500 capable of communicating with the CPU 1200 through the bus 1001.
- The port 1500 may be coupled with a video card, a sound card, a memory card, a USB device, or the like. Further, the port 1500 may be connected to an additional system to carry out data communication with that system.
- The image sensor 1100 may be integrated with a CPU, a digital signal processor (DSP), or a microprocessor. Moreover, the image sensor 1100 may be integrated with a memory. Alternatively, the image sensor 1100 may be integrated in a chip different from that of a processor.
- FIG. 11 is a perspective view illustrating an electronic product including an image sensor according to an example embodiment of inventive concepts.
- The image sensor according to example embodiments of inventive concepts may be applicable to a mobile phone 2000.
- The image sensor according to the embodiments may also be applicable to cameras, camcorders, personal digital assistants (PDAs), wireless phones, laptop computers, optical mice, facsimile machines, or copying machines.
- The image sensor according to the embodiments may also be installed in telescopes, mobile phone handsets, scanners, endoscopes, fingerprint recognition systems, toys, game machines, household robots, or automobiles.
- According to example embodiments of inventive concepts, some of the color filters may be removed from the staircase region between the light-receiving and light-blocking regions, and thus the micro lenses can have flat bottom surfaces. Accordingly, it is possible to prevent the incident light from being deflected by the micro lenses, and thus the image sensor can exhibit reduced color distortion.
Abstract
Example embodiments disclose an image sensor and a fabricating method thereof. An image sensor may include a semiconductor layer with a light-receiving region and a light-blocking region, the semiconductor layer including photoelectric conversion devices, a light-blocking layer on a surface of the semiconductor layer, color filters on the semiconductor layer and the light-blocking layer, and micro lenses on the color filters. The color filters are absent from an interface region between the light-receiving region and the light-blocking region.
Description
- This application is a divisional application of and claims priority under 35 U.S.C. §§120,121 to U.S. application Ser. No. 14/465,062 filed Aug. 21, 2014, which claims priority under 35 U.S.C. §119 to Korean Patent Application No. 10-2013-0138429, filed on Nov. 14, 2013, in the Korean Intellectual Property Office, the entire contents of each of which are hereby incorporated herein by reference.
- Some example embodiments of inventive concepts relate to an image sensor, such as a CMOS image sensor (CIS), and a method of fabricating the same.
- An image sensor is a device that converts optical signals into electrical signals. With increased development of the computer and communications industries, image sensors are used in a variety of applications such as digital cameras, camcorders, personal communication systems, gaming machines, security cameras, micro-cameras for medical applications, and/or robots.
- The image sensors may be generally classified into charge coupled device (CCD) and complementary metal-oxide semiconductor (CMOS) image sensors. The CMOS image sensors are configured to have signal processing circuits integrated on a single chip. In addition, CMOS image sensors may consume relatively low power, and thus, they are applicable to portable electronic devices. Furthermore, CMOS image sensors can be fabricated using cost-effective CMOS fabrication techniques and can provide high resolution images. Accordingly, the use of CMOS image sensors has increased.
- Example embodiments of inventive concepts provide an image sensor that reduces color distortion.
- Other example embodiments of inventive concepts provide a method of fabricating the image sensor.
- According to at least some example embodiments of inventive concepts, an image sensor may include a semiconductor layer having a light-receiving region and a light-blocking region, the semiconductor layer including photoelectric conversion devices, a light-blocking layer on a surface of the semiconductor layer and on the light-blocking region, color filters on the semiconductor layer and the light-blocking layer, and micro lenses on the color filters. The color filters are absent from an interface region and the interface region is between the light-receiving region and the light-blocking region.
- In at least some example embodiments, the light-blocking region surrounds the light-receiving region, the light-receiving region may include an image region and a dummy region, and the dummy region may be between the image region and the light-blocking region.
- In at least some example embodiments, the color filters are absent from a portion of the light-blocking region and a portion of the dummy region, and the portion of the light-blocking region and the portion of the dummy region are adjacent to each other.
- In at least some example embodiments, a corresponding one of the color filters and a corresponding one of the micro lenses overlap each of the photoelectric conversion devices.
- In at least some example embodiments, the image sensor may further include an upper planarization layer between the color filters and the micro lenses, the upper planarization layer being on the interface region, the light-receiving region and the light-blocking region. The upper planarization layer is thicker on the interface region than on portions of the light-receiving and light-blocking regions.
- In at least some example embodiments, the image sensor may further include a lower planarization layer between the light-blocking layer and the color filters. The lower planarization layer is exposed on the interface region.
- In at least some example embodiments, the image sensor may further include an interconnection layer on an opposite surface of the semiconductor layer. The interconnection layer and the light-blocking layer are spaced apart from each other by the semiconductor layer.
- In at least some example embodiments, the image sensor may further include an interconnection layer between the light-blocking layer and the semiconductor layer.
- According to at least some example embodiments of inventive concepts, a method of fabricating an image sensor may include forming photoelectric conversion devices in a semiconductor layer, the semiconductor layer having a light-receiving region and a light-blocking region, forming a light-blocking layer on the semiconductor layer to cover the light-blocking region, forming color filters on the light-blocking layer to face the photoelectric conversion devices, respectively, removing at least a portion of the color filters, and forming micro lenses on the remaining color filters, respectively.
- In at least some example embodiments, the light-blocking region surrounds the light-receiving region, the light-receiving region includes an image region and a dummy region, and the dummy region may be between the image region and the light-blocking region.
- In at least some example embodiments, the removing of the portion of the color filters may include forming a photoresist layer on the semiconductor layer and the light-blocking layer, exposing and developing the photoresist layer to form photoresist patterns on pixels, respectively, to form the color filters and to remove the portion of the color filters, and dyeing the photoresist patterns.
- In at least some example embodiments, the removing the portion of the color filters may include forming a photoresist layer on the semiconductor layer and the light-blocking layer, exposing and developing the photoresist layer to form photoresist patterns on pixels, respectively, dyeing the photoresist patterns to form the color filters, forming a mask pattern on the color filters, and removing color filters exposed by the mask pattern.
- In at least some example embodiments, the method may further include forming a planarization layer between the light-blocking layer and the color filters. The removing the portion of the color filters exposes a top surface of the planarization layer.
- In at least some example embodiments, the method may further include forming an interconnection layer on the semiconductor layer, the semiconductor layer between the interconnection layer and the light-blocking layer.
- In at least some example embodiments, the method may further include forming an interconnection layer between the light-blocking layer and the semiconductor layer.
- At least one example embodiment discloses an image sensor including a plurality of photoelectric conversion elements, a light transmission layer on the photoelectric conversion elements, the light transmission layer including at least a first filter and at least a second filter, a transition region being between the first filter and the second filter, an upper surface of the first filter being higher than an upper surface of the second filter, and a plurality of micro lenses on the first and second filters, the plurality of micro lenses being absent from the transition region.
- In an example embodiment, the light transmission layer includes a third filter in the transition region.
- In an example embodiment, the third filter has a first portion and a second portion, a thickness of the first portion being greater than a thickness of the second portion.
- In an example embodiment, the light transmission layer includes a lower planarization layer transitioning from a first height in the light transmission layer to a second height in the light transmission layer and a light blocking layer on a portion of the lower planarization layer having the second height.
- In an example embodiment, the light transmission layer includes an upper planarization layer on the first filter, the second filter and the lower planarization layer.
- Example embodiments will be more clearly understood from the following brief description taken in conjunction with the accompanying drawings. The accompanying drawings represent non-limiting, example embodiments as described herein.
- FIG. 1 is a block diagram of an image sensor according to an example embodiment of inventive concepts.
- FIGS. 2A and 2B are circuit diagrams illustrating an active pixel sensor array of an image sensor, according to an example embodiment of inventive concepts.
- FIG. 3A is a plan view of an image sensor according to example embodiments of inventive concepts.
- FIGS. 3B through 3D are sectional views of the image sensor taken along line I-I′ of FIG. 3A.
- FIG. 4 is a sectional view of a method of fabricating an image sensor, according to example embodiments of inventive concepts.
- FIG. 5 is a sectional view of a method of fabricating an image sensor, according to example embodiments of inventive concepts.
- FIG. 6 is a sectional view of a method of fabricating an image sensor, according to example embodiments of inventive concepts.
- FIGS. 7A through 7C are sectional views illustrating a process of forming color filters, according to example embodiments of inventive concepts.
- FIGS. 8A through 8D are sectional views illustrating a process of forming color filters, according to other example embodiments of inventive concepts.
- FIG. 9 is a sectional view of a method of fabricating an image sensor, according to example embodiments of inventive concepts.
- FIG. 10 is a schematic block diagram illustrating a processor-based system including an image sensor, according to an example embodiment of inventive concepts.
- FIG. 11 is a perspective view illustrating an electronic device including an image sensor, according to an example embodiment of inventive concepts.
- It should be noted that these figures are intended to illustrate the general characteristics of methods, structure and/or materials utilized in some example embodiments and to supplement the written description provided below. These drawings are not, however, to scale and may not precisely reflect the structural or performance characteristics of any given embodiment, and should not be interpreted as defining or limiting the range of values or properties encompassed by example embodiments. For example, the relative thicknesses and positioning of molecules, layers, regions and/or structural elements may be reduced or exaggerated for clarity. The use of similar or identical reference numbers in the various drawings is intended to indicate the presence of a similar or identical element or feature.
- Example embodiments of inventive concepts will now be described more fully with reference to the accompanying drawings, in which example embodiments are shown. Example embodiments of inventive concepts may, however, be embodied in many different forms and should not be construed as being limited to example embodiments set forth herein; rather, example embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of example embodiments to those of ordinary skill in the art. In the drawings, the thicknesses of layers and regions are exaggerated for clarity. Like reference numerals in the drawings denote like elements, and thus their description will be omitted.
- It will be understood that when an element is referred to as being “connected” or “coupled” to another element, it can be directly connected or coupled to the other element or intervening elements may be present. In contrast, when an element is referred to as being “directly connected” or “directly coupled” to another element, there are no intervening elements present. Like numbers indicate like elements throughout. As used herein the term “and/or” includes any and all combinations of one or more of the associated listed items. Other words used to describe the relationship between elements or layers should be interpreted in a like fashion (e.g., “between” versus “directly between,” “adjacent” versus “directly adjacent,” “on” versus “directly on”).
- It will be understood that, although the terms “first”, “second”, etc. may be used herein to describe various elements, components, regions, layers and/or sections, these elements, components, regions, layers and/or sections should not be limited by these terms. These terms are only used to distinguish one element, component, region, layer or section from another element, component, region, layer or section. Thus, a first element, component, region, layer or section discussed below could be termed a second element, component, region, layer or section without departing from the teachings of example embodiments.
- Spatially relative terms, such as “beneath,” “below,” “lower,” “above,” “upper” and the like, may be used herein for ease of description to describe one element or feature's relationship to another element(s) or feature(s) as illustrated in the figures. It will be understood that the spatially relative terms are intended to encompass different orientations of the device in use or operation in addition to the orientation depicted in the figures. For example, if the device in the figures is turned over, elements described as “below” or “beneath” other elements or features would then be oriented “above” the other elements or features. Thus, the exemplary term “below” can encompass both an orientation of above and below. The device may be otherwise oriented (rotated 90 degrees or at other orientations) and the spatially relative descriptors used herein interpreted accordingly.
- The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of example embodiments. As used herein, the singular forms “a,” “an” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms “comprises”, “comprising”, “includes” and/or “including,” if used herein, specify the presence of stated features, integers, steps, operations, elements and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components and/or groups thereof.
- Example embodiments of inventive concepts are described herein with reference to cross-sectional illustrations that are schematic illustrations of idealized embodiments (and intermediate structures) of example embodiments. As such, variations from the shapes of the illustrations as a result, for example, of manufacturing techniques and/or tolerances, are to be expected. Thus, example embodiments of inventive concepts should not be construed as limited to the particular shapes of regions illustrated herein but are to include deviations in shapes that result, for example, from manufacturing. For example, an implanted region illustrated as a rectangle may have rounded or curved features and/or a gradient of implant concentration at its edges rather than a binary change from implanted to non-implanted region. Likewise, a buried region formed by implantation may result in some implantation in the region between the buried region and the surface through which the implantation takes place. Thus, the regions illustrated in the figures are schematic in nature and their shapes are not intended to illustrate the actual shape of a region of a device and are not intended to limit the scope of example embodiments.
- Unless otherwise defined, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which example embodiments of inventive concepts belong. It will be further understood that terms, such as those defined in commonly-used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of the relevant art and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein.
- FIG. 1 is a block diagram of an image sensor according to an example embodiment of inventive concepts. The image sensor may be a CMOS image sensor, but example embodiments of inventive concepts are not limited thereto.
- Referring to FIG. 1, the image sensor may include an active pixel sensor array 10, a row decoder 20, a row driver 30, a column decoder 40, a timing generator 50, a correlated double sampler 60, an analog-to-digital converter 70, and an input/output (I/O) buffer 80.
- The active pixel sensor array 10 may include a plurality of two-dimensionally arranged unit pixels, each of which is configured to convert optical signals to electrical signals. The active pixel sensor array 10 may be driven by a plurality of driving signals, such as a pixel selection signal, a reset signal, and a charge transmission signal, from the row driver 30. The converted electrical signals may be provided to the correlated double sampler 60.
- The row driver 30 may provide several driving signals for driving several unit pixels of the active pixel sensor array 10 in accordance with a decoded result obtained from the row decoder 20. In the case where the unit pixels are arranged in a matrix shape, the driving signals may be supplied to the respective rows.
- The timing generator 50 may provide timing and control signals to the row decoder 20 and the column decoder 40.
- The correlated double sampler 60 may receive the electric signals generated in the active pixel sensor array 10, and hold and sample the received electric signals. The correlated double sampler 60 may perform a double sampling operation to sample a noise level and a signal level of the electric signal and output a difference level corresponding to the difference between the noise and signal levels.
- The analog-to-digital converter 70 may convert analog signals, corresponding to the difference level output from the correlated double sampler 60, into digital signals, and then output the converted digital signals.
- The I/O buffer 80 may latch the digital signals and then output the latched digital signals sequentially to an image signal processing unit (not shown) in accordance with the decoding result obtained from the column decoder 40. -
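The correlated double sampling and quantization described above can be sketched numerically. This is an illustrative model, not part of the disclosure: the function names, sample values, and the 10-bit resolution are assumptions, and the ADC is reduced to a simple linear quantizer.

```python
# Illustrative sketch of correlated double sampling (CDS): each pixel is
# sampled twice -- once just after reset (noise level) and once after charge
# transfer (signal level) -- and the difference level is output, cancelling
# any offset that is common to both samples.

def correlated_double_sample(noise_level: float, signal_level: float) -> float:
    """Return the difference level between the sampled noise and signal levels."""
    # Photo-generated charge lowers the pixel output voltage below the reset level.
    return noise_level - signal_level

def quantize(difference: float, full_scale: float = 1.0, bits: int = 10) -> int:
    """Crude linear model of the analog-to-digital conversion step: map the
    difference level onto a digital code of the given resolution."""
    code = round(difference / full_scale * ((1 << bits) - 1))
    return max(0, min(code, (1 << bits) - 1))

if __name__ == "__main__":
    # A fixed offset (e.g., left over from reset) appears in both samples
    # and therefore cancels in the difference.
    offset = 0.20
    reset_sample = 0.80 + offset     # noise level, held after reset
    signal_sample = 0.35 + offset    # level after charge transfer
    diff = correlated_double_sample(reset_sample, signal_sample)
    print(diff, quantize(diff))      # difference stays ~0.45 for any offset
```

The cancellation of the shared offset is the point of sampling both levels from the same pixel within one readout, rather than comparing against a global reference.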
FIGS. 2A and 2B are circuit diagrams illustrating an active pixel sensor array of an image sensor, according to example embodiments of inventive concepts. - The active
pixel sensor array 10 may include a plurality of unit pixels, which may be arranged in the form of a matrix. In example embodiments, each of the unit pixels may include at least one photoelectric conversion device (110 in FIG. 2A; 110a and 110b in FIG. 2B), which is configured to generate electric charges from light incident thereto and store the generated electric charges, and a reading device, which is configured to read an optical signal generated in the photoelectric conversion device. The reading device may include a reset element 140, an amplification element 150, and a selection element 160.
- FIG. 2A illustrates a plurality of unit pixels, each of which includes four N-channel MOS transistors. Referring to FIG. 2A, each unit pixel P1 may be composed of a single photoelectric conversion device 110 and four MOS transistors 130, 140, 150, and 160.
- More specifically, the photoelectric conversion device 110 may be configured to generate and store charges corresponding to the incident light. In some example embodiments, the photoelectric conversion device 110 may be realized by a photodiode, a phototransistor, a photogate, a pinned photodiode (PPD), or any combination thereof. In the present example embodiment, a photodiode may be used as the photoelectric conversion device 110. The photoelectric conversion device 110 may be connected to a charge transmission element 130 that transmits the stored charges to a detection area 120.
- The detection area 120 may be a floating diffusion region FD, which is provided in the semiconductor layer and is doped with N-type impurities. The floating diffusion region FD may receive the charges stored in the photoelectric conversion device 110 and accumulate them therein. The detection area 120 (e.g., the floating diffusion region FD) may be electrically connected to the amplification element 150 to control the amplification element 150.
- The charge transmission element 130 may transmit the charges stored in the photoelectric conversion device 110 to the detection area 120. The charge transmission element 130 may generally be composed of one MOS transistor and may be controlled by a bias applied to a charge transmission signal line TX(i).
- The reset element 140 may periodically reset the detection area 120 and may be composed of one MOS transistor. A source of the reset element 140 may be connected to the detection area 120, and a drain of the reset element 140 may be connected to a power supply terminal to which a power supply voltage VDD is applied. The reset element 140 may be driven by a bias applied to a reset signal line RX(i). In the case where the reset element 140 is turned on by the bias applied to the reset signal line RX(i), the power supply voltage VDD may be applied to the detection area 120. Therefore, the detection area 120 may be reset when the reset element 140 is turned on.
- In conjunction with a static current source (not shown) located outside the unit pixel P1, the amplification element 150 may serve as a source follower buffer amplifier. For example, the amplification element 150 may amplify a variation in the electric potential of the detection area 120 and output the amplified signal to an output line Vout through the selection element 160.
- The selection elements 160 may be configured to select each row of the unit pixels P1 in a reading operation and may each be composed of one MOS transistor. The selection elements 160 in each row may be driven by a bias applied to a row select signal line SEL(i). In the case where the selection elements 160 are turned on by the bias applied to the row select signal line SEL(i), the output signals of the amplification elements 150 composed of the MOS transistors may be transmitted to the output lines Vout through the selection elements 160.
- The driving signal lines TX(i), RX(i), and SEL(i) may be electrically connected to the charge transmission elements 130, the reset elements 140, and the selection elements 160, respectively. The driving signal lines TX(i), RX(i), and SEL(i) may extend in a row direction (horizontal direction) so as to simultaneously drive the unit pixels arrayed in the same row.
- FIG. 2B illustrates an example of a paired pixel, in which two photoelectric conversion devices are configured to share the reading device. According to an example embodiment shown in FIG. 2B, the active pixel sensor array 10 may include a plurality of paired pixels P2, which are arranged in the matrix form. Each of the paired pixels P2 may be configured in such a way that a reading device is shared by a pair of photoelectric conversion devices 110a and 110b. The photoelectric conversion devices 110a and 110b may share the reset element 140, the amplification element 150, and/or the selection element 160. Further, each of the photoelectric conversion devices 110a and 110b may be connected to a corresponding one of the charge transmission elements 130a and 130b.
- A bias applied to the row selection line SEL(i) may allow the selection element 160 to select each row of the paired pixels P2 in a reading operation. Further, in the case where biases are applied to the charge transmission elements 130a and 130b, the charges stored in the photoelectric conversion devices 110a and 110b may be transmitted to the detection area 120. -
FIG. 3A is a plan view of an image sensor according to an example embodiment of inventive concepts, andFIG. 3B is a sectional view of the image sensor taken along line I-I′ of FIG.3A. - Referring to
FIGS. 3A and 3B , the image sensor may include a light-receiving region PHO and a light-blocking region BLA. In an example embodiment, the light-receiving region PHO may be provided at a central region of the image sensor, and the light-blocking region BLA may be provided around the light-receiving region PHO or at an edge region of the image sensor. - The light-receiving region PHO may include an effective image region EFF and a dummy region DUM. The dummy region DUM may be disposed between the effective image region EFF and the light-blocking region BLA. In the light-receiving region PHO, a plurality of active pixels may be provided on each or both of the effective image region EFF and the dummy region DUM. The effective image region EFF may be configured to generate image signals from an incident light, while the dummy region DUM may be provided to process the image signals generated from the effective image region EFF.
- Devices in the light-blocking region BLA may be operated under black or dark environment. Thus, signals generated in the light-blocking region BLA may serve as a reference signal having no dependence on the incident light. For example, for the image sensor converting optical signals to electric signals, it is necessary to remove an environmental effect (i.e., noise), which may be caused by, for example, thermal electrons in the
photoelectric conversion devices 110. For this, the light-blocking region BLA, in which the photoelectric conversion effect is suppressed, is provided in the image sensor, in addition to the light-receiving region PHO. Further, in order to remove the environmental effect from the image, the amount of electric charge generated in a pixel of the light-blocking region BLA is subtracted from the amount of electric charge generated in each active pixel of the light-receiving region PHO. Further, to prevent light from being incident into the light-blocking region BLA, a light-blocking layer (e.g., of metal) may be provided on the whole top surface of the light-blocking region, while the pixels in the light-blocking region may be configured to have the same structure as the active pixels. - Referring to
FIG. 3B, the image sensor may include a semiconductor layer 100, an interconnection layer 200, and a light-transmission layer 300. - In an example embodiment, the
semiconductor layer 100 may have first and second surfaces facing each other. The interconnection layer 200 may be provided on the first surface of the semiconductor layer 100, and the light-transmission layer 300 may be provided on the second surface of the semiconductor layer 100. - The
semiconductor layer 100 may be configured to include a bulk silicon wafer of a first conductivity type (e.g., P-type) and an epitaxial layer 115 of the first conductivity type on the bulk silicon wafer. In other example embodiments, the semiconductor layer 100 may include only the P-type epitaxial layer 115 without the bulk silicon wafer. In still other example embodiments, the semiconductor layer 100 may be a bulk semiconductor wafer, in which a well region of the first conductivity type is provided. In yet other example embodiments, the semiconductor layer 100 may include an N-type epitaxial layer, a bulk silicon wafer, a silicon-on-insulator (SOI) wafer, and so forth. - In the case where an external light is incident into the
semiconductor layer 100, a penetration depth of the incident light into the semiconductor layer 100 may vary depending on a wavelength of the incident light. In this respect, a thickness of the semiconductor layer 100 may be determined in consideration of the wavelength of the light incident into the photoelectric conversion devices 110. - A device isolation layer (not shown) may be formed in the
semiconductor layer 100 to define active regions. For example, referring to FIGS. 2A and 2B, the device isolation layer may be formed to define a first active region for the photoelectric conversion device 110 and a second active region for the reading device. However, shapes and disposition of the first and second active regions may not be limited thereto. - In the
semiconductor layer 100, the photoelectric conversion devices 110 may be arranged in a matrix shape, when viewed in plan view. For example, each of the photoelectric conversion devices 110 may be shaped like a rectangle or tetragon, when viewed in plan view. Each of the photoelectric conversion devices 110 may be provided in the form of a photo diode, a photo transistor, a photo gate, or a pinned photo diode (PPD). - The
interconnection layer 200 may include several devices, which may be configured to read out electrical signals generated from the photoelectric conversion devices 110 and to control the unit pixels. - In an example embodiment, the
interconnection layer 200 may include an interlayered insulating layer 210, which is provided to have a multi-layered structure, and a plurality of metal lines 220, which are vertically stacked in the interlayered insulating layer 210. The metal lines 220 may be connected to reading and logic devices through contact plugs (not shown). In an example embodiment, the metal lines 220 may be provided without dependence on the arrangement of the photoelectric conversion devices 110. For example, the metal lines 220 may be provided to cross over the photoelectric conversion devices 110. - In some example embodiments, the
interconnection layer 200 may be formed to be interposed between the semiconductor layer 100 and a supporting substrate (not shown). The supporting substrate may include at least one of a semiconductor substrate, a glass substrate, and a plastic substrate. The supporting substrate may be bonded to the interconnection layer 200 by an adhesive layer. The use of the supporting substrate makes it possible to prevent the semiconductor layer 100 from being bent or curved when the semiconductor layer 100 is thinned. - The light-
transmission layer 300 may be provided on the second surface of the semiconductor layer 100 and include a light-blocking layer 310, color filters 320, and micro lenses 330. - The light-
blocking layer 310 may be formed on the light-blocking region BLA. For example, the light-blocking layer 310 may be formed to overlap the light-blocking region BLA, when viewed in plan view. The light-blocking layer 310 may be formed of a metal-containing layer (e.g., of copper). - The color filters 320 may overlap the
photoelectric conversion devices 110, respectively, when viewed in plan view. For example, the color filter 320 may be disposed to realize red, green, or blue, in accordance with a position or structure of the unit pixel. Further, the color filters 320 may be two-dimensionally arranged, similar or identical to the arrangement of the photoelectric conversion devices 110. - In an example embodiment, the
color filters 320 may be disposed to form a Bayer-type RGB pixel arrangement. To realize a color image, each of the color filters 320 may be configured in such a way that light having a specific wavelength can be incident into a corresponding one of the unit pixels. For example, the color filters 320 may include red, green, and blue color filters, which filter the incident light and allow red, green, and blue light, respectively, to be transmitted therethrough. In other example embodiments, the color filters 320 may be configured to realize other color systems including cyan, magenta, or yellow. - Due to the presence of the light-
blocking layer 310, there may be a height difference between the color filters 320 provided on the light-blocking and light-receiving regions BLA and PHO. In an example embodiment, some of the color filters 320 may be removed to avoid technical problems caused by the height difference. For example, some of the color filters 320 may be removed from an interface region SCR (hereinafter, referred to as a “staircase region”) between the light-blocking and light-receiving regions BLA and PHO. In other words, the color filters 320 may be absent from the staircase region SCR. The staircase region SCR for the removal of the color filters 320 may include at least a portion of the light-blocking region BLA and a portion of the dummy region DUM, as will be described in more detail below. - The
micro lenses 330 may be provided on the color filters 320 to face or overlap the unit pixels, respectively, when viewed in plan view. Each of the micro lenses 330 may have an upward-convex shape with a specific curvature radius. The micro lenses 330 may be formed of an optically-transparent resin. Each of the micro lenses 330 makes it possible to focus the incident light on a corresponding one of the photoelectric conversion devices 110. For example, even when a fraction of the incident light is oriented toward a region beyond the photoelectric conversion device 110, it can be incident into the photoelectric conversion device 110, by virtue of the micro lens 330. As shown, the micro lenses 330 may be disposed on the light-receiving region PHO, but example embodiments of inventive concepts are not limited thereto. For example, the micro lenses 330 may be provided on both of the light-blocking and light-receiving regions BLA and PHO. - As described above, the presence of the light-
blocking layer 310 may result in the height difference between the light-blocking and light-receiving regions BLA and PHO. In the case where the color filters 320 are not removed from the staircase region SCR, the micro lens 330 on the staircase region SCR may have a sloped or curved bottom surface, and this makes it difficult to focus the incident light on the corresponding photoelectric conversion device 110. In this respect, the removal of the color filters 320 on the staircase region SCR makes it possible to prevent technical problems, which may be caused by the micro lens 330 with the curved bottom surface. That is, it is possible to effectively focus the incident light on the photoelectric conversion device 110. As a result, the image sensor can be configured to have high image quality without color distortion. - Optionally, a
lower planarization layer 315 may be provided between the semiconductor layer 100 and the color filters 320. Further, an upper planarization layer 325 may be provided between the color filters 320 and the micro lenses 330. The lower and upper planarization layers 315 and 325 may be formed of a material with a refractive index higher than that of silicon oxide, and this makes it possible to improve light sensitivity of the image sensor. For example, the lower and upper planarization layers 315 and 325 may be formed of or include a material with a refractive index of about 1.4-4.0. For example, the lower and upper planarization layers 315 and 325 may be formed of aluminum oxide (Al2O3), cerium fluoride (CeF3), hafnium oxide (HfO2), indium tin oxide (ITO), magnesium oxide (MgO), tantalum pentoxide (Ta2O5), titanium dioxide (TiO2), zirconium dioxide (ZrO2), silicon (Si), germanium (Ge), zinc selenide (ZnSe), zinc sulfide (ZnS), or lead fluoride (PbF2). In another case, the lower and upper planarization layers 315 and 325 may be formed of an organic material with a high refractive index, for example, siloxane resin, benzocyclobutene (BCB), polyimide materials, acrylic materials, parylene C, poly(methyl methacrylate) (PMMA), or polyethylene terephthalate (PET). Further, the lower and upper planarization layers 315 and 325 may be formed of or include, for example, strontium titanate (SrTiO3), polycarbonate, glass, bromine, sapphire, cubic zirconia, potassium niobate (KNbO3), silicon carbide (SiC), gallium (III) phosphide (GaP), or gallium (III) arsenide (GaAs). -
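The sensitivity benefit of a high-index planarization layer can be illustrated with the normal-incidence Fresnel reflectance between two media, R = ((n1 - n2) / (n1 + n2))^2: the closer the layer's index is to that of silicon (n ≈ 4 in the visible range), the less light is reflected at the layer/silicon interface. This is a single-interface simplification that ignores thin-film interference, and the index values below are rough illustrative assumptions, not taken from the source:

```python
def fresnel_reflectance(n1, n2):
    # Fraction of normally incident light reflected at an interface
    # between media with refractive indices n1 and n2.
    return ((n1 - n2) / (n1 + n2)) ** 2

N_SILICON = 4.0  # rough visible-range refractive index of silicon

# Reflectance at the planarization-layer / silicon interface for a
# conventional oxide layer vs. a higher-index layer.
for name, n in [("silicon oxide (n=1.46)", 1.46),
                ("high-index layer (n=2.5)", 2.5)]:
    r = fresnel_reflectance(n, N_SILICON)
    print(f"{name}: {r:.1%} of the light reflected")
```

The higher-index layer loses several times less light to reflection, which is consistent with the stated sensitivity improvement.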
FIG. 3C is a sectional view of an image sensor according to another example embodiment of inventive concepts. Here, the sectional view of FIG. 3C may illustrate a portion of the image sensor taken along line I-I′ of FIG. 3A. - Referring to
FIGS. 3A and 3C, the image sensor may include the light-receiving region PHO and the light-blocking region BLA, and the light-receiving region PHO may include the effective image region EFF and the dummy region DUM. The image sensor may include the semiconductor layer 100, the interconnection layer 200, and the light-transmission layer 300. In an example embodiment, the semiconductor layer 100 may include first and second surfaces facing each other. Here, the interconnection layer 200 may be provided on the first surface of the semiconductor layer 100, and the light-transmission layer 300 may be provided on the second surface of the semiconductor layer 100. - As shown in
FIG. 3C, due to the presence of the light-blocking layer 310, there may be a height difference between the color filters 320 provided on the light-blocking and light-receiving regions BLA and PHO. Some of the color filters 320 on the staircase region SCR between the light-blocking and light-receiving regions BLA and PHO may be etched. - As such, the removal of the
color filters 320 on the staircase region SCR makes it possible to prevent technical problems, which may be caused by the curved bottom surface of the micro lens 330 on the staircase region SCR. - Except for the above-described differences, the image sensor of
FIG. 3C may be configured to have substantially the same features as that of FIG. 3B. Thus, a detailed description of the image sensor of FIG. 3C will be omitted. -
FIG. 3D is a sectional view of an image sensor according to still another example embodiment of inventive concepts. Here, the sectional view of FIG. 3D may illustrate a portion of the image sensor taken along line I-I′ of FIG. 3A. - Referring to
FIGS. 3A and 3D, the image sensor may include the light-receiving region PHO and the light-blocking region BLA, and the light-receiving region PHO may include the effective image region EFF and the dummy region DUM. The image sensor may include the semiconductor layer 100, the interconnection layer 200, and the light-transmission layer 300. - In an example embodiment, the
interconnection layer 200 may include first and second surfaces facing each other. Here, the semiconductor layer 100 may be provided on the first surface of the interconnection layer 200, and the light-transmission layer 300 may be provided on the second surface of the interconnection layer 200. - The light-
transmission layer 300 and the semiconductor layer 100 may be spaced apart from each other with the interconnection layer 200 interposed therebetween. In an example embodiment, light-guiding patterns 230 may be provided in the interconnection layer 200. Each of the light-guiding patterns 230 may be provided to face or overlap a corresponding one of the photoelectric conversion devices 110, when viewed in plan view, and thereby to guide the light incident from a corresponding one of the color filters 320 to the corresponding one of the photoelectric conversion devices 110. The light-guiding patterns 230 may be formed of or include a material whose refractive index is higher than that of the interlayered insulating layer 210 in the interconnection layer 200. The light-guiding patterns 230 may be formed of or include silicon oxynitride or silicon oxide. - The image sensor of
FIG. 3D may be configured to have substantially the same features as that of FIG. 3B, except for the above-described differences (e.g., the presence of the light-guiding patterns 230 and the arrangement of the semiconductor layer 100, the interconnection layer 200, and the light-transmission layer 300). Thus, a detailed description of the image sensor of FIG. 3D will be omitted. -
FIGS. 4 through 9 are sectional views illustrating a method of fabricating an image sensor, according to example embodiments of inventive concepts. - Referring to
FIG. 4, the semiconductor layer 100 may be provided to include the photoelectric conversion device 110, and the interconnection layer 200 may be provided on a surface of the semiconductor layer 100. - For example, the
semiconductor layer 100 may be provided to include a p-type epitaxial layer 115, which may be formed on a p-type bulk wafer. Hereinafter, an exposed surface of the p-type epitaxial layer 115 will be referred to as a first surface of the semiconductor layer 100, while an exposed surface of the p-type bulk wafer will be referred to as a second surface of the semiconductor layer 100. The semiconductor layer 100 may include the p-type bulk wafer and the p-type epitaxial layer 115 grown from the p-type bulk wafer, but example embodiments of inventive concepts are not limited thereto. The device isolation layer may be formed in the semiconductor layer 100 to define the active regions. The photoelectric conversion devices 110 may be formed in the active regions of the semiconductor layer 100, adjacent to the first surface. Each of the photoelectric conversion devices 110 may be provided in the form of a photo diode, a photo transistor, a photo gate, or a pinned photo diode (PPD). - The
interconnection layer 200 may be formed on the first surface of the semiconductor layer 100. For example, the formation of the interconnection layer 200 may include forming the interlayered insulating layer 210, depositing a metal layer on the interlayered insulating layer 210, and then patterning the metal layer to form the metal lines 220. The metal lines 220 may be connected to each other or to a control device on the semiconductor layer 100 through, for example, contact plugs (not shown). In another example embodiment, the interconnection layer 200 may be formed using a damascene process. For example, the formation of the interconnection layer 200 may include patterning the interlayered insulating layer 210, depositing a metal layer on the patterned interlayered insulating layer 210, and then performing a planarization process to form the metal lines 220 and/or the contact plugs. The metal lines 220 may be formed of, for example, copper (Cu), aluminum (Al), tungsten (W), titanium (Ti), molybdenum (Mo), tantalum (Ta), titanium nitride (TiN), tantalum nitride (TaN), zirconium nitride (ZrN), tungsten nitride (WN), or any alloys thereof. - In an example embodiment, the insulating
layer 120 may be formed on the second surface of the semiconductor layer 100. The insulating layer 120 may be formed of or include an optically-transparent material (e.g., silicon oxide). - A supporting substrate (not shown) may be bonded on the
interconnection layer 200. In a subsequent thinning process of the semiconductor layer 100, the supporting substrate may support the semiconductor layer 100 and prevent devices formed on the semiconductor layer 100 from being deformed. A bulk wafer or a plastic substrate may be used as the supporting substrate. Thereafter, the thinning process may be performed on the semiconductor layer 100 to reduce a thickness of the semiconductor layer 100. In the case where the image sensor is configured in such a way that light is incident into the semiconductor layer 100 through the second surface thereof, the larger the thickness of the semiconductor layer 100, the larger the loss of the incident light. In this respect, a propagation length of the light to be incident into the photoelectric conversion devices 110 can be reduced by thinning the semiconductor layer 100, and this makes it possible to improve light sensitivity of the photoelectric conversion devices 110. Further, since a penetration depth of the incident light in the semiconductor layer 100 varies depending on the wavelength of the incident light, the thinning of the semiconductor layer 100 may be controlled in consideration of the wavelength of the incident light. - The thinning process of the
semiconductor layer 100 may include grinding or polishing the bulk wafer, and then anisotropically or isotropically etching the bulk wafer. For example, a grinder or chemical-mechanical polishing (CMP) apparatus may be used to mechanically remove a portion of the bulk wafer, and then an anisotropic or isotropic etching process may be performed to precisely adjust a final thickness of the semiconductor layer 100. For example, the etching process of the semiconductor layer 100 may be performed, in a wet etching manner, using a mixed solution containing hydrofluoric acid (HF), nitric acid (HNO3), and acetic acid (CH3COOH). - Referring to
FIG. 5, the light-blocking layer 310 may be formed on the light-blocking region BLA. - The light-
blocking layer 310 may be formed on an insulating layer to overlap the light-blocking region BLA, when viewed in plan view. The light-blocking layer 310 may be formed of a metal-containing layer (e.g., of copper). - In an example embodiment, the
lower planarization layer 315 may be formed on the light-blocking layer 310 and the insulating layer. The lower planarization layer 315 may be formed of or include an optically-transparent material (e.g., silicon oxide). Due to the presence of the light-blocking layer 310, the lower planarization layer 315 may have a stepwise profile on the staircase region SCR between the light-blocking and light-receiving regions BLA and PHO. - The
lower planarization layer 315 may be formed of a material with a refractive index higher than that of silicon oxide, and this makes it possible to improve light sensitivity of the image sensor. In an example embodiment, the lower planarization layer 315 may be formed of or include a material with a refractive index of about 1.4-4.0. For example, the lower planarization layer 315 may be formed of Al2O3, CeF3, HfO2, ITO, MgO, Ta2O5, TiO2, ZrO2, Si, Ge, ZnSe, ZnS, or PbF2. In another example embodiment, the lower planarization layer 315 may be formed of an organic material with a high refractive index, for example, siloxane resin, benzocyclobutene (BCB), polyimide materials, acrylic materials, parylene C, poly(methyl methacrylate) (PMMA), or polyethylene terephthalate (PET). - Referring to
FIG. 6, the color filters 320 may be formed on the lower planarization layer 315, and some of the color filters 320 may be removed from the staircase region SCR between the light-blocking and light-receiving regions BLA and PHO. In an example embodiment, some of the color filters 320 located on or near the staircase region SCR between the light-receiving region PHO and the light-blocking region BLA may be partially or wholly removed to expose a portion of the lower planarization layer 315. - Each of the
color filters 320 may be formed to face or overlap a corresponding one of the photoelectric conversion devices 110, when viewed in plan view. The color filters 320 may be formed using a dyeing process, a pigment dispersion process, a printing process, or the like. The respective color filters 320 may be formed of a photoresist layer dyed with a color corresponding to the respective unit pixels. For example, each of the color filters 320 may be formed to display any one of red, green, and blue. Alternatively, each of the color filters 320 may be formed to display any one of cyan, magenta, and yellow. The color filters 320 may be two-dimensionally disposed to have the same or similar arrangement as that of the photoelectric conversion devices 110. In some example embodiments, the color filters 320 may be disposed to form a Bayer-type RGB pixel arrangement. - Hereinafter, the formation and partial removal of the
color filters 320 will be described in more detail with reference to FIGS. 7A through 7C. -
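The Bayer-type RGB arrangement mentioned above tiles a 2x2 cell of one red, two green, and one blue filter across the pixel array. A minimal sketch (the RGGB cell order is one common convention, assumed here rather than specified by the source):

```python
import numpy as np

def bayer_pattern(rows, cols):
    """Return an array of color-filter labels in a Bayer RGGB tiling."""
    # 2x2 unit cell: green appears twice per cell, matching the eye's
    # higher sensitivity to green light.
    cell = np.array([["R", "G"],
                     ["G", "B"]])
    reps = (rows // 2 + 1, cols // 2 + 1)
    return np.tile(cell, reps)[:rows, :cols]

print(bayer_pattern(4, 4))
```

Half of the filters are green, a quarter red, and a quarter blue; demosaicing downstream interpolates the two missing colors at every pixel.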
FIGS. 7A through 7C are sectional views illustrating a process of forming color filters, according to example embodiments of inventive concepts. - Referring to
FIG. 7A, a photoresist layer 316 may be formed on the lower planarization layer 315. Due to the light-blocking layer 310 on the light-blocking region BLA, the photoresist layer 316 may be formed to have a stepwise structure on the staircase region SCR. In detail, the lower planarization layer 315 may be formed to have the stepwise structure because of the presence of the light-blocking layer 310 locally disposed on the light-blocking region BLA, and the photoresist layer 316 may be formed to have the stepwise structure due to the stepwise structure of the lower planarization layer 315. - Referring to
FIG. 7B, an exposing and developing process may be performed on the photoresist layer 316 to form photoresist patterns 317, each of which faces a corresponding one of the unit pixels. In an example embodiment, the exposing and developing process may be performed on a portion of the photoresist layer 316 located on the staircase region SCR, and thus the portion of the photoresist layer 316 located on the staircase region SCR may be removed. In this case, the removal of the color filters 320 may be achieved without any additional etching process. - Referring to
FIG. 7C, a dyeing process may be performed on the photoresist patterns 317 to form the color filters 320. Each of the photoresist patterns 317 may be dyed to have a specific color in accordance with a corresponding pixel. -
FIGS. 8A through 8D are sectional views illustrating a process of forming color filters, according to other example embodiments of inventive concepts. - Referring to
FIG. 8A, the photoresist layer 316 may be formed on the lower planarization layer 315. - Referring to
FIG. 8B, an exposing and developing process may be performed on the photoresist layer 316 to form first photoresist patterns 318, each of which faces a corresponding one of the unit pixels. According to FIG. 8B, unlike that of FIG. 7B, the exposing and developing process may not be performed on a portion of the photoresist layer 316 located on the staircase region SCR. - Referring to
FIG. 8C, a mask pattern 410 may be formed on the first photoresist patterns 318, and the first photoresist patterns 318 may be partially removed using the mask pattern 410 to form second photoresist patterns 319. - For example, the
mask pattern 410 may be formed to partially expose the first photoresist patterns 318 positioned on or near the staircase region SCR between the light-receiving region PHO and the light-blocking region BLA. The exposed portions of the first photoresist patterns 318 may be wholly or partially etched. After the etching process, the mask pattern 410 may be removed. The use of the mask pattern 410 makes it easy to partially remove the exposed portions of the first photoresist patterns 318, thereby allowing a portion of the first photoresist patterns 318 to remain on the staircase region SCR. - Referring to
FIG. 8D, each of the second photoresist patterns 319 may be dyed with a specific color to form the color filters 320. - In an example embodiment, the dyeing process may be performed after the formation of the
first photoresist patterns 318 and before the formation of the mask pattern 410. - Referring to
FIG. 9, the micro lenses 330 may be formed on the color filters 320, after the etching process of the first photoresist patterns 318. - Each of the
micro lenses 330 may be formed on a corresponding one of the color filters 320. The micro lenses 330 may be formed using an optically-transparent photoresist (not shown). For example, the formation of the micro lenses 330 may include forming photoresist patterns (not shown) on the photoelectric conversion devices 110, respectively, and then performing a reflow or etching process on the photoresist patterns. Accordingly, the micro lenses 330 may be formed to have an upward-convex shape with a specific curvature radius. - If the
color filters 320 on the staircase region SCR are not etched, the micro lenses 330 on the staircase region SCR may have bent or curved bottom surfaces, and this makes it difficult to properly and effectively focus the incident light on the photoelectric conversion device 110. By contrast, according to example embodiments of inventive concepts, some of the color filters 320 may be partially or wholly removed from the staircase region SCR, and thus the micro lenses 330 may be formed to have flat bottom surfaces. Accordingly, it is possible to properly and effectively focus the incident light on the photoelectric conversion device 110. As a result, the image sensor can be configured to have high image quality. - Subsequently, a cleaning process may be performed to remove residues from surfaces of the
micro lenses 330. Further, a bake process may be performed to improve structural stability of the micro lenses 330. - In an example embodiment, the
upper planarization layer 325 may be formed on the color filters 320. The upper planarization layer 325 may be formed of an optically-transparent material (e.g., polyimide or polyacrylic materials). - The process of fabricating the image sensor of
FIGS. 3A and 3B has been described above by way of example. The image sensor of FIG. 3C may be realized by adjusting an etching condition of the etching process described with reference to FIGS. 7A and 7B to remove some of the color filters. The image sensor of FIG. 3D may be realized by changing positions of the semiconductor layer and the interconnection layer. Nevertheless, example embodiments of inventive concepts may not be limited to the afore-described examples of the fabricating process. -
FIG. 10 is a schematic block diagram illustrating a processor-based system including the image sensor according to example embodiments of inventive concepts. - Referring to
FIG. 10, the processor-based system 1000 is a system that processes output images of an image sensor 1100. - The
system 1000 may include one of a computer system, a camera system, a scanner, a mechanical clock system, a navigation system, a video phone, a monitoring system, an automatic focus system, a tracking system, an operation monitoring system, and an image stabilizing system. However, example embodiments are not limited thereto. - The processor-based
system 1000, such as a computer system, may include a central processing unit (CPU) 1200, such as a microprocessor, capable of communicating with an I/O device 1300 via a bus 1001. The image sensor 1100 may communicate with the CPU 1200 and/or the I/O device 1300 via the bus 1001 or another communication link. The processor-based system 1000 may further include a RAM 1400 and/or a port 1500 capable of communicating with the CPU 1200 through the bus 1001. - The
port 1500 may be coupled with a video card, a sound card, a memory card, a USB device, or the like. Further, the port 1500 may be connected to an additional system to carry out data communication with the additional system. The image sensor 1100 may be integrated with a CPU, a digital signal processor (DSP), or a microprocessor. Moreover, the image sensor 1100 may be integrated with a memory. Alternatively, the image sensor 1100 may be integrated in a chip different from that of a processor. -
FIG. 11 is a perspective view illustrating an electronic product including an image sensor according to an example embodiment of inventive concepts. - Referring to
FIG. 11, the image sensor according to example embodiments of inventive concepts may be applicable to a mobile phone 2000. Further, the image sensor according to the embodiment may also be applicable to cameras, camcorders, personal digital assistants (PDAs), wireless phones, laptop computers, optical mice, facsimile machines, or copying machines. In addition, the image sensor according to the embodiment may also be installed in telescopes, mobile phone handsets, scanners, endoscopes, fingerprint recognition systems, toys, game machines, household robots, or automobiles. - According to example embodiments of inventive concepts, some of the color filters may be removed from the staircase region between the light-receiving and light-blocking regions, and thus the micro lenses can have flat bottom surfaces. Accordingly, it is possible to prevent the incident light from being deflected by the micro lenses, and thus the image sensor can exhibit reduced color distortion.
- While example embodiments of inventive concepts have been particularly shown and described, it will be understood by one of ordinary skill in the art that variations in form and detail may be made therein without departing from the spirit and scope of the attached claims.
Claims (9)
1-8. (canceled)
9. A method of fabricating an image sensor, comprising:
forming photoelectric conversion devices in a semiconductor layer, the semiconductor layer having a light-receiving region and a light-blocking region;
forming a light-blocking layer on the semiconductor layer to cover the light-blocking region;
forming color filters on the light-blocking layer to face the photoelectric conversion devices, respectively;
removing at least a portion of the color filters; and
forming micro lenses on the remaining color filters, respectively.
10. The method of claim 9 , wherein
the light-blocking region surrounds the light-receiving region,
the light-receiving region includes an image region and a dummy region, and
the dummy region is between the image region and the light-blocking region.
11. The method of claim 10 , wherein the removing the portion of the color filters includes:
forming a photoresist layer on the semiconductor layer and the light-blocking layer;
exposing and developing the photoresist layer to form photoresist patterns on pixels, respectively, to form the color filters and to remove the portion of the color filters; and
dyeing the photoresist patterns.
12. The method of claim 10 , wherein the removing the portion of the color filters includes:
forming a photoresist layer on the semiconductor layer and the light-blocking layer;
exposing and developing the photoresist layer to form photoresist patterns on pixels, respectively;
dyeing the photoresist patterns to form the color filters;
forming a mask pattern on the color filters; and
removing color filters exposed by the mask pattern.
13. The method of claim 9, further comprising:
forming a planarization layer between the light-blocking layer and the color filters, wherein
the removing the portion of the color filters exposes a top surface of the planarization layer.
14. The method of claim 9, further comprising:
forming an interconnection layer on the semiconductor layer, the semiconductor layer between the interconnection layer and the light-blocking layer.
15. The method of claim 9, further comprising:
forming an interconnection layer between the light-blocking layer and the semiconductor layer.
16-20. (canceled)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/372,999 US20170092687A1 (en) | 2013-11-14 | 2016-12-08 | Image sensor and method of fabricating the same |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020130138429A KR102149772B1 (en) | 2013-11-14 | 2013-11-14 | Image sensor and method of manufacturing the same |
KR10-2013-0138429 | 2013-11-14 | ||
US14/465,062 US9559140B2 (en) | 2013-11-14 | 2014-08-21 | Image sensor and method of fabricating the same |
US15/372,999 US20170092687A1 (en) | 2013-11-14 | 2016-12-08 | Image sensor and method of fabricating the same |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/465,062 Division US9559140B2 (en) | 2013-11-14 | 2014-08-21 | Image sensor and method of fabricating the same |
Publications (1)
Publication Number | Publication Date |
---|---|
US20170092687A1 true US20170092687A1 (en) | 2017-03-30 |
Family
ID=53043051
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/465,062 Active US9559140B2 (en) | 2013-11-14 | 2014-08-21 | Image sensor and method of fabricating the same |
US15/372,999 Abandoned US20170092687A1 (en) | 2013-11-14 | 2016-12-08 | Image sensor and method of fabricating the same |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/465,062 Active US9559140B2 (en) | 2013-11-14 | 2014-08-21 | Image sensor and method of fabricating the same |
Country Status (2)
Country | Link |
---|---|
US (2) | US9559140B2 (en) |
KR (1) | KR102149772B1 (en) |
Families Citing this family (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10431624B2 (en) * | 2015-07-08 | 2019-10-01 | Samsung Electronics Co., Ltd. | Method of manufacturing image sensor including nanostructure color filter |
TWI581016B (en) * | 2016-04-15 | 2017-05-01 | 奇景光電股份有限公司 | Color-filter stack structure and method for fabricating the same |
KR102318195B1 (en) * | 2016-07-08 | 2021-10-26 | 삼성전자주식회사 | Method for fabricating image sensor |
JP6818468B2 (en) * | 2016-08-25 | 2021-01-20 | キヤノン株式会社 | Photoelectric converter and camera |
KR20200084719A (en) * | 2019-01-03 | 2020-07-13 | 삼성전자주식회사 | Image sensor and Method of fabricating the same |
KR102393910B1 (en) * | 2019-03-22 | 2022-05-03 | 아크소프트 코포레이션 리미티드 | Tiled image sensor |
US11635786B2 (en) * | 2020-06-11 | 2023-04-25 | Apple Inc. | Electronic optical sensing device |
CN117015855A (en) * | 2021-03-10 | 2023-11-07 | 索尼半导体解决方案公司 | Light detecting element |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5677202A (en) * | 1995-11-20 | 1997-10-14 | Eastman Kodak Company | Method for making planar color filter array for image sensors with embedded color filter arrays |
US20050130071A1 (en) * | 2003-12-11 | 2005-06-16 | Ju-Il Lee | Method for fabricating image sensor with inorganic microlens |
US20070238034A1 (en) * | 2006-04-07 | 2007-10-11 | Micron Technology, Inc. | Color filter array and imaging device containing such color filter array and method of fabrication |
US7695995B2 (en) * | 2006-09-26 | 2010-04-13 | Dongbu Hitek Co., Ltd. | Image sensor and method of fabricating the same |
US20120199930A1 (en) * | 2011-02-08 | 2012-08-09 | Sony Corporation | Solid-state imaging device, manufacturing method thereof, and electronic apparatus |
Family Cites Families (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
TW369726B (en) * | 1998-05-04 | 1999-09-11 | United Microelectronics Corp | Structure and producing method of microlens on color filter of sensor device |
JP2001196571A (en) | 2000-01-07 | 2001-07-19 | Sony Corp | Solid-state image pickup device |
US6838715B1 (en) | 2002-04-30 | 2005-01-04 | Ess Technology, Inc. | CMOS image sensor arrangement with reduced pixel light shadowing |
JP4485151B2 (en) | 2003-05-30 | 2010-06-16 | パナソニック株式会社 | Solid-state imaging device manufacturing method and solid-state imaging device. |
KR20050094283A (en) | 2004-03-22 | 2005-09-27 | 엘지.필립스 엘시디 주식회사 | Color filter substrate and fabrication method therefor |
US7294818B2 (en) | 2004-08-24 | 2007-11-13 | Canon Kabushiki Kaisha | Solid state image pickup device and image pickup system comprising it |
US7432491B2 (en) | 2005-05-06 | 2008-10-07 | Micron Technology, Inc. | Pixel with spatially varying sensor positions |
US7315014B2 (en) | 2005-08-30 | 2008-01-01 | Micron Technology, Inc. | Image sensors with optical trench |
US8599301B2 (en) * | 2006-04-17 | 2013-12-03 | Omnivision Technologies, Inc. | Arrayed imaging systems having improved alignment and associated methods |
JP4346655B2 (en) | 2007-05-15 | 2009-10-21 | 株式会社東芝 | Semiconductor device |
JP4735643B2 (en) | 2007-12-28 | 2011-07-27 | ソニー株式会社 | Solid-state imaging device, camera and electronic device |
JP5422914B2 (en) * | 2008-05-12 | 2014-02-19 | ソニー株式会社 | Method for manufacturing solid-state imaging device |
KR101458052B1 (en) | 2008-06-12 | 2014-11-06 | 삼성전자주식회사 | Cmos image sensor having preventing crosstalk structure and method for manufacturing the same |
KR20100037210A (en) | 2008-10-01 | 2010-04-09 | 주식회사 동부하이텍 | Image sensor and fabricating method thereof |
JP5493461B2 (en) | 2009-05-12 | 2014-05-14 | ソニー株式会社 | Solid-state imaging device, electronic apparatus, and manufacturing method of solid-state imaging device |
KR20110007408A (en) * | 2009-07-16 | 2011-01-24 | 삼성전자주식회사 | Semiconductor device having optical filter for the single chip three-dimension color image sensor and method for manufacturing same |
KR101688084B1 (en) * | 2010-06-30 | 2016-12-20 | 삼성전자주식회사 | An image sensor and package comprising the same |
KR20140010553A (en) * | 2012-07-13 | 2014-01-27 | 삼성전자주식회사 | Pixel array, image sensor having the same, and method for compensating local dark current |
- 2013-11-14: KR application KR1020130138429A, patent KR102149772B1 (active, IP Right Grant)
- 2014-08-21: US application US14/465,062, patent US9559140B2 (active)
- 2016-12-08: US application US15/372,999, publication US20170092687A1 (abandoned)
Also Published As
Publication number | Publication date |
---|---|
KR102149772B1 (en) | 2020-08-31 |
US20150130005A1 (en) | 2015-05-14 |
US9559140B2 (en) | 2017-01-31 |
KR20150055887A (en) | 2015-05-22 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9559140B2 (en) | Image sensor and method of fabricating the same | |
US9356067B2 (en) | Image sensors including a gate electrode surrounding a floating diffusion region | |
US11056523B2 (en) | Optical sensors including a light-impeding pattern | |
US9240512B2 (en) | Image sensors having transfer gate electrodes in trench | |
US8970768B2 (en) | Unit pixel array and image sensor having the same | |
KR20120001895A (en) | An image sensor and package comprising the same | |
EP2657972A1 (en) | Solid-state imaging device, method of manufacturing the same, and electronic apparatus | |
US20160056198A1 (en) | Complementary metal-oxide-semiconductor image sensors | |
US8941199B2 (en) | Image sensors | |
US11323667B2 (en) | Image sensor including a transparent conductive layer in a trench | |
US20110057279A1 (en) | Anti-reflective image sensor | |
US20150155328A1 (en) | Image sensor | |
US9686457B2 (en) | High efficiency image sensor pixels with deep trench isolation structures and embedded reflectors | |
US11398513B2 (en) | Image sensor | |
US20170040364A1 (en) | Image sensors and image processing devices including the same | |
US9515120B2 (en) | Image sensor | |
JP2009259934A (en) | Solid-state imaging device | |
US20110205410A1 (en) | Image sensor having color filters | |
US10186543B2 (en) | Image sensor including phase difference detectors | |
US20100128155A1 (en) | Image sensors and methods of manufacturing the same | |
KR20160032584A (en) | Image sensor including microlenses having high refractive index | |
US9263495B2 (en) | Image sensor and fabricating method thereof | |
US9219094B2 (en) | Backside illuminated image sensor | |
US11652125B2 (en) | Image sensor | |
US20220231074A1 (en) | Image sensor |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |