WO2022190653A1 - Imaging element, imaging device, and method for manufacturing imaging element - Google Patents
Imaging element, imaging device, and method for manufacturing imaging element
- Publication number
- WO2022190653A1 (PCT/JP2022/001700)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- lens
- imaging device
- imaging
- image sensor
- pixel substrate
- Prior art date
Classifications
- H01L27/14634—Imager structures: assemblies, i.e. hybrid structures
- H01L27/14618—Imager structures: containers
- H01L27/1462—Imager structures: coatings
- H01L27/14625—Imager structures: optical elements or arrangements associated with the device
- H01L27/14627—Imager structures: microlenses
- H01L27/14636—Imager structures: interconnect structures
- H01L27/14685—Imager manufacture: process for coatings or optical elements
- G02B13/001—Miniaturised objectives for electronic devices, e.g. portable telephones, webcams, PDAs, small digital cameras
- G02B13/0055—Miniaturised objectives employing a special optical element
- G02B27/0037—Optical correction (e.g. distortion, aberration) with diffracting elements
- G02B27/0056—Correction of secondary colour or higher-order chromatic aberrations by using a diffractive optical element
Definitions
- The present disclosure relates to an imaging element, an imaging device, and a method for manufacturing an imaging element.
- In a general imaging device, photographing light is focused on an image sensor (for example, a CMOS image sensor) using the geometrical-optics refraction of a lens.
- Patent Document 1 discloses an optical lens that utilizes light diffraction.
- A compact lens configuration may cause problems in optical characteristics (e.g., chromatic aberration) that need to be suppressed.
- the present disclosure has been made in view of the circumstances described above, and provides a technology that is advantageous for acquiring high-quality images with a small device configuration.
- One aspect of the present disclosure is an imaging element comprising a pixel substrate including an image sensor, a translucent cover body facing the image sensor, and a diffraction lens having a plurality of projecting lens portions projecting from the cover body toward the image sensor, wherein a space is provided between the plurality of projecting lens portions.
- The imaging element may include a photocurable resin film positioned between the image sensor and the diffractive lens and in contact with the plurality of projecting lens portions.
- the imaging element may be provided with an inorganic film positioned between the image sensor and the diffraction lens and in contact with the plurality of projecting lens portions.
- the imaging device may comprise a plurality of lens-constituting layers that are superimposed on each other, and each of the plurality of lens-constituting layers may include a cover body and a diffractive lens.
- The diffractive lens may be positioned at a distance of 60 μm or less from the image sensor.
- The planar sizes of the plurality of projecting lens portions may change periodically with the distance from the optical axis, and within each period the planar size of the projecting lens portions may decrease with increasing distance from the optical axis.
- Alternatively, the planar sizes of the plurality of projecting lens portions may change periodically with the distance from the optical axis, and within each period the planar size of the projecting lens portions may increase with increasing distance from the optical axis.
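The claims do not give a formula, but periodic feature sizes that shrink radially within each period are characteristic of a Fresnel zone plate, whose zone boundaries follow the standard relation r_n = sqrt(n·λ·f). A minimal Python sketch under that assumption (all design values are hypothetical, not taken from the disclosure):

```python
import math

def zone_radii(focal_length_m, wavelength_m, n_zones):
    # Fresnel zone-plate boundaries: r_n = sqrt(n * lambda * f).
    return [math.sqrt(n * wavelength_m * focal_length_m)
            for n in range(1, n_zones + 1)]

# Hypothetical design: 60 um focal distance, green light (550 nm).
radii = zone_radii(focal_length_m=60e-6, wavelength_m=550e-9, n_zones=5)
widths = [radii[0]] + [r2 - r1 for r1, r2 in zip(radii, radii[1:])]

# Zone widths shrink monotonically toward the rim, i.e. the local
# feature size decreases with distance from the optical axis.
assert all(w2 < w1 for w1, w2 in zip(widths, widths[1:]))
```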
- The imaging element may be provided with a fixing portion that is positioned between the pixel substrate and the cover body and fixes the cover body to the pixel substrate.
- a space may be provided between the image sensor and the diffraction lens.
- The imaging element may include a support that supports the pixel substrate, and a fixing portion that is positioned between the support and the cover body and fixes the cover body to the support.
- Another aspect of the present disclosure is an imaging device including a pixel substrate including an image sensor, a translucent cover body facing the image sensor, a diffraction lens having a plurality of projecting lens portions projecting from the cover body toward the image sensor, and an imaging lens located on the opposite side of the cover body from the pixel substrate, wherein a space is formed between the plurality of projecting lens portions.
- the diffractive lens may reduce chromatic aberration of the imaging lens.
- the diffractive lens may emit light at a principal ray incident angle smaller than the principal ray incident angle of the light directed from the imaging lens to the diffractive lens.
- Another aspect of the present disclosure relates to a method for manufacturing an imaging element, including a step of fixing a translucent cover body to a pixel substrate having an image sensor, in which a plurality of protruding lens portions constituting a diffractive lens are formed on the cover body with spaces between them, and the cover body is fixed to the pixel substrate so that the plurality of protruding lens portions are positioned between the cover body and the pixel substrate.
- The method for manufacturing the imaging element may include a step of applying a photocurable resin onto the pixel substrate and a step of curing, by light irradiation, the portion of the photocurable resin that covers the image sensor, and the cover body may be fixed to the pixel substrate with the plurality of protruding lens portions facing the light-cured portion of the resin.
- The method for manufacturing the imaging element may include a step of forming an inorganic film on the pixel substrate, and the cover body may be fixed to the pixel substrate with the plurality of projecting lens portions facing the inorganic film.
- FIG. 1 is a diagram illustrating focal points of short wavelength light and long wavelength light that have passed through a unit including a plurality of geometrical optics lenses.
- FIG. 2 is a diagram illustrating focal points of short wavelength light and long wavelength light passing through a diffractive lens using optical diffraction.
- FIG. 3 is a drawing illustrating focal points of short wavelength light and long wavelength light that have passed through an optical lens system including a geometrical optics lens and a diffractive lens.
- FIG. 4 is a cross-sectional view showing an example of an imaging device according to the first embodiment.
- FIG. 5 is a cross-sectional view showing an enlarged part of the imaging element shown in FIG. 4.
- FIG. 6 is a cross-sectional view showing an example of the structure of the lower substrate and the upper substrate, showing an enlarged part of the imaging element.
- FIG. 7 is a cross-sectional view showing an example of an imaging device including a geometrical optics lens and an imaging device.
- FIG. 8 is a cross-sectional view showing an enlarged part of the imaging element shown in FIG. 7.
- FIG. 9 is a cross-sectional view showing an enlarged part of the imaging element, and is a diagram for exemplifying a case where color mixture occurs between adjacent image sensors.
- FIG. 10 is a cross-sectional view showing an enlarged part of the image pickup device, and is a diagram illustrating a case where the diffractive lens refracts photographing light toward an appropriate image sensor to prevent color mixture.
- FIG. 11A is a diagram for explaining an example of a method of manufacturing a diffractive lens.
- FIG. 11B is a diagram for explaining an example of a method of manufacturing a diffractive lens.
- FIG. 11C is a diagram for explaining an example of a method of manufacturing a diffractive lens.
- FIG. 11D is a diagram for explaining an example of a method of manufacturing a diffractive lens.
- FIG. 11E is a diagram for explaining an example of a method of manufacturing a diffractive lens.
- FIG. 12 is a perspective view showing a plurality of diffractive lenses formed on the cover wafer.
- FIG. 13A is a perspective view showing an example of a method for manufacturing an image sensor.
- FIG. 13B is a perspective view showing an example of a method for manufacturing an image sensor.
- FIG. 13C is a perspective view showing an example of a method for manufacturing an image sensor.
- FIG. 13D is a perspective view showing an example of a method for manufacturing an image sensor.
- FIG. 14 is a cross-sectional view of the imaging device showing an example of the method of manufacturing the imaging device.
- FIG. 15 is a cross-sectional view of the imaging device showing an example of the method of manufacturing the imaging device.
- FIG. 16 is a cross-sectional view of the imaging device showing an example of the method of manufacturing the imaging device.
- FIG. 17 is a cross-sectional view showing another example of the method of manufacturing the imaging device.
- FIG. 18 is a cross-sectional view showing an enlarged part of the imaging device manufactured by the manufacturing method shown in FIG. 17.
- FIG. 19 is a cross-sectional view showing an example of an imaging device according to the second embodiment.
- FIG. 20 is a cross-sectional view showing an example of an imaging device according to the third embodiment.
- FIG. 21 is a cross-sectional view showing an example of an imaging device according to the fourth embodiment.
- FIG. 22 is a cross-sectional view showing an example of an imaging device according to the fifth embodiment.
- FIG. 23 is a cross-sectional view showing an example of an imaging device according to the sixth embodiment.
- FIG. 24 is a perspective view showing a structural example of a diffractive lens.
- FIG. 25 is an enlarged plan view showing an outline of an example of a diffractive lens.
- FIG. 26A is a diagram for explaining the phase difference of light diffraction of the diffractive lens (plurality of projecting lens portions).
- FIG. 26B is a diagram for explaining the refraction angle of light diffraction of the diffractive lens (plurality of projecting lens portions).
- FIG. 26C is a diagram for explaining the refraction angle and focal length of light diffraction of the diffractive lens (plurality of projecting lens portions).
- FIG. 27 is a plan view showing an arrangement example of a plurality of projecting lens portions that constitute one diffractive lens.
- Also shown are: a block diagram of an example of a schematic configuration of a vehicle control system; an explanatory diagram of an example of installation positions of an outside-vehicle information detection unit and an imaging unit; a diagram of an example of a schematic configuration of an endoscopic surgery system; and a block diagram of an example of the functional configurations of a camera head and a CCU.
- FIG. 1 is a diagram illustrating the focal points of short-wavelength light L1 and long-wavelength light L2 that have passed through a unit including a plurality of geometrical-optical lenses (hereinafter also simply referred to as "geometrical-optical lens 21").
- The geometrical-optics lens 21, which uses refraction, exhibits a smaller refractive index and a longer focal length as the wavelength of light increases (see "long wavelength light L2" in FIG. 1). Conversely, it exhibits a larger refractive index and a shorter focal length as the wavelength becomes shorter (see "short wavelength light L1" in FIG. 1).
- When the photographing light is condensed onto the image sensor by the geometrical-optics lens 21 having the above characteristics, it is necessary to combine a plurality of lenses in order to suppress chromatic aberration.
- FIG. 2 is a diagram illustrating the focal points of the short wavelength light L1 and the long wavelength light L2 that have passed through the diffraction lens 22 that utilizes optical diffraction.
- The diffraction lens 22 exhibits a larger refractive index and a shorter focal length as the wavelength of light becomes longer (see "long wavelength light L2" in FIG. 2). Conversely, it exhibits a smaller refractive index and a longer focal length as the wavelength becomes shorter (see "short wavelength light L1" in FIG. 2).
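This opposite wavelength dependence follows from the standard first-order relation for a diffractive lens, f(λ) = f₀·λ₀/λ: focal length shortens as wavelength grows. A short sketch with hypothetical design values (none are taken from the disclosure):

```python
def diffractive_focal_length(f_design_m, lambda_design_m, lambda_m, order=1):
    # First-order diffractive lens: f(lambda) = f0 * lambda0 / (m * lambda).
    return f_design_m * lambda_design_m / (order * lambda_m)

f_green = 60e-6   # hypothetical design focal length at 550 nm
f_blue = diffractive_focal_length(f_green, 550e-9, 450e-9)
f_red = diffractive_focal_length(f_green, 550e-9, 650e-9)

# Long wavelengths focus closer and short wavelengths farther, the
# opposite of the refractive lens behavior illustrated in FIG. 1.
assert f_red < f_green < f_blue
```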
- The geometrical-optics lens 21 and the diffractive lens 22 thus exhibit mutually opposite refraction characteristics with respect to the wavelength of incident light. Therefore, by combining the geometrical-optics lens 21 and the diffractive lens 22, it is possible to effectively reduce chromatic aberration while suppressing enlargement of the optical lens system in the direction along the optical axis Ax (hereinafter also referred to as the "optical axis direction").
- FIG. 3 is a diagram illustrating the focal points of the short wavelength light L1 and the long wavelength light L2 that have passed through an optical lens system including the geometrical optics lens 21 and the diffraction lens 22.
- The diffractive lens 22 is attached to the surface of the lens closest to the subject among the lenses of the geometrical-optics lens 21.
- As a result, the focal point of the short wavelength light L1 and the focal point of the long wavelength light L2 can be matched or brought close to each other while suppressing expansion of the optical lens system in the optical axis direction.
- the optical characteristics of the entire optical lens system can be improved, but the installation of the diffraction lens 22 is not necessarily easy.
- High optical performance can also be achieved by increasing the number of lenses included in the geometrical-optics lens 21, as described above.
- For imaging devices mounted on mobile terminals such as smartphones, there is a demand for smaller, thinner, and lighter optical lens systems from the viewpoint of improving mobility.
- An increase in the number of lenses for improving the performance of an optical lens system and a size reduction for improving mobility are mutually contradictory demands.
- When the optical lens system is made smaller and thinner, the incident angle of light on the image sensor tends to increase, so that the light-receiving sensitivity decreases and the quality of the captured image deteriorates.
- As the principal ray incident angle approaches 0°, the traveling direction of the light toward the image sensor approaches the optical axis direction.
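The way a diffractive surface bends an oblique chief ray toward the axis can be sketched with the grating equation sin θ_out = sin θ_in − mλ/Λ, where Λ is the local period of the projecting lens portions. The period, wavelength, and incidence angle below are illustrative assumptions:

```python
import math

def diffracted_angle_deg(theta_in_deg, wavelength_m, period_m, order=1):
    # Grating equation: sin(theta_out) = sin(theta_in) - m * lambda / period.
    s = math.sin(math.radians(theta_in_deg)) - order * wavelength_m / period_m
    return math.degrees(math.asin(s))

# A chief ray arriving at 30 deg through a local grating period of 2 um
# at 550 nm leaves at a smaller angle, i.e. bent toward the optical axis
# (a reduced principal ray incident angle at the image sensor).
theta_out = diffracted_angle_deg(30.0, 550e-9, 2e-6)
assert 0.0 <= theta_out < 30.0
```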
- FIG. 4 is a cross-sectional view showing an example of the imaging element 11 according to the first embodiment.
- FIG. 5 is a cross-sectional view showing an enlarged part of the imaging element 11 shown in FIG.
- the imaging element 11 shown in FIGS. 4 and 5 is a semiconductor package in which a pixel substrate 33 including a laminated lower substrate 31 and an upper substrate 32 is packaged.
- the imaging device 11 receives imaging light traveling from top to bottom in FIG. 4, converts the imaging light into an electrical signal, and outputs the electrical signal (that is, image data).
- a plurality of solder balls 34 are formed on the lower substrate 31 as back electrodes for electrical connection with an external substrate (not shown).
- R (red), G (green), and B (blue) color filters 35 and on-chip lenses 36 covering the color filters 35 are provided on the upper surface of the upper substrate 32.
- A cover body 38 is fixed to the upper substrate 32 via a sealing resin 37.
- The sealing resin 37 functions as an adhesive layer that bonds the cover body 38 to the upper substrate 32, and as a sealing layer that shields the color filters 35 and the on-chip lenses 36 from the outside.
- The upper substrate 32 is formed with a pixel region having a plurality of two-dimensionally arranged image sensors (photoelectric conversion elements) and a control circuit for controlling the plurality of image sensors.
- A logic circuit, such as a circuit for processing pixel signals from the plurality of image sensors, is formed on the lower substrate 31.
- Alternatively, only the pixel region may be formed on the upper substrate 32, and the control circuit and logic circuit may be formed on the lower substrate 31.
- FIG. 6 is a cross-sectional view showing an example of the structure of the lower substrate 31 and the upper substrate 32, showing an enlarged portion of the imaging element 11.
- a multilayer wiring layer 82 is formed on the upper side (upper substrate 32 side) of a semiconductor substrate 81 (hereinafter also referred to as "silicon substrate 81") made of silicon (Si), for example.
- the multilayer wiring layer 82 constitutes, for example, the control circuit and the logic circuit described above.
- the multilayer wiring layer 82 is composed of a plurality of wiring layers 83, including an uppermost wiring layer 83a closest to the upper substrate 32, an intermediate wiring layer 83b, and a lowermost wiring layer 83c closest to the silicon substrate 81, and of interlayer insulating films 84 formed between the wiring layers 83.
- the plurality of wiring layers 83 are formed using, for example, copper (Cu), aluminum (Al), or tungsten (W), and the interlayer insulating film 84 is formed using, for example, a silicon oxide film or a silicon nitride film.
- Each of the plurality of wiring layers 83 and interlayer insulating films 84 may be formed of the same material in all layers, or two or more materials may be used depending on the layer.
- a silicon through hole 85 penetrating the silicon substrate 81 is formed at a predetermined position of the silicon substrate 81.
- a connection conductor 87 is embedded in the inner wall of the silicon through hole 85 via an insulating film 86 to form a silicon through electrode (TSV: Through Silicon Via) 88 .
- the insulating film 86 can be formed of, for example, a SiO 2 film, a SiN film, or the like.
- the insulating film 86 and the connection conductor 87 are formed along the inner wall surface, and the inside of the silicon through hole 85 is hollow.
- alternatively, the entire interior of the silicon through hole 85 may be filled with the connection conductor 87.
- the inside of the through-hole may be filled with a conductor or may be partially hollow. This also applies to a chip through electrode (TCV: Through Chip Via) 105 and the like, which will be described later.
- connection conductor 87 of the silicon through electrode 88 is connected to the rewiring 90 formed on the lower surface side of the silicon substrate 81, and the rewiring 90 is connected to the solder balls 34.
- the connection conductor 87 and the rewiring 90 can be made of, for example, copper (Cu), tungsten (W), titanium (Ti), tantalum (Ta), titanium-tungsten alloy (TiW), or polysilicon.
- solder mask (solder resist) 91 is formed on the lower surface side of the silicon substrate 81 so as to cover the rewiring 90 and the insulating film 86 except for the regions where the solder balls 34 are formed.
- a multilayer wiring layer 102 is formed on the lower side (lower substrate 31 side) of a semiconductor substrate 101 made of silicon (Si) (hereinafter also referred to as "silicon substrate 101").
- the multilayer wiring layer 102 constitutes, for example, a pixel circuit in the pixel region.
- the multilayer wiring layer 102 is composed of a plurality of wiring layers 103, including a topmost wiring layer 103a closest to the silicon substrate 101, an intermediate wiring layer 103b, and a bottommost wiring layer 103c closest to the lower substrate 31, and of interlayer insulating films 104 formed between the wiring layers 103.
- The same materials as those of the wiring layers 83 and the interlayer insulating film 84 described above can be used for the plurality of wiring layers 103 and the interlayer insulating film 104.
- the plurality of wiring layers 103 and interlayer insulating films 104 may be formed by selectively using one or more materials, as in the case of the wiring layers 83 and interlayer insulating films 84 described above.
- the multilayer wiring layer 102 of the upper substrate 32 is composed of three wiring layers 103, and the multilayer wiring layer 82 of the lower substrate 31 is composed of four wiring layers 83.
- the total number of wiring layers is not limited to this, and any number of layers can be formed.
- An image sensor 40 composed of a photodiode formed by a PN junction is formed for each pixel in the silicon substrate 101 .
- a plurality of pixel transistors, such as a first transfer transistor 52 and a second transfer transistor 54, a memory section (MEM) 53, and the like are also formed in the multilayer wiring layer 102 and the silicon substrate 101.
- A silicon through electrode 109 connected to the wiring layer 103a of the upper substrate 32 and a chip through electrode 105 connected to the wiring layer 83a of the lower substrate 31 are formed at predetermined positions of the silicon substrate 101 where the color filter 35 and the on-chip lens 36 are not formed.
- the chip through electrode 105 and silicon through electrode 109 are connected by a connection wiring 106 formed on the upper surface of the silicon substrate 101 .
- An insulating film 107 is formed between each of the silicon through electrode 109 and the chip through electrode 105 and the silicon substrate 101 .
- a color filter 35 and an on-chip lens 36 are formed on the upper surface of the silicon substrate 101 with an insulating film (flattening film) 108 interposed therebetween.
- the pixel substrate 33 of the solid-state imaging device 1 shown in FIG. 1 has a laminated structure in which the multilayer wiring layer 82 side of the lower substrate 31 and the multilayer wiring layer 102 side of the upper substrate 32 are bonded together.
- the bonding surface between the multilayer wiring layer 82 of the lower substrate 31 and the multilayer wiring layer 102 of the upper substrate 32 is indicated by a dashed line.
- the wiring layer 103 of the upper substrate 32 and the wiring layer 83 of the lower substrate 31 are connected by two through electrodes, ie, the silicon through electrode 109 and the chip through electrode 105 .
- the wiring layer 83 of the lower substrate 31 and the solder balls (back electrodes) 34 are connected by the silicon through electrode 88 and the rewiring 90. Thereby, the plane area of the solid-state imaging device 1 can be minimized.
- As a result, a small semiconductor device (semiconductor package) can be realized.
- the imaging device 11 includes the pixel substrate 33 having the image sensor 40 on which the imaging light is incident, and the translucent cover body 38 facing the image sensor 40 .
- a seal resin 37 positioned between the pixel substrate 33 and the cover body 38 functions as a fixing portion to fix the cover body 38 to the pixel substrate 33 .
- the pixel substrate 33 and the cover body 38 are integrated.
- the imaging device 11 of this embodiment includes a diffraction lens 22 attached to the cover body 38 .
- the cover body 38 shown in FIGS. 4 to 6 has two flat surfaces (that is, a front surface and a back surface) that are separated in the optical axis direction and extend in a direction perpendicular to the optical axis Ax.
- the diffractive lens 22 is attached to one of these flat surfaces of the cover body 38 (that is, the back surface facing the pixel substrate 33).
- the diffractive lens 22 has a plurality of protruding lens portions 23 protruding from the cover body 38 toward the pixel substrate 33 (especially the image sensor 40), and spaces (that is, air gaps 24) are provided between the protruding lens portions 23.
- the air gap 24 formed between the adjacent protruding lens portions 23 is not filled with a member such as the sealing resin 37, and maintains a space state.
- the diffractive lens 22 retains its original fine unevenness shape, and exhibits and maintains excellent optical characteristics.
- the diffractive lens 22 of this embodiment can inherently have high lens performance in terms of refraction, and can be configured as a lens exhibiting a high refractive index.
- the effective refractive index of the diffractive lens 22 can be changed by appropriately adjusting the shape, size, etc. of each projecting lens portion 23. Therefore, the diffractive lens 22 of the present embodiment can accommodate a wide refractive index range, and can be provided so as to selectively exhibit a desired refractive index within that range.
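How adjusting the protrusion geometry sweeps the effective refractive index can be illustrated with a zeroth-order effective-medium estimate for subwavelength structures; this simplified model and the material index (TiO2, n ≈ 2.4) are assumptions for illustration, not design values from this disclosure:

```python
import math

def effective_index(n_hi: float, n_lo: float, fill: float) -> float:
    """Zeroth-order effective-medium estimate: the effective permittivity of a
    subwavelength structure is the fill-factor-weighted average of the
    permittivities of the lens material and the gap (air). Illustrative only;
    real designs rely on rigorous diffraction theory."""
    eps = fill * n_hi ** 2 + (1.0 - fill) * n_lo ** 2
    return math.sqrt(eps)

# Varying the protrusion fill factor sweeps the effective index between that
# of air (1.0) and that of the lens material.
for f in (0.0, 0.25, 0.5, 0.75, 1.0):
    print(f"fill {f:.2f} -> n_eff {effective_index(2.4, 1.0, f):.2f}")
```

This is why the shape and size of the projecting lens portions give the designer a wide, continuously tunable refractive-index range.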
- As described above, according to the present embodiment, a diffraction lens 22 exhibiting desired optical refraction characteristics can be designed with a high degree of freedom.
- the diffractive lens 22 of the present embodiment (particularly the distal end portions of the plurality of projecting lens portions 23) contacts the sealing resin 37 and supports the pixel substrate 33 (that is, the lower substrate 31 and the upper substrate 32) through the sealing resin 37.
- Since the pixel substrate 33 (for example, the upper substrate 32 including the image sensor 40) is supported in this way, the distance between the diffraction lens 22 and the image sensor 40 can be stably and uniformly maintained at a desired distance.
- the diffractive lens 22 has high refractive performance.
- the pixel substrate 33 tends to bend; for example, the pixel substrate 33 may unintentionally warp toward the diffraction lens 22.
- when the pixel substrate 33 bends, the position at which the photographing light is condensed by the diffraction lens 22 deviates from the light incident surface (that is, the imaging surface) of the image sensor 40, resulting in photographing in a defocused state and deterioration of the quality of the captured image.
- In this embodiment, the pixel substrate 33 is supported by the diffraction lens 22, so that bending of the pixel substrate 33 (in particular, warping toward the diffraction lens 22) can be suppressed. Thereby, the distance between the diffractive lens 22 and the pixel substrate 33 (especially the image sensor 40) can be kept constant over the entire imaging surface, and the light-gathering performance of the imaging element 11 can be improved.
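Why even micrometre-scale warping matters can be estimated with the classical diffraction-limited depth-of-focus rule DOF ≈ λ/NA²; the wavelength and numerical aperture below are assumed values for illustration (this disclosure specifies neither):

```python
def depth_of_focus_um(wavelength_nm: float, numerical_aperture: float) -> float:
    """Classical diffraction-limited depth-of-focus estimate, DOF ~ lambda/NA^2,
    converted to micrometres. Illustrative textbook rule; parameter values
    are assumptions, not taken from this disclosure."""
    return wavelength_nm * 1e-3 / numerical_aperture ** 2

# For green light (550 nm) and an assumed NA of 0.25, the focus tolerance is
# only a few micrometres, so micrometre-scale warping of the pixel substrate
# already pushes the imaging surface out of focus.
print(f"DOF ~ {depth_of_focus_um(550, 0.25):.1f} um")
```

This rough budget motivates keeping the lens-to-sensor distance constant over the entire imaging surface.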
- Moreover, since the pixel substrate 33 is supported by the diffraction lens 22 via the sealing resin 37, the distance between the diffraction lens 22 and the pixel substrate 33 (that is, the optical path length) is kept short, and unintended bending of the pixel substrate 33 can be suppressed.
- In contrast, when the pixel substrate 33 is supported by a supporting member provided on the opposite side of the cover body 38 with the pixel substrate 33 interposed therebetween (see FIG. 23 described later), the bending of the pixel substrate 33 can be prevented, but the optical path length tends to increase.
- the pixel substrate 33 and an external substrate are connected to each other via wiring such as wire bonding (WB).
- In that case, the chromatic aberration of the photographing light condensed on the image sensor 40 via the diffraction lens 22 increases, and the size (thickness) of the entire imaging element 11 in the optical axis direction increases.
- In the imaging element 11 of this embodiment shown in FIGS. 4 to 6, a support for supporting the pixel substrate 33 from the outside is unnecessary. Therefore, the solder balls 34 can be provided as wiring for connecting the imaging element 11 and the external substrate, and wire bonding wiring for connecting the pixel substrate 33 and the external substrate (not shown) is unnecessary. Consequently, the diffraction lens 22 can be installed close to the pixel substrate 33 (especially the image sensor 40), and the optical path length can be shortened.
- As a result, the imaging element 11 of this embodiment can acquire a high-quality captured image with suppressed chromatic aberration, and the size of the imaging element 11 in the optical axis direction can be reduced.
- FIG. 7 is a cross-sectional view showing an example of the imaging device 10 including the geometrical optics lens 21 and the imaging device 11.
- FIG. 8 is a cross-sectional view showing an enlarged part of the imaging device 11 shown in FIG. 7.
- the imaging device 10 shown in FIG. 7 includes the imaging element 11 shown in FIG. 4 described above, and the geometrical optics lens 21 (imaging lens) located on the opposite side of the pixel substrate 33 with the cover body 38 interposed therebetween.
- the imaging device 11 includes the diffractive lens 22, the diffractive lens 22 has a plurality of projecting lens portions 23 projecting from the cover body 38 toward the image sensor 40, and air gaps 24 are formed between the projecting lens portions 23.
- the imaging light L of the subject image passes through the geometric optics lens 21, the cover body 38, the diffraction lens 22, the on-chip lens 36, and the color filter 35, and enters the image sensor 40 (see FIG. 6) of the pixel substrate 33.
- the photographing light L is mainly refracted by the geometrical optics lens 21 , the diffraction lens 22 and the on-chip lens 36 to adjust the direction of travel, and is guided toward the image sensor 40 . Therefore, the optical characteristics and configuration of the diffraction lens 22 can be determined according to the optical characteristics (for example, refractive characteristics) of the geometrical optics lens 21 actually used. Alternatively, the optical characteristics and configuration of the geometrical optics lens 21 can be determined according to the optical characteristics (for example, refractive characteristics) of the actually used diffraction lens 22 .
- the optical properties (especially refractive properties) and configuration of the diffraction lens 22 are determined so that the chromatic aberration of the geometrical optics lens 21 is reduced by the diffraction lens 22 .
- That is, the optical characteristics (especially the refractive properties) and configuration of the diffraction lens 22 are determined so that the diffraction lens 22 emits the photographing light L at a principal ray incident angle (CRA) smaller than that of the photographing light L traveling from the geometrical optics lens 21 toward the diffraction lens 22.
- In this way, the diffractive lens 22 can function not only as a lens that corrects the chromatic aberration of the geometrical optics lens 21, but also as a lens that improves the principal ray incident angle, thereby improving the shading characteristics of the entire optical lens system.
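The CRA-reducing behavior can be sketched with the scalar grating equation; the grating pitch, wavelength, and diffraction order below are assumed values (the actual lens uses a locally varying structure, and this simplified model is not taken from this disclosure):

```python
import math

def exit_angle_deg(incident_deg: float, wavelength_um: float,
                   pitch_um: float, order: int = 1) -> float:
    """Grating equation in air: sin(theta_out) = sin(theta_in) - m*lambda/pitch.
    A simplified scalar model used only to show how a diffractive structure
    can bend a ray toward the optical axis."""
    s = math.sin(math.radians(incident_deg)) - order * wavelength_um / pitch_um
    return math.degrees(math.asin(s))

# With an assumed local pitch of 2 um, a 30 deg principal ray leaves the
# diffractive structure at a smaller angle, i.e. closer to the optical axis.
theta_in = 30.0
theta_out = exit_angle_deg(theta_in, 0.55, 2.0)
print(f"CRA in {theta_in} deg -> CRA out {theta_out:.1f} deg")
```

The exit angle being smaller than the incident angle is exactly the CRA improvement described above.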
- FIG. 9 is a cross-sectional view showing an enlarged part of the imaging element 11, and is a diagram for exemplifying a case where color mixture occurs between adjacent image sensors 40.
- FIG. 10 is a cross-sectional view showing an enlarged part of the imaging element 11, illustrating a case where the diffractive lens 22 refracts the photographing light L toward the appropriate image sensor 40 to prevent color mixture.
- the optical path of the photographing light L is changed by the geometrical optics lens 21 and then also changed by the diffraction lens 22 .
- the traveling direction of the photographing light L that has passed through the diffraction lens 22 does not necessarily completely match the direction perpendicular to the imaging surface of the image sensor 40 (that is, the optical axis direction).
- the direction of travel of the photographing light L is changed by the diffraction lens 22 so as to approach the optical axis direction, but the photographing light L includes a light component traveling in a direction oblique to the optical axis direction.
- the photographing light L traveling in a direction inclined with respect to the optical axis direction is more likely to enter an image sensor 40 adjacent to the intended image sensor 40, resulting in color mixture.
- The inventor of the present invention investigated the occurrence of color mixture while changing the configuration of the imaging element 11. As a result, it was found that positioning the diffraction lens 22 apart from the image sensor 40 (especially the imaging surface) by 60 μm or less in the optical axis direction is effective in preventing color mixture. The diffraction lens 22 is preferably positioned at a distance of 50 μm or less from the image sensor 40 in the optical axis direction, more preferably 40 μm or less, and still more preferably 30 μm or less.
- Similarly, the inventor of the present invention has obtained the knowledge that setting the distance d in the optical axis direction between the diffraction lens 22 and the on-chip lens 36 to 60 μm or less effectively prevents the occurrence of color mixture. The optical axis direction distance d between the diffraction lens 22 and the on-chip lens 36 is preferably 50 μm or less, more preferably 40 μm or less, and still more preferably 30 μm or less.
- FIGS. 11A to 11E are diagrams for explaining an example of a method of manufacturing the diffractive lens 22.
- FIG. 12 is a perspective view showing a plurality of diffractive lenses 22 formed on the cover body wafer 45.
- the diffractive lens 22 of this embodiment is formed on the flat surface (especially on the back surface) of the cover body 38 .
- a cover body wafer 45 including a plurality of cover bodies 38 is prepared (see FIG. 11A).
- the cover body wafer 45 has two flat surfaces located on opposite sides of each other. These flat surfaces of the cover body wafer 45 correspond to the front and back surfaces of the individual cover bodies 38, respectively.
- the constituent material of the cover body wafer 45 (that is, the cover body 38) is not limited, and the cover body wafer 45 is made of glass, for example.
- a lens substrate film 41 made of the constituent material of the diffractive lens 22 is attached to one flat surface of the cover wafer 45 (see FIG. 11B).
- the constituent material of the lens base film 41 is not limited.
- the lens substrate film 41 is provided as a transparent inorganic film made of an inorganic material exhibiting a high refractive index (such as SiN, ZrO2, ZnSe, ZnS, TiO2, or CeO2).
- the method of attaching the lens substrate film 41 to the cover body wafer 45 is not limited, and the lens substrate film 41 is attached to the cover body wafer 45 using any means (for example, coating by spin coating or spraying).
- Although the thickness of the lens substrate film 41 on the cover body wafer 45 is not limited, it is determined according to the intended thickness of the diffractive lens 22 (that is, the plurality of projecting lens portions 23). Typically, a lens substrate film 41 having a thickness of several tens to several hundred nm (nanometers) is formed on the cover body wafer 45.
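One standard design rule that illustrates why such film thicknesses arise is the relief height giving a full 2π phase step for a diffractive surface in air, h = λ/(n − 1); the rule and the material indices below are illustrative assumptions, not values from this disclosure:

```python
def two_pi_phase_height_nm(wavelength_nm: float, n_lens: float) -> float:
    """Relief height giving a 2*pi phase step for a diffractive surface in
    air: h = lambda / (n - 1). A standard diffractive-optics design rule,
    used here only to illustrate the order of magnitude; material indices
    are assumed, not taken from this disclosure."""
    return wavelength_nm / (n_lens - 1.0)

# Higher-index films need thinner relief for the same phase step.
for name, n in (("TiO2", 2.4), ("SiN", 2.0)):
    print(f"{name}: h ~ {two_pi_phase_height_nm(550, n):.0f} nm")
```

For high-index inorganic films the required relief indeed lands in the hundreds-of-nanometres range, consistent with the film thicknesses mentioned above.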
- a resist 42 is attached on the lens substrate film 41, and patterning is performed (see FIG. 11C). That is, the lens substrate film 41 is covered with the resist 42 having a pattern configuration corresponding to the shape and arrangement of the diffractive lens 22 (that is, the plurality of projecting lens portions 23).
- the constituent material of the resist 42 and the method of attaching the resist 42 to the lens substrate film 41 are not limited.
- the lens substrate film 41 is etched, and portions of the lens substrate film 41 not covered with the resist 42 are removed from the cover wafer 45 (FIG. 11D).
- a specific method of etching performed here is not limited, and dry etching is typically performed.
- the resist 42 is removed from the lens substrate film 41 (FIG. 11E).
- a method for removing the resist 42 is not limited. Typically, the resist 42 is removed using chemicals selected according to the materials of the lens base film 41 and the resist 42 .
- Through the above steps, a plurality of diffractive lenses 22 (that is, a plurality of projecting lens portions 23) are formed on the cover body wafer 45 (see FIG. 12).
- the diffractive lens 22 can be formed on a flat surface instead of a curved surface.
- In order to efficiently refract the photographing light L using diffraction, the diffractive lens 22 may need a nano-level fine structure whose slits are sufficiently small with respect to the wavelength of the photographing light L.
- In addition, precise control of the height of each protruding lens portion 23 is required. It is not easy to form the diffractive lens 22 (the plurality of protruding lens portions 23) having such a fine structure on a curved surface with high accuracy, and doing so increases the manufacturing cost.
- When the diffractive lens 22 is formed on the cover body 38 (cover body wafer 45) as in the present embodiment, the plurality of projecting lens portions 23 can be formed on the highly flat surface of the cover body 38 (cover body wafer 45). Therefore, the diffractive lens 22 having a fine structure of several tens of nanometers can be formed in advance on the flat surface of the cover body 38 (cover body wafer 45) with high accuracy by lithography and etching techniques.
- In addition, a plurality of diffractive lenses 22 can be formed on an integrated cover body wafer 45 including a plurality of cover bodies 38. That is, a plurality of diffractive lenses 22 can be formed simultaneously on the portions of the cover body wafer 45 corresponding to the respective cover bodies 38. As a result, the cover bodies 38 to which the diffractive lenses 22 are attached can be efficiently mass-produced, reducing the manufacturing cost.
- The cover body wafer 45 to which the plurality of diffraction lenses 22 are attached may be used for manufacturing the imaging device 11 as-is in the wafer state, or may be cut and separated into individual cover bodies 38.
- a typical example of the manufacturing method of the imaging device 11 includes a step of fixing the cover body 38 manufactured as described above to the pixel substrate 33 .
- The cover body 38 is fixed to the pixel substrate 33 so that the diffractive lens 22 (that is, the plurality of projecting lens portions 23) attached to the cover body 38 is positioned between the cover body 38 and the pixel substrate 33.
- The color filter 35 and the on-chip lens 36 are often already attached to the pixel substrate 33 immediately before the cover body 38 is attached, and they are often made of organic materials. Therefore, the heat resistance of the pixel substrate 33 immediately before the cover body 38 is attached is severely restricted.
- Since the diffractive lens 22 is formed on the cover body 38 separately from the pixel substrate 33 as described above, the restrictions on heat resistance are loose, and the diffractive lens 22 can be formed on the highly flat surface of the cover body 38. Therefore, the diffractive lens 22 having a nano-level structure can be formed on the cover body 38 easily and accurately using lithography and etching techniques.
- the cover body 38 is fixed to the pixel substrate 33 in a state in which the air gap 24 is secured between the projecting lens portions 23 as described above.
- the manufacturing method of the imaging element 11 may include, for example, the following steps.
- FIG. 13A to 13D are perspective views showing an example of a method for manufacturing the imaging element 11.
- FIGS. 14 to 16 are cross-sectional views of the imaging device 11 showing an example of a method for manufacturing the imaging device 11.
- FIGS. 14 to 16 focus on one pixel substrate 33 for ease of understanding, but in the manufacturing method of this example, the manufacturing process shown in FIGS. 14 to 16 is performed on a substrate wafer 46 including a plurality of pixel substrates 33.
- In this manufacturing method, the seal resin 37 positioned between the pixel substrate 33 (particularly the image sensor 40) and the diffraction lens 22 is composed of a photocurable resin film, and the plurality of projecting lens portions 23 contact the cured photocurable resin film.
- an integrated substrate wafer 46 including a plurality of pixel substrates 33 is prepared (see FIG. 13A).
- Color filters 35 and on-chip lenses 36 are already attached to the substrate wafer 46 at locations corresponding to the respective pixel substrates 33 .
- an uncured photocurable resin that constitutes the sealing resin 37 is applied to one surface of the substrate wafer 46 (that is, the plurality of pixel substrates 33) (see FIG. 13B).
- the photocurable resin is applied to the substrate wafer 46 so as to cover the surface of each pixel substrate 33 on which the color filter 35 and the on-chip lens 36 are provided.
- the photocurable resin (sealing resin 37) on the substrate wafer 46 is irradiated with light (see FIG. 13C).
- the portion of the photocurable resin on the substrate wafer 46 that covers the image sensor 40 of each pixel substrate 33 is cured by light irradiation (see the "resin cured portion 37a" shown in FIG. 13C).
- the portion of the photocurable resin on the substrate wafer 46 outside the image sensor 40 of each pixel substrate 33 is not irradiated with light and remains in an uncured state.
- the cover wafer 45 (see FIG. 12) to which the diffractive lens 22 is attached is adhered to the substrate wafer 46 via the photocurable resin (seal resin 37) (see FIG. 13D).
- the uncured portion (that is, the unexposed portion) of the photocurable resin (seal resin 37) located outside the image sensor 40 functions as an adhesive to bond the cover wafer 45 and the substrate wafer 46 together.
- The cover body wafer 45 is placed so that each diffraction lens 22 (that is, the plurality of protruding lens portions 23) faces the portion of the photocurable resin on the corresponding pixel substrate 33 of the substrate wafer 46 that has been cured by light irradiation. Each cover body 38 of the cover body wafer 45 is thereby fixed to the corresponding pixel substrate 33, and each pixel substrate 33 is adhered to the corresponding cover body 38 (see FIG. 15).
- Then, the uncured portion of the photocurable resin (sealing resin 37) in contact with the cover body wafer 45 and the substrate wafer 46 is irradiated with light, so that the photocurable resin (sealing resin 37) is fixed to the cover body wafer 45 and the substrate wafer 46.
- Components such as solder balls 34, TSVs and backside wiring are then formed in a monolithic wafer structure including a cover wafer 45 and a substrate wafer 46 (see FIG. 16).
- the wafer structure is cut and separated into individual imaging elements 11 .
- According to this manufacturing method, the plurality of protruding lens portions 23 are brought into contact with the sealing resin 37 in a state in which the portion of the sealing resin 37 facing the diffractive lens 22 (that is, the plurality of projecting lens portions 23) has already been cured.
- each air gap 24 can maintain a space state without being filled with the sealing resin 37, and the diffractive lens 22 can maintain desired optical characteristics.
- FIG. 17A and 17B are cross-sectional views showing another example of the method for manufacturing the imaging device 11.
- FIG. 18 is a cross-sectional view showing an enlarged part of the imaging device 11 manufactured by the manufacturing method shown in FIGS. 17A and 17B.
- the tips of the plurality of projecting lens portions 23 contact the inorganic film 50 located between the pixel substrate 33 (especially the image sensor 40) and the diffraction lens 22.
- the substrate wafer 46 to which the color filters 35 and the on-chip lenses 36 are already attached is prepared.
- a material film forming the seal resin 37 is applied to one surface of the substrate wafer 46 (that is, the plurality of pixel substrates 33).
- a specific material of the sealing resin 37 is not limited, and may be a photo-curing resin or a thermosetting resin.
- the sealing resin 37 on the substrate wafer 46 is semi-cured.
- the method for semi-curing the seal resin 37 is not limited, and the seal resin 37 is semi-cured by appropriate means such as light irradiation or heating.
- a translucent inorganic film 50 is applied onto the semi-cured seal resin 37 on the substrate wafer 46 (that is, the plurality of pixel substrates 33). Thereby, the image sensor 40 of each pixel substrate 33 of the substrate wafer 46 is covered with the cured inorganic film 50 .
- the portion of the sealing resin 37 on the substrate wafer 46 outside the image sensor 40 of each pixel substrate 33 is not covered with the inorganic film 50 and remains exposed in a semi-cured state.
- a specific material of the inorganic film 50 is not limited, and the inorganic film 50 may be composed of silicon dioxide (SiO 2 ), for example.
- the method of applying the inorganic film 50 to the sealing resin 37 is not limited, and the inorganic film 50 can be adhered to the sealing resin 37 by sputtering, for example.
- For example, a mask 48 is interposed between the sealing resin 37 on the pixel substrate 33 and the film forming device 51, and the film forming device 51 performs the film forming process of the inorganic film 50 while only the range of the sealing resin 37 where the inorganic film 50 is to be formed is exposed to the film forming device 51.
- the cover wafer 45 (see FIG. 12) to which the diffractive lens 22 is attached is adhered to the substrate wafer 46 via the sealing resin 37 .
- the cover body 38 is fixed to the pixel substrate 33 while the plurality of projecting lens portions 23 are facing the inorganic film 50 .
- In this way, each pixel substrate 33 is attached to the corresponding cover body 38 (see FIG. 18).
- the uncured portion of the seal resin 37 that contacts the cover wafer 45 and substrate wafer 46 is cured, and the seal resin 37 is fixed to the cover wafer 45 and substrate wafer 46 .
- Components such as solder balls 34 , TSVs and backside wiring are then formed in a monolithic wafer structure including a cover body wafer 45 and a substrate wafer 46 .
- the wafer structure is cut and separated into individual imaging elements 11 .
- the plurality of protruding lens portions 23 are brought into contact with the cured inorganic film 50 .
- Therefore, the cover body 38 with the diffraction lens 22 mounted thereon can be fixed by pressing it against the pixel substrate 33 via the seal resin 37.
- each air gap 24 can maintain a space state without being filled with the sealing resin 37, and the diffractive lens 22 can maintain desired optical characteristics.
- In the manufacturing methods described above, the cover body 38 is pressed against the pixel substrate 33 via the seal resin 37 while the plurality of projecting lens portions 23 of the diffraction lens 22 face the cured member (that is, the cured seal resin 37 (photocurable resin) or the inorganic film 50). This reliably prevents the sealing resin 37 from entering the air gaps 24 between the protruding lens portions 23, so that the diffraction lens 22 maintains its fine unevenness shape and exhibits its original optical characteristics.
- FIG. 19 is a cross-sectional view showing an example of the imaging device 10 according to the second embodiment.
- the imaging device 10 shown in FIG. 19 includes a diffractive lens unit 56 including a plurality of lens-constituting layers 55 .
- Each lens configuration layer 55 includes a cover body 38 and a diffractive lens 22 (a plurality of projecting lens portions 23) attached to the back surface of the cover body 38.
- Although the diffraction lens unit 56 shown in FIG. 19 includes three lens configuration layers 55, the number of lens configuration layers 55 included in the diffraction lens unit 56 is not limited.
- Adjacent lens configuration layers 55 are adhered to each other via adhesive layers 57. That is, of two adjacent lens configuration layers 55, the diffractive lens 22 of one layer (the upper lens configuration layer 55 in FIG. 19) and the cover body 38 (especially its front surface) of the other layer (the lower lens configuration layer 55 in FIG. 19) are adhered to the same adhesive layer 57.
- the portion that does not face the image sensor 40 is attached to the pixel substrate 33 with the seal resin 37 interposed therebetween.
- the diffractive lens 22 is attached over the entire back surface of each cover body 38.
- the diffractive lens 22 may be attached only partially.
- in that case, the adhesive layer 57 and the sealing resin 37 can be provided so that they adhere to the peripheral region of the back surface of the cover body 38 where the diffraction lens 22 is not attached, and do not adhere to the central region of the back surface of the cover body 38 to which the diffraction lens 22 is attached.
- Other configurations of the imaging device 10 shown in FIG. 19 are the same as those of the imaging device 10 according to the first embodiment described above.
- according to the imaging device 10 and the imaging element 11 of the present embodiment, the optical path of the photographing light L can be adjusted by using the plurality of diffractive lenses 22 having a laminated structure, and problems in optical characteristics such as chromatic aberration can be improved.
- this makes it possible to realize an imaging device 10 and an imaging element 11 with higher optical characteristics, or to use a simpler and/or cheaper geometrical optics lens 21.
- the size of the entire optical lens system in the optical axis direction can be reduced, and the imaging apparatus 10 as a whole can be slimmed down.
- the optical lens system as a whole can exhibit various optical characteristics.
- FIG. 20 is a cross-sectional view showing an example of the imaging device 10 according to the third embodiment.
- the imaging device 10 shown in FIG. 20 includes a diffractive lens unit 56 including a plurality of lens configuration layers 55, but does not include a geometrical optics lens. That is, the optical lens system of this embodiment includes only the plurality of diffractive lenses 22 and does not include geometrical optics lenses.
- a diffractive lens unit 56 shown in FIG. 20 includes six lens configuration layers 55 .
- the adhesive structure between adjacent lens-constituting layers 55, and the adhesive structure between the lens-constituting layer 55 located closest to the pixel substrate 33 and the pixel substrate 33, are similar to those of the above-described second embodiment (see FIG. 19).
- Other configurations of the imaging device 10 shown in FIG. 20 are the same as those of the imaging device 10 according to the above-described second embodiment.
- in the imaging device 10 and the imaging element 11 of this embodiment, no geometrical optics lens is required. Therefore, it is possible to simplify the device configuration and reduce the size of the entire imaging device 10 in the optical axis direction.
- the optical lens system as a whole can exhibit various optical characteristics.
- FIG. 21 is a cross-sectional view showing an example of the imaging device 10 according to the fourth embodiment.
- in this embodiment, the air gaps 24 between the projecting lens portions 23 that constitute the diffraction lens 22 are filled with the seal resin 37.
- as in the above-described first embodiment, the diffractive lens 22 of this embodiment exists on the back surface of the cover body 38 over the range facing the image sensor 40 in the optical axis direction, and does not exist in part or all of the peripheral range not facing the image sensor 40.
- other configurations of the imaging device 10 of this example are the same as those of the imaging device 10 according to the above-described first embodiment.
- the diffraction lens 22 (the plurality of projecting lens portions 23) is attached to the flat surface of the cover body 38. Therefore, the diffractive lens 22 having a desired shape can be provided at a desired position on the cover body 38 with high precision.
- FIG. 22 is a cross-sectional view showing an example of the imaging device 10 according to the fifth embodiment.
- a space is provided between the image sensor 40 and the diffraction lens 22 in the imaging device 11 shown in FIG. 22. More specifically, the sealing resin 37 does not exist between the on-chip lens 36 and the diffractive lens 22 (the plurality of protruding lens portions 23) in the optical axis direction; a space exists there instead.
- on the other hand, as in the first embodiment described above, the sealing resin 37 that adheres and fixes the cover body 38 to the pixel substrate 33 is provided.
- the seal resin 37 does not exist in the range corresponding to the central region of the pixel substrate 33 (in particular, the region where the image sensor 40 exists).
- in this way, the imaging device 11 of this example has a cavity structure with a space surrounded by the cover body 38, the sealing resin 37, and the pixel substrate 33, and the color filter 35, the on-chip lens 36, and the diffraction lens 22 are positioned in that space.
- Other configurations of the imaging device 10 shown in FIG. 22 are the same as those of the imaging device 10 according to the first embodiment described above.
- according to the imaging device 10 and the imaging element 11 of the present embodiment, since there is a large refractive index difference between the diffraction lens 22 and the space adjacent to it, the diffraction performance (that is, the refractive performance) of the diffraction lens 22 can be improved. This makes it possible to relax the restrictions on the design of the geometrical optics lens 21 and the diffractive lens 22.
- FIG. 23 is a cross-sectional view showing an example of the imaging device 10 according to the sixth embodiment.
- the imaging device 11 shown in FIG. 23 includes a support 60 that supports the pixel substrate 33 from the outside, and an adhesive layer 61 positioned between the support 60 and the cover 38 .
- the support 60 has a hollow structure with a space inside, and includes a support bottom extending in a direction perpendicular to the optical axis Ax and a support peripheral edge extending from the support bottom in the optical axis direction.
- the pixel substrate 33 , the color filter 35 and the on-chip lens 36 are fixed to the support bottom, and the whole is arranged in the inner space of the support 60 .
- the diffractive lens 22 attached to the back surface of the cover body 38 is entirely arranged in the space surrounded by the support body 60, the adhesive layer 61 and the cover body 38.
- the adhesive layer 61 is positioned between the end surface of the support peripheral edge portion of the support 60 and the back surface of the peripheral edge portion of the cover body 38 (in particular, the portion located outside the diffraction lens 22), and functions as a fixing portion that fixes the cover body 38 to the support body 60.
- the imaging element 11 of this embodiment does not require the seal resin 37 provided in the imaging elements 11 of the above-described first to fourth embodiments. Therefore, a space exists over the entire area between the on-chip lens 36 and the diffractive lens 22 (the plurality of projecting lens portions 23).
- the pixel substrate 33 is connected to an external substrate (not shown) via wire bond wiring 62 .
- the entire wire bond wiring 62 and the portion of the external substrate to which the wire bond wiring 62 is connected are located in the inner space of the support 60 .
- Other configurations of the imaging device 10 shown in FIG. 23 are the same as those of the imaging device 10 according to the first embodiment described above.
- according to the imaging device 10 and the imaging element 11 of the present embodiment, since there is a large refractive index difference between the diffraction lens 22 and the space adjacent to it, the diffraction performance (that is, the refractive performance) of the diffraction lens 22 can be improved.
- the pixel substrate 33 is supported from the outside by the support 60, it is possible to prevent the pixel substrate 33 from bending and warping.
- the imaging device 10 and the imaging element 11 to which the diffraction lens 22 described below can be applied are not limited. The diffraction lens 22 described below may be applied to the imaging device 10 and the imaging element 11 according to each of the above-described embodiments, or to other imaging devices 10 and imaging elements 11.
- FIG. 24 is a perspective view showing a structural example of the diffraction lens 22.
- FIG. 24 shows a state in which a plurality of projecting lens portions 23 are regularly arranged along mutually perpendicular vertical and horizontal directions on the upper surface of the rectangular parallelepiped cover body 38.
- the actual state of the cover body 38 and the diffractive lens 22 (the plurality of projecting lens portions 23) may differ from the state shown in FIG.
- the structure of the diffractive lens 22 is mainly determined by the size (height h) of each projecting lens portion 23 in the optical axis direction, the distance (pitch P) between adjacent projecting lens portions 23, and the size in the direction perpendicular to the optical axis Ax (vertical width D and horizontal width W).
- the diffraction lens 22 needs to have a slit sufficiently small with respect to the wavelength of the photographing light L in order to refract the photographing light L appropriately.
- for example, each projecting lens portion 23 has a height h of about 200 to 1000 nm, a vertical width D and a horizontal width W of about 100 to 800 nm, and a pitch P of about 300 to 800 nm.
- FIG. 25 is an enlarged plan view schematically showing an example of the diffraction lens 22.
- FIGS. 26A to 26C are diagrams for explaining the optical diffraction phase difference s, the refraction angle θ, and the focal length f of the diffractive lens 22 (plurality of projecting lens portions 23).
- the planar size (for example, the width W and the vertical width D) of the projecting lens portion 23 decreases from the center toward the outside while the pitch P of the projecting lens portion 23 is kept constant.
- the photographing light L diffracted by the diffraction lens 22 has a phase difference s (see FIG. 26A), and the photographing light L emitted from the projecting lens portion 23 is refracted (see FIG. 26B).
- the diffractive lens 22 needs to refract the photographing light L at a larger refraction angle θ at positions farther from the optical axis Ax.
- the phase difference s (see FIG. 26A) of the photographing light L caused by the protruding lens portions 23 changes little by little as the protruding lens portions 23 are arranged outward from the center, and at a certain position a phase difference s of 360° occurs.
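The 360° wrap of the phase difference s determines where each new period of projecting lens portions 23 must begin. Under the usual thin-lens path-length condition, the wrap radii can be computed numerically; the following Python sketch assumes an example wavelength and focal length, which are illustrative values, not taken from the original.

```python
import math

def zone_radii(wavelength_nm: float, focal_nm: float, num_zones: int):
    """Radii from the optical axis Ax at which the accumulated diffraction
    phase difference s reaches whole multiples of 360 degrees, derived from
    the path-length condition sqrt(r^2 + f^2) - f = m * wavelength."""
    return [math.sqrt((m * wavelength_nm) ** 2 + 2 * m * wavelength_nm * focal_nm)
            for m in range(1, num_zones + 1)]

# Green light (550 nm) and a hypothetical 1 mm (1e6 nm) focal length.
radii = zone_radii(550.0, 1_000_000.0, 3)
```

The computed zones get progressively narrower outward, which is why the projecting lens portions 23 must impart larger refraction angles θ, and hence vary faster, at positions farther from the optical axis Ax.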
- FIG. 27 is a plan view showing an arrangement example of a plurality of projecting lens portions 23 forming one diffractive lens 22.
- by periodically varying the plane size (e.g., horizontal width W and vertical width D) of the projecting lens portions 23 outward from the center (that is, the optical axis Ax), it is possible to make the diffraction lens 22 as a whole function as a convex lens.
- the diffraction lens 22 shown in FIG. 27 includes protruding lens portions 23 with a first period S1, a second period S2 and a third period S3.
- the first period S1 is a range including the optical axis Ax.
- the second period S2 is the next closest area to the optical axis Ax after the first period S1 and is positioned adjacent to the first period S1.
- the third period S3 is the next closest area to the optical axis Ax after the second period S2 and is positioned adjacent to the second period S2.
- a plurality of projecting lens portions 23 included in each of the first period S1, the second period S2, and the third period S3 exhibit a phase difference in the range of 0° to 360° with respect to optical diffraction.
- the lateral width W1 of the protruding lens portion 23 located closest to the optical axis Ax in the first period S1, the lateral width W2 of the protruding lens portion 23 closest to the optical axis Ax in the second period S2, and the lateral width W3 of the protruding lens portion 23 closest to the optical axis Ax in the third period S3 satisfy the relationship "W1>W2>W3". That is, the lateral widths W of the projecting lens portions 23 corresponding to each other in each period satisfy "W1>W2>W3". This relationship holds when attention is paid to a plurality of projecting lens portions 23 arranged in the direction of the lateral width W.
- similarly, the size J1 of each projecting lens portion 23 in the first period S1, the size J2 of each projecting lens portion 23 in the second period S2, and the size J3 of each projecting lens portion 23 in the third period S3, measured along the same direction, satisfy "J1>J2>J3". That is, the sizes of the protruding lens portions 23 corresponding to each other in each period satisfy the relationship "J1>J2>J3".
- the projecting lens portions 23 corresponding to each other in the first period S1 to the third period S3 also satisfy the relationship "D1>D2>D3". That is, the vertical width D1 of the protruding lens portion 23 closest to the optical axis Ax in the first period S1, the vertical width D2 of the protruding lens portion 23 closest to the optical axis Ax in the second period S2, and the vertical width D3 of the protruding lens portion 23 closest to the optical axis Ax in the third period S3 satisfy the relationship "D1>D2>D3".
- within each period, the plurality of projecting lens portions 23 are provided at the same pitch P.
- the pitch P1 of the protruding lens portions 23 included in the first period S1, the pitch P2 of the protruding lens portions 23 included in the second period S2, and the pitch P3 of the protruding lens portions 23 included in the third period S3 satisfy the relationship "P1>P2>P3".
- the diffractive lens 22 that satisfies "W1>W2>W3", "D1>D2>D3", "J1>J2>J3" and "P1>P2>P3" described above constitutes a convex lens as a whole.
- the diffractive lens 22 that satisfies "W1 ⁇ W2 ⁇ W3", “D1 ⁇ D2 ⁇ D3", “J1 ⁇ J2 ⁇ J3” and “P1 ⁇ P2 ⁇ P3" constitutes a concave lens as a whole.
- as described above, the planar size (that is, the size in the plane perpendicular to the optical axis Ax) of the plurality of projecting lens portions 23 of a diffractive lens 22 constituting a convex lens changes periodically with respect to the distance from the optical axis Ax. The period of this change is based on the 360° phase difference of the light diffraction of the plurality of projecting lens portions 23, and within each period the plane size of the projecting lens portions 23 becomes smaller as the distance from the optical axis Ax increases.
- similarly, the planar size of the plurality of projecting lens portions 23 of a diffraction lens 22 constituting a concave lens changes periodically with respect to the distance from the optical axis Ax, with a period based on the 360° phase difference of the light diffraction of the plurality of projecting lens portions 23; in this case, the plane size of the projecting lens portions 23 increases with distance from the optical axis Ax within each period.
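The monotone size relations described above ("W1>W2>W3" for a convex lens, reversed for a concave lens) can be sketched as a simple generator of per-period plane sizes. The starting width and step are illustrative values, not dimensions from the original.

```python
def period_widths(w_start_nm: float, step_nm: float, periods: int, convex: bool):
    """Plane size of the projecting lens portion 23 closest to the optical
    axis in each successive period (W1, W2, W3, ...). For a convex lens the
    sizes shrink outward; for a concave lens they grow. Illustrative only."""
    sign = -1.0 if convex else 1.0
    return [w_start_nm + sign * step_nm * k for k in range(periods)]

convex_w = period_widths(400.0, 50.0, 3, convex=True)    # satisfies W1 > W2 > W3
concave_w = period_widths(300.0, 50.0, 3, convex=False)  # satisfies W1 < W2 < W3
```

The same generator shape applies to the vertical widths D, the sizes J, and the pitches P, since the document states the same ordering for each.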
- Examples of electronic devices to which the imaging device 10, the imaging element 11, and the method for manufacturing the imaging device 10 and the imaging element 11 can be applied will be described below. Note that the imaging device 10, the imaging element 11, and the manufacturing method described above can also be applied to any system, device, method, etc. other than the electronic devices described below.
- the technology (the present technology) according to the present disclosure can be applied to various products.
- the technology according to the present disclosure may be realized as a device mounted on any type of moving body, such as automobiles, electric vehicles, hybrid electric vehicles, motorcycles, bicycles, personal mobility devices, airplanes, drones, ships, and robots.
- FIG. 28 is a block diagram showing a schematic configuration example of a vehicle control system, which is an example of a mobile control system to which the technology according to the present disclosure can be applied.
- a vehicle control system 12000 includes a plurality of electronic control units connected via a communication network 12001.
- the vehicle control system 12000 includes a drive system control unit 12010, a body system control unit 12020, an exterior information detection unit 12030, an interior information detection unit 12040, and an integrated control unit 12050.
- as functional components of the integrated control unit 12050, a microcomputer 12051, an audio/image output unit 12052, and an in-vehicle network I/F (interface) 12053 are illustrated.
- the drive system control unit 12010 controls the operation of devices related to the drive system of the vehicle according to various programs.
- for example, the drive system control unit 12010 functions as a control device for a driving force generation device for generating the driving force of the vehicle, such as an internal combustion engine or a drive motor, a driving force transmission mechanism for transmitting the driving force to the wheels, a steering mechanism for adjusting the steering angle of the vehicle, and a braking device for generating the braking force of the vehicle.
- the body system control unit 12020 controls the operation of various devices equipped on the vehicle body according to various programs.
- the body system control unit 12020 functions as a keyless entry system, a smart key system, a power window device, or a control device for various lamps such as headlamps, back lamps, brake lamps, winkers or fog lamps.
- the body system control unit 12020 can receive radio waves transmitted from a portable device that substitutes for a key or signals from various switches.
- the body system control unit 12020 receives the input of these radio waves or signals and controls the door lock device, power window device, lamps, etc. of the vehicle.
- the vehicle exterior information detection unit 12030 detects information outside the vehicle in which the vehicle control system 12000 is installed.
- the vehicle exterior information detection unit 12030 is connected with an imaging section 12031 .
- the vehicle exterior information detection unit 12030 causes the imaging unit 12031 to capture an image of the exterior of the vehicle, and receives the captured image.
- based on the received image, the vehicle exterior information detection unit 12030 may perform processing for detecting objects such as people, vehicles, obstacles, signs, or characters on the road surface, or processing for detecting the distance to them.
- the imaging unit 12031 is an optical sensor that receives light and outputs an electrical signal according to the amount of received light.
- the imaging unit 12031 can output the electric signal as an image, and can also output it as distance measurement information.
- the light received by the imaging unit 12031 may be visible light or non-visible light such as infrared rays.
- the in-vehicle information detection unit 12040 detects in-vehicle information.
- the in-vehicle information detection unit 12040 is connected to, for example, a driver state detection section 12041 that detects the state of the driver.
- the driver state detection unit 12041 includes, for example, a camera that captures an image of the driver. Based on the detection information input from the driver state detection unit 12041, the in-vehicle information detection unit 12040 may calculate the degree of fatigue or concentration of the driver, or may determine whether the driver is dozing off.
- the microcomputer 12051 calculates control target values for the driving force generation device, the steering mechanism, or the braking device based on the information inside and outside the vehicle acquired by the vehicle exterior information detection unit 12030 or the vehicle interior information detection unit 12040, and can output a control command to the drive system control unit 12010.
- for example, the microcomputer 12051 can perform cooperative control aimed at realizing the functions of an ADAS (Advanced Driver Assistance System), including vehicle collision avoidance or shock mitigation, follow-up driving based on inter-vehicle distance, vehicle speed maintenance driving, vehicle collision warning, and vehicle lane departure warning.
- in addition, the microcomputer 12051 can control the driving force generation device, the steering mechanism, the braking device, and the like based on the information about the vehicle surroundings acquired by the vehicle exterior information detection unit 12030 or the vehicle interior information detection unit 12040, thereby performing cooperative control for the purpose of automatic driving and the like in which the vehicle travels autonomously without depending on the driver's operation.
- the microcomputer 12051 can also output a control command to the body system control unit 12020 based on the information outside the vehicle acquired by the vehicle exterior information detection unit 12030.
- for example, the microcomputer 12051 can perform cooperative control aimed at anti-glare, such as controlling the headlamps according to the position of a preceding vehicle or an oncoming vehicle detected by the vehicle exterior information detection unit 12030 and switching from high beam to low beam.
- the audio/image output unit 12052 transmits an output signal of at least one of audio and image to an output device capable of visually or audibly notifying the passengers of the vehicle or the outside of the vehicle of information.
- an audio speaker 12061, a display unit 12062 and an instrument panel 12063 are illustrated as output devices.
- the display unit 12062 may include at least one of an on-board display and a head-up display, for example.
- FIG. 29 is a diagram showing an example of the installation position of the imaging unit 12031.
- the imaging unit 12031 has imaging units 12101, 12102, 12103, 12104, and 12105.
- the imaging units 12101, 12102, 12103, 12104, and 12105 are provided at positions such as the front nose, side mirrors, rear bumper, back door, and windshield of the vehicle 12100, for example.
- An image pickup unit 12101 provided in the front nose and an image pickup unit 12105 provided above the windshield in the passenger compartment mainly acquire images in front of the vehicle 12100 .
- Imaging units 12102 and 12103 provided in the side mirrors mainly acquire side images of the vehicle 12100 .
- An imaging unit 12104 provided in the rear bumper or back door mainly acquires an image behind the vehicle 12100 .
- the imaging unit 12105 provided above the windshield in the passenger compartment is mainly used for detecting preceding vehicles, pedestrians, obstacles, traffic lights, traffic signs, lanes, and the like.
- FIG. 29 shows an example of the imaging range of the imaging units 12101 to 12104.
- the imaging range 12111 indicates the imaging range of the imaging unit 12101 provided on the front nose, the imaging ranges 12112 and 12113 indicate the imaging ranges of the imaging units 12102 and 12103 provided on the side mirrors, respectively, and the imaging range 12114 indicates the imaging range of the imaging unit 12104 provided on the rear bumper or back door. For example, by superimposing the image data captured by the imaging units 12101 to 12104, a bird's-eye view image of the vehicle 12100 viewed from above can be obtained.
- At least one of the imaging units 12101 to 12104 may have a function of acquiring distance information.
- at least one of the imaging units 12101 to 12104 may be a stereo camera composed of a plurality of imaging elements, or may be an imaging element having pixels for phase difference detection.
- for example, the microcomputer 12051 determines the distance to each three-dimensional object within the imaging ranges 12111 to 12114 and the temporal change of this distance (relative velocity with respect to the vehicle 12100), and can thereby extract, as the preceding vehicle, the closest three-dimensional object on the traveling path of the vehicle 12100 that is traveling at a predetermined speed (for example, 0 km/h or more) in substantially the same direction as the vehicle 12100. Furthermore, the microcomputer 12051 can set an inter-vehicle distance to be secured to the preceding vehicle in advance, and perform automatic brake control (including follow-up stop control) and automatic acceleration control (including follow-up start control). In this way, cooperative control can be performed for the purpose of automatic driving in which the vehicle travels autonomously without relying on the operation of the driver.
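The selection logic just described — the closest on-path object moving in substantially the same direction at or above a predetermined speed — can be sketched as follows. The `DetectedObject` fields, thresholds, and function name are illustrative assumptions, not part of the original disclosure.

```python
from dataclasses import dataclass

@dataclass
class DetectedObject:
    distance_m: float          # distance measured via imaging units 12101-12104
    speed_kmh: float           # object speed inferred from the distance change over time
    heading_offset_deg: float  # deviation from the travel direction of vehicle 12100
    on_travel_path: bool       # whether the object lies on the traveling path

def pick_preceding_vehicle(objects, min_speed_kmh=0.0, max_heading_deg=10.0):
    """Return the closest three-dimensional object on the travel path that moves
    in substantially the same direction at the predetermined speed or more
    (e.g. 0 km/h), or None if no candidate exists. Thresholds are illustrative."""
    candidates = [o for o in objects
                  if o.on_travel_path
                  and abs(o.heading_offset_deg) <= max_heading_deg
                  and o.speed_kmh >= min_speed_kmh]
    return min(candidates, key=lambda o: o.distance_m, default=None)

objs = [DetectedObject(30.0, 50.0, 2.0, True),   # on path, same direction
        DetectedObject(18.0, 45.0, 1.0, True),   # on path, closest qualifying
        DetectedObject(10.0, 40.0, 45.0, True),  # crossing traffic, excluded
        DetectedObject(8.0, 60.0, 0.0, False)]   # off the travel path, excluded
preceding = pick_preceding_vehicle(objs)
```

The inter-vehicle distance control mentioned above would then operate on `preceding.distance_m`.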
- for example, the microcomputer 12051 can classify three-dimensional object data into motorcycles, ordinary vehicles, large vehicles, pedestrians, and other three-dimensional objects such as utility poles, extract them, and use them for automatic obstacle avoidance. For example, the microcomputer 12051 distinguishes obstacles around the vehicle 12100 into those visible to the driver of the vehicle 12100 and those difficult to see. The microcomputer 12051 then judges the collision risk, which indicates the degree of danger of collision with each obstacle; when the collision risk is equal to or higher than a set value and there is a possibility of collision, it can output an alarm to the driver via the audio speaker 12061 and the display unit 12062, or perform forced deceleration and avoidance steering via the drive system control unit 12010, thereby providing driving support for collision avoidance.
- At least one of the imaging units 12101 to 12104 may be an infrared camera that detects infrared rays.
- the microcomputer 12051 can recognize a pedestrian by determining whether or not the pedestrian exists in the captured images of the imaging units 12101 to 12104 .
- recognition of a pedestrian is performed by, for example, a procedure for extracting feature points in the images captured by the imaging units 12101 to 12104 as infrared cameras, and a procedure for performing pattern matching processing on a series of feature points indicating the outline of an object to determine whether or not it is a pedestrian.
- when a pedestrian is recognized, the audio/image output unit 12052 controls the display unit 12062 so that a rectangular contour line for emphasis is superimposed on the recognized pedestrian. The audio/image output unit 12052 may also control the display unit 12062 to display an icon or the like indicating the pedestrian at a desired position.
- the technology according to the present disclosure can be applied to cameras including, for example, the imaging units 12031, 12101, 12102, 12103, 12104, and 12105 and the driver state detection unit 12041 among the configurations described above. In these cases as well, it is advantageous for acquiring high-quality images with a compact device configuration.
- the technology (the present technology) according to the present disclosure can be applied to various products.
- the technology according to the present disclosure may be applied to an endoscopic surgery system.
- FIG. 30 is a diagram showing an example of a schematic configuration of an endoscopic surgery system to which the technology according to the present disclosure (this technology) can be applied.
- FIG. 30 shows a state in which an operator (doctor) 11131 is performing surgery on a patient 11132 on a patient bed 11133 using an endoscopic surgery system 11000 .
- an endoscopic surgery system 11000 includes an endoscope 11100, other surgical instruments 11110 such as a pneumoperitoneum tube 11111 and an energy treatment instrument 11112, a support arm device 11120 that supports the endoscope 11100, and a cart 11200 loaded with various devices for endoscopic surgery.
- An endoscope 11100 is composed of a lens barrel 11101 whose distal end is inserted into the body cavity of a patient 11132 and a camera head 11102 connected to the proximal end of the lens barrel 11101 .
- in the illustrated example, the endoscope 11100 is configured as a so-called rigid scope having a rigid lens barrel 11101, but the endoscope 11100 may instead be configured as a so-called flexible scope having a flexible lens barrel.
- the tip of the lens barrel 11101 is provided with an opening into which the objective lens is fitted.
- a light source device 11203 is connected to the endoscope 11100, and light generated by the light source device 11203 is guided to the tip of the lens barrel 11101 by a light guide extending inside the lens barrel, and is irradiated through the objective lens toward the observation target in the body cavity of the patient 11132.
- the endoscope 11100 may be a straight scope, a perspective scope, or a side scope.
- An optical system and an imaging element are provided inside the camera head 11102, and the reflected light (observation light) from the observation target is focused on the imaging element by the optical system.
- the imaging element photoelectrically converts the observation light to generate an electric signal corresponding to the observation light, that is, an image signal corresponding to the observation image.
- the image signal is transmitted to a camera control unit (CCU: Camera Control Unit) 11201 as RAW data.
- the CCU 11201 is composed of a CPU (Central Processing Unit), a GPU (Graphics Processing Unit), etc., and controls the operations of the endoscope 11100 and the display device 11202 in an integrated manner. Further, the CCU 11201 receives an image signal from the camera head 11102 and performs various image processing such as development processing (demosaicing) for displaying an image based on the image signal.
- the display device 11202 displays an image based on an image signal subjected to image processing by the CCU 11201 under the control of the CCU 11201 .
- the light source device 11203 is composed of a light source such as an LED (light emitting diode), for example, and supplies the endoscope 11100 with irradiation light for imaging a surgical site or the like.
- the input device 11204 is an input interface for the endoscopic surgery system 11000.
- the user can input various information and instructions to the endoscopic surgery system 11000 via the input device 11204 .
- for example, the user inputs an instruction to change the imaging conditions of the endoscope 11100 (type of irradiation light, magnification, focal length, etc.).
- the treatment instrument control device 11205 controls driving of the energy treatment instrument 11112 for tissue cauterization, incision, blood vessel sealing, or the like.
- the pneumoperitoneum device 11206 sends gas into the body cavity of the patient 11132 through the pneumoperitoneum tube 11111 to inflate the body cavity, for the purpose of securing the field of view of the endoscope 11100 and securing working space for the operator.
- the recorder 11207 is a device capable of recording various types of information regarding surgery.
- the printer 11208 is a device capable of printing various types of information regarding surgery in various formats such as text, images, and graphs.
- the light source device 11203 that supplies the endoscope 11100 with irradiation light for photographing the surgical site can be composed of, for example, a white light source composed of an LED, a laser light source, or a combination thereof.
- when a white light source is configured by a combination of RGB laser light sources, the output intensity and output timing of each color (each wavelength) can be controlled with high accuracy, so the white balance of the captured image can be adjusted in the light source device 11203.
- the laser light from each of the RGB laser light sources may be irradiated onto the observation target in a time-division manner, and by controlling the driving of the imaging element of the camera head 11102 in synchronization with the irradiation timing, images corresponding to each of R, G, and B can be captured in a time-division manner. According to this method, a color image can be obtained without providing a color filter in the imaging element.
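As a rough sketch of this time-division scheme, the three monochrome frames captured during the R, G, and B laser firings can simply be stacked into one color image. The function name, the 2×2 sensor, and the pixel values below are illustrative assumptions, not details from this document:

```python
import numpy as np

def merge_time_division_frames(r_frame, g_frame, b_frame):
    # Each argument is a monochrome frame captured while only the
    # corresponding laser was firing; stacking them along a new last
    # axis yields a color image without any color filter on the sensor.
    if not (r_frame.shape == g_frame.shape == b_frame.shape):
        raise ValueError("frames must share the same sensor geometry")
    return np.stack([r_frame, g_frame, b_frame], axis=-1)

# Tiny illustrative 2x2 sensor readouts for the three firings.
r = np.full((2, 2), 200, dtype=np.uint8)
g = np.full((2, 2), 120, dtype=np.uint8)
b = np.full((2, 2), 40, dtype=np.uint8)
color = merge_time_division_frames(r, g, b)  # shape (2, 2, 3)
```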
- the driving of the light source device 11203 may be controlled so as to change the intensity of the output light at predetermined time intervals.
- by controlling the driving of the imaging element of the camera head 11102 in synchronization with the timing of the change in light intensity to acquire images in a time-division manner, and by combining those images, an image with a high dynamic range can be generated.
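One minimal way to picture the synthesis step is an exposure-weighted average: each frame is normalized by the relative light intensity it was captured under, and pixels near clipping get low weight. This is a generic HDR sketch under assumed names and weights, not the CCU 11201's actual algorithm:

```python
import numpy as np

def merge_exposures(frames, exposures):
    # Normalize each frame by its relative exposure (light intensity)
    # and average, trusting mid-range pixels most; pixels near 0 or 255
    # receive weight close to zero because they carry clipped data.
    frames = [np.asarray(f, dtype=np.float64) for f in frames]
    acc = np.zeros_like(frames[0])
    weight = np.zeros_like(frames[0])
    for frame, exposure in zip(frames, exposures):
        w = 1.0 - np.abs(frame / 255.0 - 0.5) * 2.0
        acc += w * frame / exposure
        weight += w
    return acc / np.maximum(weight, 1e-9)

dark = np.array([[10.0, 128.0]])    # frame under low light intensity
bright = np.array([[40.0, 255.0]])  # same scene under 4x the intensity
radiance = merge_exposures([dark, bright], [1.0, 4.0])
```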
- the light source device 11203 may be configured to be able to supply light in a predetermined wavelength band corresponding to special light observation.
- in special light observation, for example, so-called narrow band imaging is performed, in which the wavelength dependence of light absorption in body tissue is utilized: by irradiating light in a band narrower than that of the irradiation light used during normal observation (i.e., white light), a predetermined tissue such as a blood vessel in the mucosal surface layer is imaged with high contrast.
- fluorescence observation may be performed in which an image is obtained from fluorescence generated by irradiation with excitation light.
- in fluorescence observation, the body tissue is irradiated with excitation light and the fluorescence from the body tissue is observed (autofluorescence observation), or a reagent such as indocyanine green (ICG) is locally injected into the body tissue and the body tissue is irradiated with excitation light corresponding to the fluorescence wavelength of the reagent to obtain a fluorescence image.
- the light source device 11203 can be configured to be able to supply narrowband light and/or excitation light corresponding to such special light observation.
- FIG. 31 is a block diagram showing an example of functional configurations of the camera head 11102 and CCU 11201 shown in FIG.
- the camera head 11102 has a lens unit 11401, an imaging section 11402, a drive section 11403, a communication section 11404, and a camera head control section 11405.
- the CCU 11201 has a communication section 11411 , an image processing section 11412 and a control section 11413 .
- the camera head 11102 and the CCU 11201 are communicably connected to each other via a transmission cable 11400 .
- a lens unit 11401 is an optical system provided at a connection with the lens barrel 11101 . Observation light captured from the tip of the lens barrel 11101 is guided to the camera head 11102 and enters the lens unit 11401 .
- a lens unit 11401 is configured by combining a plurality of lenses including a zoom lens and a focus lens.
- the number of imaging elements constituting the imaging unit 11402 may be one (so-called single-plate type) or plural (so-called multi-plate type).
- image signals corresponding to RGB may be generated by each image pickup element, and a color image may be obtained by synthesizing the image signals.
- the imaging unit 11402 may be configured to have a pair of imaging elements for respectively acquiring right-eye and left-eye image signals corresponding to 3D (three-dimensional) display.
- the 3D display enables the operator 11131 to more accurately grasp the depth of the living tissue in the surgical site.
- a plurality of systems of lens units 11401 may be provided corresponding to each imaging element.
- the imaging unit 11402 does not necessarily have to be provided in the camera head 11102 .
- the imaging unit 11402 may be provided inside the lens barrel 11101 immediately after the objective lens.
- the drive unit 11403 is configured by an actuator, and moves the zoom lens and focus lens of the lens unit 11401 by a predetermined distance along the optical axis under control from the camera head control unit 11405 . Thereby, the magnification and focus of the image captured by the imaging unit 11402 can be appropriately adjusted.
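The spacing the actuator must realize for a given subject distance follows the textbook thin-lens relation 1/f = 1/do + 1/di. The sketch below is that generic relation only, not the patent's drive control, and the function name and numbers are illustrative:

```python
def image_distance(focal_length_mm, subject_distance_mm):
    # Thin-lens equation solved for the image-side distance di; the
    # focus actuator must set the lens-to-sensor spacing to di.
    if subject_distance_mm <= focal_length_mm:
        raise ValueError("subject inside the focal length forms no real image")
    return 1.0 / (1.0 / focal_length_mm - 1.0 / subject_distance_mm)

di_far = image_distance(10.0, 1_000_000.0)  # distant subject: di is close to f
di_near = image_distance(10.0, 100.0)       # near subject: lens moves outward
```

Moving from the far subject to the near one, the required spacing grows from roughly f toward f·do/(do−f), which is why the actuator drives the lens along the optical axis.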
- the communication unit 11404 is composed of a communication device for transmitting and receiving various information to and from the CCU 11201.
- the communication unit 11404 transmits the image signal obtained from the imaging unit 11402 as RAW data to the CCU 11201 via the transmission cable 11400 .
- the communication unit 11404 receives a control signal for controlling driving of the camera head 11102 from the CCU 11201 and supplies it to the camera head control unit 11405 .
- the control signal includes information about imaging conditions, such as information specifying the frame rate of the captured image, information specifying the exposure value at the time of imaging, and/or information specifying the magnification and focus of the captured image.
- imaging conditions such as the frame rate, exposure value, magnification, and focus may be appropriately designated by the user, or may be automatically set by the control unit 11413 of the CCU 11201 based on the acquired image signal.
- in the latter case, the endoscope 11100 is equipped with a so-called AE (Auto Exposure) function, AF (Auto Focus) function, and AWB (Auto White Balance) function.
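An AE function of the kind mentioned here can be pictured as a feedback loop that nudges the exposure value until the measured mean brightness reaches a target. The target level, gain constant, and function name below are illustrative assumptions, not the endoscope's actual control law:

```python
def auto_exposure_step(mean_brightness, exposure, target=118.0, gain=0.5):
    # One AE iteration: scale the exposure toward the value that would
    # bring the mean brightness to the target, damped by 'gain' so the
    # loop converges instead of oscillating frame to frame.
    if mean_brightness <= 0:
        return exposure * 2.0  # scene too dark to meter; open up
    ratio = target / mean_brightness
    return exposure * (1.0 + gain * (ratio - 1.0))

exp = auto_exposure_step(59.0, 1.0)  # frame metered at half the target
```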
- the camera head control unit 11405 controls driving of the camera head 11102 based on the control signal from the CCU 11201 received via the communication unit 11404.
- the communication unit 11411 is composed of a communication device for transmitting and receiving various information to and from the camera head 11102 .
- the communication unit 11411 receives image signals transmitted from the camera head 11102 via the transmission cable 11400 .
- the communication unit 11411 transmits a control signal for controlling driving of the camera head 11102 to the camera head 11102 .
- Image signals and control signals can be transmitted by electric communication, optical communication, or the like.
- the image processing unit 11412 performs various types of image processing on the image signal, which is RAW data transmitted from the camera head 11102 .
- the control unit 11413 performs various controls related to imaging of the surgical site and the like by the endoscope 11100 and display of the captured image obtained by imaging the surgical site and the like. For example, the control unit 11413 generates control signals for controlling driving of the camera head 11102 .
- control unit 11413 causes the display device 11202 to display a captured image showing the surgical site and the like based on the image signal that has undergone image processing by the image processing unit 11412 .
- the control unit 11413 may recognize various objects in the captured image using various image recognition techniques. For example, by detecting the shape, color, and the like of the edges of objects included in the captured image, the control unit 11413 can recognize surgical tools such as forceps, specific body parts, bleeding, mist during use of the energy treatment tool 11112, and the like.
- the control unit 11413 may use the recognition result to display various types of surgical assistance information superimposed on the image of the surgical site. By superimposing and presenting the surgery support information to the operator 11131, the burden on the operator 11131 can be reduced and the operator 11131 can proceed with the surgery reliably.
- a transmission cable 11400 connecting the camera head 11102 and the CCU 11201 is an electrical signal cable compatible with electrical signal communication, an optical fiber compatible with optical communication, or a composite cable of these.
- wired communication is performed using the transmission cable 11400, but communication between the camera head 11102 and the CCU 11201 may be performed wirelessly.
- the technology according to the present disclosure may also be applied to, for example, a microsurgery system.
- the technical categories that embody the above technical ideas are not limited.
- the above technical ideas may be embodied by a computer program for causing a computer to execute one or more procedures (steps) included in the method of manufacturing or using the above apparatus.
- the above technical idea may be embodied by a computer-readable non-transitory recording medium in which such a computer program is recorded.
- a pixel substrate having an image sensor on which imaging light is incident; a translucent cover body facing the image sensor; and a diffractive lens having a plurality of projecting lens portions projecting from the cover body toward the image sensor, wherein spaces are provided between the plurality of projecting lens portions. An imaging element.
- each of the plurality of lens-constituting layers includes the cover body and the diffractive lens.
- the planar sizes of the plurality of projecting lens portions change periodically based on the distance from the optical axis, and the period of the change in the planar sizes is based on a 360° phase difference of the optical diffraction of the plurality of projecting lens portions.
- the imaging element according to any one of items 1 to 5, wherein the planar size of the projecting lens portion decreases with increasing distance from the optical axis in each period.
- the planar sizes of the plurality of projecting lens portions change periodically based on the distance from the optical axis, and the period of the change in the planar sizes is based on a 360° phase difference of the optical diffraction of the plurality of projecting lens portions.
- the imaging element according to any one of items 1 to 6, wherein the planar size of the projecting lens portion increases with increasing distance from the optical axis in each period.
- [Item 10] the imaging element according to any one of items 1 to 9, comprising a support that supports the pixel substrate, and a fixing portion that is located between the support and the cover substrate and fixes the cover body to the support.
- a pixel substrate having an image sensor on which imaging light is incident; a translucent cover body facing the image sensor; a diffractive lens having a plurality of projecting lens portions projecting from the cover body toward the image sensor; and an imaging lens located on the opposite side of the cover body from the pixel substrate, wherein spaces are formed between the plurality of projecting lens portions. An imaging device.
- a step of applying an inorganic film onto the pixel substrate.
- 10 Imaging device, 11 Imaging element, 21 Geometric optical lens, 22 Diffractive lens, 23 Projecting lens portion, 24 Air gap, 31 Lower substrate, 32 Upper substrate, 33 Pixel substrate, 34 Solder ball, 35 Color filter, 36 On-chip lens, 37 Sealing resin, 38 Cover body, 40 Image sensor, 41 Lens base film, 42 Resist, 45 Cover wafer, 46 Substrate wafer, 48 Mask, 49 Exposure device, 50 Inorganic film, 51 Film forming device, 55 Lens-constituting layer, 56 Diffractive lens unit, 57 Adhesive layer, 60 Support, 61 Adhesive layer, 62 Wire bonding wiring, Ax Optical axis, D Vertical width, L Imaging light, L1 Short-wavelength light, L2 Long-wavelength light, P Pitch, S1 First period, S2 Second period, S3 Third period, s Phase difference, W Horizontal width, θ Refraction angle
Abstract
Description
FIG. 4 is a cross-sectional view showing an example of the imaging element 11 according to the first embodiment. FIG. 5 is an enlarged cross-sectional view showing a part of the imaging element 11 shown in FIG. 4.
In this embodiment, elements identical or corresponding to those of the first embodiment described above are denoted by the same reference numerals, and detailed description thereof is omitted.
In this embodiment, elements identical or corresponding to those of the first and second embodiments described above are denoted by the same reference numerals, and detailed description thereof is omitted.
In this embodiment, elements identical or corresponding to those of the first to third embodiments described above are denoted by the same reference numerals, and detailed description thereof is omitted.
In this embodiment, elements identical or corresponding to those of the first to fourth embodiments described above are denoted by the same reference numerals, and detailed description thereof is omitted.
In this embodiment, elements identical or corresponding to those of the first to fifth embodiments described above are denoted by the same reference numerals, and detailed description thereof is omitted.
Next, a structural example of the diffractive lens 22 is described.
Examples of electronic apparatuses to which the above-described imaging device 10, imaging element 11, and methods of manufacturing the imaging device 10 and the imaging element 11 can be applied are described below. Note that the above-described imaging device 10, imaging element 11, and manufacturing methods are also applicable to any systems, apparatuses, methods, and the like other than the electronic apparatuses described below.
The technology according to the present disclosure (the present technology) can be applied to various products. For example, the technology according to the present disclosure may be realized as a device mounted on any type of mobile body, such as an automobile, electric vehicle, hybrid electric vehicle, motorcycle, bicycle, personal mobility device, airplane, drone, ship, or robot.
The technology according to the present disclosure (the present technology) can be applied to various products. For example, the technology according to the present disclosure may be applied to an endoscopic surgery system.
The present disclosure can also adopt the following configurations.
A pixel substrate including an image sensor on which imaging light is incident;
a translucent cover body facing the image sensor; and
a diffractive lens having a plurality of projecting lens portions projecting from the cover body toward the image sensor,
wherein spaces are provided between the plurality of projecting lens portions.
An imaging element.
Comprising a photocurable resin film that is located between the image sensor and the diffractive lens and is in contact with the plurality of projecting lens portions.
The imaging element according to item 1.
Comprising an inorganic film that is located between the image sensor and the diffractive lens and is in contact with the plurality of projecting lens portions.
The imaging element according to item 1 or 2.
Comprising a plurality of lens-constituting layers stacked on one another,
wherein each of the plurality of lens-constituting layers includes the cover body and the diffractive lens.
The imaging element according to any one of items 1 to 3.
The diffractive lens is located at a distance of 60 μm or less from the image sensor.
The imaging element according to any one of items 1 to 4.
The planar sizes of the plurality of projecting lens portions change periodically based on the distance from the optical axis, and the period of the change in the planar sizes of the plurality of projecting lens portions is based on a 360° phase difference of the optical diffraction of the plurality of projecting lens portions,
and in each period, the planar size of the projecting lens portion decreases with increasing distance from the optical axis.
The imaging element according to any one of items 1 to 5.
The planar sizes of the plurality of projecting lens portions change periodically based on the distance from the optical axis, and the period of the change in the planar sizes of the plurality of projecting lens portions is based on a 360° phase difference of the optical diffraction of the plurality of projecting lens portions,
and in each period, the planar size of the projecting lens portion increases with increasing distance from the optical axis.
The imaging element according to any one of items 1 to 6.
Comprising a fixing portion that is located between the pixel substrate and the cover body and fixes the cover body to the pixel substrate.
The imaging element according to any one of items 1 to 7.
The imaging element according to any one of items 1 to 8, wherein a space is provided between the image sensor and the diffractive lens.
A support that supports the pixel substrate; and
a fixing portion that is located between the support and the cover substrate and fixes the cover body to the support.
The imaging element according to any one of items 1 to 9.
A pixel substrate including an image sensor on which imaging light is incident;
a translucent cover body facing the image sensor;
a diffractive lens having a plurality of projecting lens portions projecting from the cover body toward the image sensor; and
an imaging lens located on the opposite side of the cover body from the pixel substrate,
wherein spaces are formed between the plurality of projecting lens portions.
An imaging device.
The diffractive lens reduces chromatic aberration of the imaging lens.
The imaging device according to item 11.
The diffractive lens emits the imaging light at a chief ray incident angle smaller than the chief ray incident angle of the imaging light traveling from the imaging lens toward the diffractive lens.
The imaging device according to item 11 or 12.
Including a step of fixing a translucent cover body to a pixel substrate having an image sensor,
wherein a plurality of projecting lens portions that constitute a diffractive lens and have spaces provided between one another are fixed to the cover body,
and the cover body is fixed to the pixel substrate such that the plurality of projecting lens portions are located between the cover body and the pixel substrate.
A method of manufacturing an imaging element.
A step of applying a photocurable resin onto the pixel substrate; and
a step of curing, by light irradiation, a portion of the photocurable resin on the pixel substrate that covers the image sensor,
wherein the cover body is fixed to the pixel substrate while the plurality of projecting lens portions face the portion of the photocurable resin on the pixel substrate that has been cured by the light irradiation.
The method of manufacturing an imaging element according to item 14.
Including a step of applying an inorganic film onto the pixel substrate,
wherein the cover body is fixed to the pixel substrate while the plurality of projecting lens portions face the inorganic film.
The method of manufacturing an imaging element according to item 14 or 15.
11 Imaging element
21 Geometric optical lens
22 Diffractive lens
23 Projecting lens portion
24 Air gap
31 Lower substrate
32 Upper substrate
33 Pixel substrate
34 Solder ball
35 Color filter
36 On-chip lens
37 Sealing resin
38 Cover body
40 Image sensor
41 Lens base film
42 Resist
45 Cover wafer
46 Substrate wafer
48 Mask
49 Exposure device
50 Inorganic film
51 Film forming device
55 Lens-constituting layer
56 Diffractive lens unit
57 Adhesive layer
60 Support
61 Adhesive layer
62 Wire bonding wiring
Ax Optical axis
D Vertical width
L Imaging light
L1 Short-wavelength light
L2 Long-wavelength light
P Pitch
S1 First period
S2 Second period
S3 Third period
s Phase difference
W Horizontal width
θ Refraction angle
Claims (16)
- A pixel substrate including an image sensor;
a translucent cover body facing the image sensor; and
a diffractive lens having a plurality of projecting lens portions projecting from the cover body toward the image sensor,
wherein spaces are provided between the plurality of projecting lens portions.
An imaging element. - Comprising a photocurable resin film that is located between the image sensor and the diffractive lens and is in contact with the plurality of projecting lens portions,
the imaging element according to claim 1. - Comprising an inorganic film that is located between the image sensor and the diffractive lens and is in contact with the plurality of projecting lens portions,
the imaging element according to claim 1. - Comprising a plurality of lens-constituting layers stacked on one another,
wherein each of the plurality of lens-constituting layers includes the cover body and the diffractive lens,
the imaging element according to claim 1. - The diffractive lens is located at a distance of 60 μm or less from the image sensor,
the imaging element according to claim 1. - The planar sizes of the plurality of projecting lens portions change periodically based on the distance from the optical axis, and the period of the change in the planar sizes is based on a 360° phase difference of the optical diffraction of the plurality of projecting lens portions,
and in each period, the planar size of the projecting lens portion decreases with increasing distance from the optical axis,
the imaging element according to claim 1. - The planar sizes of the plurality of projecting lens portions change periodically based on the distance from the optical axis, and the period of the change in the planar sizes is based on a 360° phase difference of the optical diffraction of the plurality of projecting lens portions,
and in each period, the planar size of the projecting lens portion increases with increasing distance from the optical axis,
the imaging element according to claim 1. - Comprising a fixing portion that is located between the pixel substrate and the cover body and fixes the cover body to the pixel substrate,
the imaging element according to claim 1. - The imaging element according to claim 1, wherein a space is provided between the image sensor and the diffractive lens.
- A support that supports the pixel substrate; and
a fixing portion that is located between the support and the cover substrate and fixes the cover body to the support,
the imaging element according to claim 1. - A pixel substrate including an image sensor;
a translucent cover body facing the image sensor;
a diffractive lens having a plurality of projecting lens portions projecting from the cover body toward the image sensor; and
an imaging lens located on the opposite side of the cover body from the pixel substrate,
wherein spaces are formed between the plurality of projecting lens portions.
An imaging device. - The diffractive lens reduces chromatic aberration of the imaging lens,
the imaging device according to claim 11. - The diffractive lens emits light at a chief ray incident angle smaller than the chief ray incident angle of the light traveling from the imaging lens toward the diffractive lens,
the imaging device according to claim 11. - Including a step of fixing a translucent cover body to a pixel substrate having an image sensor,
wherein a plurality of projecting lens portions that constitute a diffractive lens and have spaces provided between one another are fixed to the cover body,
and the cover body is fixed to the pixel substrate such that the plurality of projecting lens portions are located between the cover body and the pixel substrate.
A method of manufacturing an imaging element. - A step of applying a photocurable resin onto the pixel substrate; and
a step of curing, by light irradiation, a portion of the photocurable resin on the pixel substrate that covers the image sensor,
wherein the cover body is fixed to the pixel substrate while the plurality of projecting lens portions face the portion of the photocurable resin cured by the light irradiation,
the method of manufacturing an imaging element according to claim 14. - Including a step of applying an inorganic film onto the pixel substrate,
wherein the cover body is fixed to the pixel substrate while the plurality of projecting lens portions face the inorganic film,
the method of manufacturing an imaging element according to claim 14.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE112022001388.5T DE112022001388T5 (de) | 2021-03-09 | 2022-01-19 | Bildgebungselement, bildgebungsvorrichtung und verfahren zum herstellen eines bildgebungselements |
US18/548,647 US20240145510A1 (en) | 2021-03-09 | 2022-01-19 | Imaging element, imaging device, and method for manufacturing imaging element |
CN202280010971.7A CN116783710A (zh) | 2021-03-09 | 2022-01-19 | 摄像元件、摄像装置和摄像元件的制造方法 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2021037662A JP2022137929A (ja) | 2021-03-09 | 2021-03-09 | 撮像素子、撮像装置、及び撮像素子の製造方法 |
JP2021-037662 | 2021-03-09 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022190653A1 true WO2022190653A1 (ja) | 2022-09-15 |
Family
ID=83226587
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2022/001700 WO2022190653A1 (ja) | 2021-03-09 | 2022-01-19 | 撮像素子、撮像装置、及び撮像素子の製造方法 |
Country Status (5)
Country | Link |
---|---|
US (1) | US20240145510A1 (ja) |
JP (1) | JP2022137929A (ja) |
CN (1) | CN116783710A (ja) |
DE (1) | DE112022001388T5 (ja) |
WO (1) | WO2022190653A1 (ja) |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2009145809A (ja) * | 2007-12-18 | 2009-07-02 | Fujinon Corp | 撮像レンズおよび撮像装置 |
JP2010098066A (ja) * | 2008-10-15 | 2010-04-30 | Olympus Corp | 固体撮像装置、固体撮像装置の製造方法 |
JP2013038164A (ja) * | 2011-08-05 | 2013-02-21 | Sony Corp | 固体撮像装置、電子機器 |
-
2021
- 2021-03-09 JP JP2021037662A patent/JP2022137929A/ja active Pending
-
2022
- 2022-01-19 CN CN202280010971.7A patent/CN116783710A/zh active Pending
- 2022-01-19 US US18/548,647 patent/US20240145510A1/en active Pending
- 2022-01-19 DE DE112022001388.5T patent/DE112022001388T5/de active Pending
- 2022-01-19 WO PCT/JP2022/001700 patent/WO2022190653A1/ja active Application Filing
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2009145809A (ja) * | 2007-12-18 | 2009-07-02 | Fujinon Corp | 撮像レンズおよび撮像装置 |
JP2010098066A (ja) * | 2008-10-15 | 2010-04-30 | Olympus Corp | 固体撮像装置、固体撮像装置の製造方法 |
JP2013038164A (ja) * | 2011-08-05 | 2013-02-21 | Sony Corp | 固体撮像装置、電子機器 |
Also Published As
Publication number | Publication date |
---|---|
CN116783710A (zh) | 2023-09-19 |
US20240145510A1 (en) | 2024-05-02 |
DE112022001388T5 (de) | 2023-12-28 |
JP2022137929A (ja) | 2022-09-22 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2017159174A1 (ja) | 固体撮像装置、及び、固体撮像装置の製造方法 | |
WO2018139254A1 (en) | Camera module, method of producing the same, and electronic device | |
US20220115427A1 (en) | Solid-state imaging device and electronic apparatus | |
WO2019220696A1 (ja) | 撮像素子および撮像装置 | |
JPWO2019235250A1 (ja) | 撮像装置 | |
US20220231062A1 (en) | Imaging device, method of producing imaging device, imaging apparatus, and electronic apparatus | |
US20220068991A1 (en) | Imaging element and manufacturing method of imaging element | |
WO2021193266A1 (ja) | 固体撮像装置 | |
WO2019065295A1 (ja) | 撮像素子およびその製造方法、並びに電子機器 | |
WO2023013444A1 (ja) | 撮像装置 | |
WO2022190653A1 (ja) | 撮像素子、撮像装置、及び撮像素子の製造方法 | |
US20230411430A1 (en) | Solid-state imaging device and electronic apparatus | |
WO2020246293A1 (ja) | 撮像装置 | |
WO2021049302A1 (ja) | 撮像装置、電子機器、製造方法 | |
WO2019097909A1 (ja) | 半導体素子、半導体装置および半導体素子の製造方法 | |
TWI826558B (zh) | 攝像元件及攝像元件之製造方法 | |
JP2020064893A (ja) | センサモジュールおよび電子機器 | |
WO2021140936A1 (ja) | 受光装置 | |
US20230299110A1 (en) | Sensor device and electronic apparatus | |
US20230343803A1 (en) | Semiconductor device, method of producing the same, and electronic apparatus | |
WO2023013394A1 (ja) | 撮像装置 | |
WO2024075253A1 (ja) | 光検出装置および電子機器 | |
WO2021095562A1 (ja) | 撮像装置および電子機器 | |
WO2020100709A1 (ja) | 固体撮像装置及び電子機器 | |
WO2020158216A1 (ja) | 固体撮像装置及び電子機器 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 22766621 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 202280010971.7 Country of ref document: CN |
|
WWE | Wipo information: entry into national phase |
Ref document number: 18548647 Country of ref document: US |
|
WWE | Wipo information: entry into national phase |
Ref document number: 112022001388 Country of ref document: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 22766621 Country of ref document: EP Kind code of ref document: A1 |