WO2019207978A1 - Image capture element and method of manufacturing image capture element - Google Patents


Info

Publication number
WO2019207978A1
WO2019207978A1 (PCT/JP2019/009673)
Authority
WO
WIPO (PCT)
Prior art keywords
light
pixel
photoelectric conversion
color filter
incident
Prior art date
Application number
PCT/JP2019/009673
Other languages
French (fr)
Japanese (ja)
Inventor
Hironobu Fukagawa
Kenji Ikeda
Original Assignee
Sony Semiconductor Solutions Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Semiconductor Solutions Corporation
Publication of WO2019207978A1 publication Critical patent/WO2019207978A1/en


Classifications

    • HELECTRICITY
    • H01ELECTRIC ELEMENTS
    • H01LSEMICONDUCTOR DEVICES NOT COVERED BY CLASS H10
    • H01L27/00Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate
    • H01L27/14Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate including semiconductor components sensitive to infrared radiation, light, electromagnetic radiation of shorter wavelength or corpuscular radiation and specially adapted either for the conversion of the energy of such radiation into electrical energy or for the control of electrical energy by such radiation
    • H01L27/144Devices controlled by radiation
    • H01L27/146Imager structures
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00Circuitry of solid-state image sensors [SSIS]; Control thereof
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/60Noise processing, e.g. detecting, correcting, reducing or removing noise

Definitions

  • The present disclosure relates to an image sensor and a method for manufacturing the image sensor. Specifically, the present disclosure relates to an image sensor having a light shielding region outside the region where pixels are arranged, and a method for manufacturing such an image sensor.
  • In imaging apparatuses such as cameras, apparatuses that have been downsized by arranging the imaging element close to the photographing lens are used.
  • In such an apparatus, a phenomenon is known in which the amount of light incident on the peripheral pixels is reduced relative to the amount of light incident on the central pixels of the image sensor, so that the sensitivity of the peripheral pixels of the image sensor decreases. This phenomenon is called shading.
  • Pupil correction is performed to correct this decrease in sensitivity.
  • An on-chip lens is disposed in the above-described pixel, and light from the subject is condensed on the pixel by the on-chip lens. Since the light from the subject is vertically incident on the pixel arranged at the center of the pixel region, the on-chip lens is arranged at the center of the pixel. On the other hand, in the pixels arranged at the peripheral edge of the image sensor, the light from the subject is incident obliquely, so that the on-chip lens is shifted from the center of the pixel toward the center of the image sensor. Thereby, oblique incident light can be condensed on the pixel, and shading can be corrected.
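  • The pupil-correction shift described above can be illustrated with a small numeric sketch. The function below is purely illustrative (its name, the linear shift rule, and the pixel coordinates are assumptions, not the patent's own formula): it offsets each on-chip lens toward the array center in proportion to the pixel's distance from the center, so the central pixel is unshifted and peripheral pixels receive the largest shift.

```python
def lens_shift(px, py, cx, cy, max_shift):
    """Shift an on-chip lens toward the array center (cx, cy).

    The shift grows linearly with the pixel's distance from the center:
    the central pixel gets no shift, corner pixels get max_shift.
    """
    dx, dy = px - cx, py - cy
    dist = (dx * dx + dy * dy) ** 0.5
    # Normalize by the largest possible distance (a corner pixel).
    max_dist = (cx * cx + cy * cy) ** 0.5
    if dist == 0 or max_dist == 0:
        return (0.0, 0.0)
    scale = max_shift * dist / max_dist
    # The minus sign points the shift back toward the array center.
    return (-scale * dx / dist, -scale * dy / dist)
```

For a hypothetical 101x101 array centered at (50, 50), a left-edge pixel's lens is shifted in the +x direction, i.e. toward the array center, mirroring how the on-chip lens of a peripheral pixel is shifted toward the center of the image sensor.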
  • For an imaging device in which an upper-layer film and an on-chip lens are sequentially stacked on a plurality of pixels formed on a semiconductor substrate, an imaging device that changes the correction amount of the on-chip lens position used for pupil correction has been proposed (see, for example, Patent Document 1). In this device, the correction amount of the on-chip lens position is changed according to the distance from the center of the region where the pixels of the image sensor are arranged to the on-chip lens, and according to the film thickness of the upper-layer film at the position of the on-chip lens.
  • An insulating film, a color filter, and a planarizing film are disposed between the on-chip lens and the semiconductor substrate.
  • These insulating film, color filter, and planarizing film constitute the above-mentioned upper layer film.
  • However, the planarization film is a film for planarizing the surface of the color filter; its thickness is therefore relatively large and changes greatly within the pixel region. Since the conventional technique described above corrects the decrease in sensitivity by pupil correction, it cannot sufficiently correct the decrease in sensitivity when the film thickness of the planarization film or the like changes greatly.
  • the present disclosure has been made in view of the above-described problems, and an object thereof is to reduce a change in sensitivity in pixels arranged at a peripheral portion of a pixel region.
  • The present disclosure has been made in order to solve the above-described problems. A first aspect of the present disclosure is an image sensor comprising: a pixel region in which pixels are arranged, each pixel including a photoelectric conversion unit that performs photoelectric conversion based on incident light, a light shielding film in which an opening is arranged so as to block the incident light while transmitting it through the opening, a color filter that transmits light of a predetermined wavelength out of the incident light, a planarization film that planarizes the surface of the color filter, and an on-chip lens disposed adjacent to the planarization film that condenses the incident light on the photoelectric conversion unit through the opening of the light shielding film; and a light shielding region adjacent to the pixel region, in which light-shielded pixels are arranged, each being a pixel including a light shielding film in which no opening is disposed; wherein the amount of light incident on the photoelectric conversion unit is adjusted in the pixels in the vicinity of the light shielding region.
  • the amount of light incident on the photoelectric conversion unit in the pixel in the vicinity of the light shielding region may be further adjusted according to the color filter of the pixel.
  • the amount of light incident on the photoelectric conversion unit in the pixel near the light shielding region may be adjusted by changing the shape of the light shielding film.
  • the amount of light incident on the photoelectric conversion unit in the pixel in the vicinity of the light shielding region may be adjusted by changing the shape of the on-chip lens.
  • the amount of light incident on the photoelectric conversion unit in the pixel in the vicinity of the light shielding region may be adjusted by changing the refractive index of the on-chip lens.
  • the amount of incident light on the photoelectric conversion unit in the pixel near the light shielding area may be adjusted by changing the shape of the color filter.
  • A second aspect of the present disclosure is a method of manufacturing an image sensor, comprising: a photoelectric conversion unit forming step of forming a photoelectric conversion unit that performs photoelectric conversion based on incident light; a first light shielding film forming step of forming a light shielding film in which an opening is arranged so as to block the incident light while transmitting it through the opening; a color filter forming step; a planarization film forming step; and an on-chip lens forming step, these steps forming a plurality of pixels; and a light-shielded pixel forming step comprising the photoelectric conversion unit forming step, a second light shielding film forming step of forming a light shielding film in which no opening is disposed, the color filter forming step, the planarization film forming step, and the on-chip lens forming step. In the light-shielded pixel forming step, light-shielded pixels are formed around the pixel region in which the plurality of pixels are arranged, and the manufacturing process of the pixels further comprises a step of adjusting the amount of light incident on the photoelectric conversion unit in the pixels in the vicinity of the light shielding region where the light-shielded pixels are formed.
  • By adopting such aspects, the amount of light incident on the photoelectric conversion unit is adjusted in the pixels arranged in the vicinity of the boundary between the light shielding region and the pixel region. The incident light amount is adjusted in accordance with the change in sensitivity of the pixels near the light shielding region that is caused by the change in the film thickness of the planarization film and the like.
  • FIG. 3 is a top view illustrating a configuration example of a pixel according to an embodiment of the present disclosure.
  • FIG. 5 is a diagram illustrating an example of an image signal according to the embodiment of the present disclosure.
  • FIG. 7 is a top view illustrating a configuration example of a pixel according to the first embodiment of the present disclosure.
  • FIG. 6 is a top view illustrating a configuration example of a pixel according to a second embodiment of the present disclosure.
  • FIG. 14 is a top view illustrating a configuration example of a pixel according to a third embodiment of the present disclosure.
  • FIG. 1 is a diagram illustrating a configuration example of an imaging element according to an embodiment of the present disclosure.
  • the image pickup device 1 of FIG. 1 includes a pixel array unit 10, a vertical drive unit 20, a column signal processing unit 30, and a control unit 40.
  • the pixel array unit 10 is configured by arranging the pixels 100 in a two-dimensional grid.
  • the pixel 100 generates an image signal corresponding to the irradiated light.
  • the pixel 100 includes a photoelectric conversion unit that generates charges according to the irradiated light.
  • the pixel 100 further includes a pixel circuit. This pixel circuit generates an image signal based on the charges generated by the photoelectric conversion unit. The generation of the image signal is controlled by a control signal generated by the vertical drive unit 20 described later.
  • signal lines 21 and 31 are arranged in an XY matrix.
  • the signal line 21 is a signal line that transmits a control signal for the pixel circuit in the pixel 100, and is arranged for each row of the pixel array unit 10 and wired in common to the pixels 100 arranged in each row.
  • the signal line 31 is a signal line that transmits an image signal generated by the pixel circuit of the pixel 100, and is arranged for each column of the pixel array unit 10, and is wired in common to the pixels 100 arranged in each column.
  • the vertical drive unit 20 generates a control signal for the pixel circuit of the pixel 100.
  • the vertical drive unit 20 transmits the generated control signal to the pixel 100 via the signal line 21 shown in FIG.
  • the column signal processing unit 30 processes the image signal generated by the pixel 100.
  • the column signal processing unit 30 processes the image signal transmitted from the pixel 100 via the signal line 31 shown in FIG.
  • the processing in the column signal processing unit 30 corresponds to, for example, analog-digital conversion that converts an analog image signal generated in the pixel 100 into a digital image signal.
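  • The analog-to-digital conversion performed by the column signal processing unit 30 can be sketched as uniform quantization. This is an illustrative model only; the function name, reference voltage, and 10-bit depth are assumptions rather than details from this patent.

```python
def quantize(voltage, v_ref, bits=10):
    """Uniformly quantize an analog voltage in [0, v_ref) to a digital code."""
    code = int(voltage / v_ref * (1 << bits))
    # Clamp to the valid code range of a `bits`-bit converter.
    return max(0, min(code, (1 << bits) - 1))
```

With a 1.0 V reference and 10 bits, 0.0 V maps to code 0, mid-scale 0.5 V to code 512, and full scale clamps at code 1023.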
  • the image signal processed by the column signal processing unit 30 is output as an image signal of the image sensor 1.
  • the control unit 40 controls the entire image sensor 1.
  • the control unit 40 controls the image sensor 1 by generating and outputting a control signal for controlling the vertical driving unit 20 and the column signal processing unit 30.
  • the control signal generated by the control unit 40 is transmitted to the vertical drive unit 20 and the column signal processing unit 30 through signal lines 41 and 42, respectively.
  • FIG. 2 is a diagram illustrating a configuration example of the pixel array unit 10 according to the embodiment of the present disclosure.
  • the pixel array unit 10 shown in FIG. 1 includes a pixel area 110 and a light shielding area 120.
  • the pixel area 110 is an area where the pixel 100 described in FIG. 1 is arranged.
  • Pixels 100a and 100b in the figure represent the pixels 100 arranged at the center and the periphery of the pixel region 110, respectively.
  • the light shielding area 120 is an area where the light shielding pixels 200 are arranged.
  • the light shielding pixel 200 is a pixel in which light from a subject is shielded, and is a pixel used for detecting a black level of an image signal.
  • a plurality of light-shielding pixels 200 are arranged around the pixel region 110 to form the light-shielding region 120.
  • FIG. 3 is a cross-sectional view illustrating a configuration example of a pixel according to an embodiment of the present disclosure.
  • This figure is a cross-sectional view illustrating a configuration example of the pixels 100a and 100b and the light-shielding pixel 200, taken along the line A-A' in FIG. 2.
  • this figure is a diagram illustrating a basic configuration of a pixel of the present disclosure.
  • The pixel 100 and the light-shielding pixel 200 in the figure each comprise a semiconductor substrate 121, a wiring region composed of a wiring layer 124 and an insulating layer 123, a support substrate 125, an insulating film 126, a light shielding film 131, a color filter 141, a planarization film 161, and an on-chip lens 151.
  • the semiconductor substrate 121 is a semiconductor substrate on which the photoelectric conversion part of the pixel 100 and the semiconductor part of the pixel circuit are formed.
  • the photoelectric conversion unit 101 is described as an example.
  • The photoelectric conversion unit 101 is formed in a p-type well region formed in the semiconductor substrate 121; in the figure, the semiconductor substrate 121 constitutes this well region.
  • An n-type semiconductor region 122 is formed in the p-type well region, and a photodiode serving as the photoelectric conversion unit 101 is constituted by the pn junction formed between the p-type well region and the n-type semiconductor region 122.
  • the wiring layer 124 is a wiring that connects elements formed on the semiconductor substrate 121.
  • the wiring layer 124 is also used for transmitting control signals and image signals for the pixels 100.
  • the wiring layer 124 can be composed of, for example, copper (Cu) or tungsten (W).
  • the insulating layer 123 insulates the wiring layer 124.
  • the insulating layer 123 can be made of, for example, silicon oxide (SiO 2 ).
  • The wiring layer 124 and the insulating layer 123 constitute the wiring region. Note that the wiring region in the figure is arranged on the surface of the semiconductor substrate 121.
  • the support substrate 125 is a substrate that is disposed adjacent to the wiring region and supports the semiconductor substrate 121.
  • the support substrate 125 is a substrate that improves the strength when the image pickup device 1 is manufactured.
  • The insulating film 126 is a film that is disposed on the back surface of the semiconductor substrate 121 and insulates the semiconductor substrate 121. This insulating film 126 can be made of, for example, SiO2.
  • the color filter 141 is an optical filter that transmits light of a predetermined wavelength among incident light.
  • For example, color filters 141 that transmit red light, green light, and blue light can be used; in each pixel 100, one of these three types of color filters 141 is disposed.
  • the color filter 141 is configured to have a different thickness depending on the wavelength of light to be transmitted.
  • the color filter 141 corresponding to green light is configured to be thicker than the color filter 141 corresponding to red light and blue light. This is due to the characteristics of the color filter 141 and restrictions on the manufacturing process.
  • the on-chip lens 151 is a lens that condenses incident light on the photoelectric conversion unit 101.
  • the on-chip lens 151 has a hemispherical shape and collects incident light through the color filter 141.
  • the on-chip lens 151 can be made of, for example, an acrylic resin.
  • The image pickup device 1 shown in FIG. 1 is a back-illuminated image pickup device in which the color filter 141 and the on-chip lens 151 are disposed on the back surface of the semiconductor substrate 121 and which images incident light irradiated from that back surface.
  • the on-chip lens 151 is shifted in the direction of the center of the pixel region 110 with respect to the center of the photoelectric conversion unit 101 by pupil correction.
  • the planarizing film 161 is a film that planarizes the surface of the color filter 141. As described above, the surface of the color filter 141 has a different film thickness for each corresponding color. For this reason, the planarization film 161 is disposed, and the surface on which the on-chip lens 151 is formed is planarized.
  • the planarization film 161 can be made of the same material as the on-chip lens 151, for example. Specifically, when the on-chip lens 151 is formed, the surface of the color filter 141 can be flattened by thickly coating the surface of the color filter 141 with the material of the on-chip lens 151.
  • the light shielding film 131 is a film that shields incident light.
  • the light shielding film 131 is configured in different shapes in the pixel 100 and the light shielding pixel 200.
  • a light shielding film 131 having an opening 132 is disposed in the pixel 100.
  • Incident light transmitted through the on-chip lens 151 and the color filter 141 is irradiated to the photoelectric conversion unit 101 through the opening 132.
  • the light shielding film 131 disposed in the pixel 100 shields light incident obliquely from the adjacent pixel 100. Specifically, the light transmitted through the color filter 141 of the adjacent pixel 100 is prevented from entering the photoelectric conversion unit 101 of the pixel 100 itself. Thereby, the occurrence of crosstalk can be prevented.
  • the light shielding film 131 can be made of metal, for example.
  • In the light-shielding pixel 200, a light shielding film 131 having no opening is disposed. For this reason, in the light-shielding pixel 200, all light from the subject is shielded.
  • the image signal generated by such a light shielding pixel 200 is a signal corresponding to the black level of the image signal generated by the pixel 100.
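  • The black-level detection role of the light-shielding pixels 200 can be sketched as follows: the mean value of the shielded pixels estimates the black level, which is then subtracted from every pixel of the frame. The function name and data layout are illustrative assumptions, not details from this patent.

```python
def black_level_correct(frame, ob_columns):
    """Subtract the black level estimated from light-shielded (optical-black) pixels.

    frame: 2-D list of raw pixel values.
    ob_columns: column indices belonging to the light-shielded region.
    """
    ob = [row[c] for row in frame for c in ob_columns]
    black = sum(ob) / len(ob)  # mean optical-black value = black level estimate
    # Subtract and clip at zero so corrected values stay non-negative.
    return [[max(0.0, v - black) for v in row] for row in frame]
```

For example, if the two leftmost columns are shielded and read 64, an active pixel reading 100 is corrected to 36.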
  • the light shielding film 131 and the color filter 141 can be formed simultaneously in the pixel region and the light shielding region.
  • In the pixel 100, the color filter 141 is disposed adjacent to the insulating film 126 in the opening 132; that is, the color filter 141 is embedded in the opening 132 of the light shielding film 131.
  • In the light-shielding pixel 200, on the other hand, the color filter 141 is laminated on the surface of the light shielding film 131. For this reason, the height from the semiconductor substrate 121 to the surface of the color filter 141 is greater in the light-shielding pixel 200 than in the pixel 100, and a step is generated in the surface height of the color filter 141 between the pixel region 110 and the light shielding region 120.
  • When the planarization film 161 is formed on such a pixel array unit 10, the unevenness of the surface of the color filter 141 between individual pixels 100 is planarized. On the other hand, the step between the pixel region 110 and the light shielding region 120 is not planarized, and the planarization film 161 is formed at different heights in the pixel region 110 and the light shielding region 120. Further, as shown in the figure, in the pixels 100b arranged in the vicinity of the light shielding region 120, the height of the planarization film 161 gradually decreases from the end of the pixel region 110 toward its center.
  • In the pixel 100b, therefore, the distance between the on-chip lens 151 and the photoelectric conversion unit 101 differs from that of the pixel 100a, and the condensing position of incident light by the on-chip lens 151 also differs from that of the pixel 100a.
  • FIG. 4 is a top view illustrating a configuration example of the pixel according to the embodiment of the present disclosure.
  • This figure is a top view showing a configuration example of the on-chip lens 151 and the light shielding film 131 described in FIG.
  • In the figure, a dotted rectangle represents the pixel 100, a solid-line rectangle represents the opening 132 of the light shielding film 131, and a broken-line circle represents the on-chip lens 151.
  • “R”, “G”, and “B” in the figure represent the types of color filters 141 arranged in the pixels 100.
  • the pixel 100 in which “R”, “G”, and “B” are described represents the pixel 100 in which the color filters 141 corresponding to red light, green light, and blue light are arranged, respectively.
  • the pixels 100 on which the color filters 141 corresponding to red light, green light, and blue light are arranged are referred to as a red pixel 100, a green pixel 100, and a blue pixel 100, respectively.
  • An example in which the red pixels 100, green pixels 100, and blue pixels 100 are configured in a Bayer array is shown in the figure.
  • The Bayer array is a pixel arrangement in which the green pixels 100 are arranged in a checkered pattern and the red pixels 100 and blue pixels 100 are arranged between the green pixels 100.
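  • The Bayer arrangement described above can be expressed as a small rule mapping pixel coordinates to filter colors. The RGGB phase chosen here is one common convention and an assumption, not something specified by this patent.

```python
def bayer_color(row, col):
    """Return the color filter ('R', 'G', or 'B') at a pixel in a Bayer array.

    Green pixels form a checkerboard; red and blue fill the remaining sites.
    This sketch uses an RGGB phase (R at the origin).
    """
    if (row + col) % 2 == 1:
        return "G"  # checkerboard of green pixels
    return "R" if row % 2 == 0 else "B"
```

In any 2x2 block this yields two green pixels, one red pixel, and one blue pixel, matching the checkered-green description above.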
  • In the figure, a represents the configuration of the pixel 100 (pixel 100a) at the center of the pixel region 110, and b represents the configuration of the pixel 100 (pixel 100b) at the peripheral portion of the pixel region 110.
  • the on-chip lens 151 is disposed at the center of the pixel 100a.
  • In the pixel 100b, pupil correction is performed, and the on-chip lens 151 is shifted toward the center of the pixel region 110 (to the left side in the figure).
  • FIG. 5 is a diagram illustrating an example of an image signal according to the embodiment of the present disclosure. This figure shows the relationship between the position of the pixel 100 in the pixel array unit 10 and the image signal.
  • a, b, and c represent image signals of the green pixel 100, the red pixel 100, and the blue pixel 100, respectively.
  • In the figure, the vertical axis represents the level of the image signal, the horizontal axis represents the position of the pixel 100 along the line A-A', and a solid line represents the graph of the image signal.
  • The image signal of the pixels 100 arranged in the center of the pixel array unit 10 has a relatively high level, whereas the image signal of the pixels 100 arranged in the peripheral portion of the pixel array unit 10 has a low level.
  • In the pixels 100 arranged in the central portion of the pixel array unit 10, light from the subject enters perpendicularly, whereas in the pixels 100 arranged in the peripheral portion it enters obliquely. Since the amount of light per unit area of the opening 132 described in FIG. 3 is thereby reduced, the incident light of the pixel 100 decreases and the level of the image signal is lowered in the pixels 100 arranged in the peripheral portion.
  • The dotted line in the figure is a graph of the image signal reflecting this decrease in the amount of incident light. The phenomenon in which the level of the image signal decreases in the pixels 100 arranged at the peripheral edge of the pixel array unit 10 in this way is called shading.
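  • The shading falloff indicated by the dotted line is often approximated by the cos⁴ law, under which relative illuminance drops with the incidence angle of light at each image height. The model below is an illustrative approximation, not the patent's own characterization:

```python
import math

def relative_illuminance(x, focal_len):
    """cos^4 falloff: relative light level at image height x.

    x and focal_len share the same length unit; x = 0 is the array center.
    """
    theta = math.atan(x / focal_len)  # incidence angle at image height x
    return math.cos(theta) ** 4
```

At the center the relative illuminance is 1.0; at an image height equal to the focal length (a 45-degree ray) it falls to 0.25, illustrating why peripheral pixels produce lower signal levels.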
  • In addition, the level of the image signal changes in the pixels 100 in the vicinity of the light shielding region. This is because, as described above, in the pixels 100 in the vicinity of the light shielding region, the optical path length changes as the film thickness of the planarization film 161 increases. Further, the characteristics of the image signal differ among the green pixels 100, red pixels 100, and blue pixels 100. Specifically, a relatively high-level image signal is generated in the green pixel 100 of a in the figure; an even higher-level image signal is generated in the red pixel 100 of b in the figure (region 302); on the other hand, a relatively low-level image signal is generated in the blue pixel 100 of c in the figure.
  • the refractive index when passing through the on-chip lens 151 differs depending on the wavelength of light.
  • blue light having a short wavelength has a high refractive index, and is thus focused on the photoelectric conversion unit 101 in the vicinity of the surface of the semiconductor substrate 121. For this reason, the blue pixel 100 is easily affected by the film thickness of the planarization film 161.
  • FIG. 6 is a cross-sectional view illustrating a configuration example of the imaging element according to the first embodiment of the present disclosure.
  • This figure is a cross-sectional view illustrating a configuration example of the pixel 100a and the pixel 100b.
  • The pixel 100a is described as a comparative example and is the same as the pixel 100a described in FIG. 3.
  • The pixel 100b in the figure differs from the pixel 100a in the shape of the light shielding film 131: the size of its opening 132 is different from that of the opening 132 of the pixel 100a.
  • the openings 132b and 132c in the same figure correspond to the openings of the light shielding film 131 disposed in the green pixel 100 and the blue pixel 100, respectively.
  • FIG. 7 is a top view illustrating a configuration example of the pixel according to the first embodiment of the present disclosure.
  • This figure shows the configuration of the on-chip lens 151 and the light shielding film 131 of the pixel 100b, similarly to b of FIG. 4.
  • Openings 132a, 132b, and 132c in the figure represent the openings of the red pixel 100, the green pixel 100, and the blue pixel 100, respectively.
  • These openings 132a, 132b, and 132c are configured to be larger than the opening 132 described above. Enlarging the opening increases the amount of incident light.
  • The openings 132 are made large in the order of the blue pixel 100, the green pixel 100, and the red pixel 100. In this way, the increase in incident light amount is kept smaller for the green pixels 100 and red pixels 100, which have relatively high sensitivity, and made larger for the blue pixels 100, which have relatively low sensitivity.
  • the incident light quantity of the red pixel 100, the green pixel 100, and the blue pixel 100 can be adjusted, and the level of an image signal can be corrected.
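  • The correction of the image-signal level by resizing the openings can be sketched as a proportional adjustment: assuming the signal is proportional to the opening area, each color's opening is scaled by the ratio of a target level to its measured level. The function name and the per-color levels below are hypothetical values for illustration only.

```python
def opening_scale(level, target_level):
    """Area scale factor for a pixel's light-shielding-film opening so that a
    pixel measuring `level` would instead produce `target_level`, assuming the
    signal is proportional to the opening area."""
    return target_level / level

# Hypothetical relative signal levels near the light-shielding region:
levels = {"R": 1.10, "G": 1.05, "B": 0.85}
scales = {color: opening_scale(v, 1.0) for color, v in levels.items()}
```

Under these assumed levels the blue opening is enlarged (scale above 1) while the red and green openings are enlarged less or reduced, which is the direction of adjustment described above.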
  • FIGS. 8 and 9 are diagrams illustrating an example of a method for manufacturing the imaging element according to the first embodiment of the present disclosure, showing the manufacturing processes of the pixels 100a and 100b and the light-shielding pixel 200 described in FIG. 6.
  • the n-type semiconductor region 122 is formed in the p-type well region formed in the semiconductor substrate 121, and the photoelectric conversion part 101 is formed (a in FIG. 8).
  • This step is an example of the photoelectric conversion part forming step described in the claims.
  • a wiring region (not shown) is formed on the semiconductor substrate 121, and a support substrate 125 (not shown) is bonded.
  • an insulating film 126 is formed on the back surface of the semiconductor substrate 121. This can be performed, for example, by depositing the material of the insulating film 126 such as SiO 2 using CVD (Chemical Vapor Deposition) or the like (b in FIG. 8).
  • a metal film 401 and a resist 402 as materials for the light shielding film 131 are sequentially stacked on the surface of the insulating film 126.
  • openings 403, 403a (not shown), 403b, and 403c are formed in the resist 402.
  • the openings 403 and the like are formed in a size and a position corresponding to the openings 132, 132a, 132b, and 132c described in FIGS. 6 and 7 (c in FIG. 8).
  • the metal film 401 is etched using the resist 402 as a mask. This can be performed, for example, by dry etching.
  • the light shielding film 131 including the openings 132, 132a (not shown), 132b, and 132c can be formed (d in FIG. 8). Thereby, the pixels 100a and 100b and the light shielding film 131 of the light shielding pixel 200 can be formed simultaneously. Further, by changing the opening of the light shielding film 131 of the pixel 100b to 132a, 132b, and 132c, the amount of light incident on the photoelectric conversion unit 101 of the pixel 100b can be adjusted.
  • This step is an example of the first light shielding film forming step and the second light shielding film forming step described in the claims. Moreover, the said process is an example of the process of adjusting the incident light quantity as described in a claim.
  • Next, the color filter 141 is formed on the surfaces of the insulating film 126 and the light shielding film 131. This is performed for each type of color filter 141. For example, a resin serving as the material of the color filter 141 corresponding to green is applied, openings are formed in the regions where the color filters 141 of the red pixels 100 and blue pixels 100 are to be disposed, and the resin is cured. These openings are then filled with the resins serving as the materials of the color filters 141 corresponding to red and blue (e in FIG. 9). This step is an example of the color filter forming step described in the claims.
  • Next, a resin 404 serving as the material of the on-chip lens is applied to the surface of the color filter 141, whereby the surface of the color filter 141 is planarized. The vicinity of the surface of the resin 404 becomes the on-chip lens 151, and the resin 404 in the region adjacent to the color filter 141 becomes the planarization film 161 (f in FIG. 9). This step is an example of the planarization film forming step described in the claims.
  • the surface of the resin 404 is processed into a hemispherical shape to form an on-chip lens 151.
  • This can be performed, for example, by placing a resist having the same shape as the on-chip lens 151 on the surface of the resin 404 and performing dry etching to transfer the shape of the resist to the resin 404 (in FIG. 9).
  • This step is an example of an on-chip lens forming step described in the claims.
  • the pixels 100a and 100b in the pixel region 110 can be formed, and the light-shielding pixel 200 in the light-shielding region can be formed.
  • As described above, the imaging device 1 according to the first embodiment adjusts the shape of the light shielding film 131 of each pixel 100 arranged in the vicinity of the light shielding region according to the type of the color filter 141 of that pixel 100. Thereby, the amount of light incident on the pixels 100 arranged in the vicinity of the light shielding region can be adjusted, and the change in sensitivity can be corrected.
  • Second Embodiment: In the image pickup device 1 of the first embodiment described above, the shape of the opening 132 of the light shielding film 131 of the pixel 100b is changed with respect to the pixel 100a. In contrast, the imaging device 1 according to the second embodiment of the present disclosure differs from the first embodiment in that the shape of the on-chip lens 151 is changed instead.
  • FIG. 10 is a cross-sectional view illustrating a configuration example of an imaging element according to the second embodiment of the present disclosure.
  • the pixel 100b in the figure is different from the pixel 100b described in FIG. 6 in that on-chip lenses 152b and 152c are provided instead of the on-chip lens 151.
  • the on-chip lenses 152b and 152c are on-chip lenses configured in a shape different from the on-chip lens 151. As will be described later, the on-chip lenses 152b and 152c are configured to have a smaller curvature than the on-chip lens 151, and correspond to the on-chip lenses disposed in the green pixel 100 and the blue pixel 100, respectively.
  • FIG. 11 is a top view illustrating a configuration example of a pixel according to the second embodiment of the present disclosure.
  • the on-chip lenses 152a, 152b, and 152c in the figure represent the on-chip lenses of the red pixel 100, the green pixel 100, and the blue pixel 100, respectively.
  • The on-chip lenses 152a, 152b, and 152c are configured with a smaller curvature than the on-chip lens 151. Reducing the curvature lengthens the condensing distance, which can be adjusted to match the increased thickness of the planarizing film 161. Further, as shown in the figure, the on-chip lenses of the red pixel 100, the green pixel 100, and the blue pixel 100 are formed larger in this order and with smaller curvature in this order.
  • Thereby, the condensing position for blue light in the blue pixel 100 can be set in the photoelectric conversion unit 101 near the surface of the semiconductor substrate 121.
  • In contrast, the condensing positions in the red pixel 100 and the green pixel 100 can be set near the end of the photoelectric conversion unit 101 by configuring their lenses with a relatively large curvature. Thereby, the sensitivities of the red pixel 100, the green pixel 100, and the blue pixel 100 can be adjusted, and the level of the image signal can be corrected.
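The relationship between lens curvature and condensing distance described above can be illustrated with the thin-lens approximation for a plano-convex lens, f ≈ R / (n − 1): a smaller curvature (larger radius R) gives a longer focal length. This is a minimal sketch, not from the patent; the radius values and refractive index are hypothetical.

```python
# Illustrative sketch (assumptions, not the patent's design): thin-lens focal
# length of a plano-convex on-chip lens. A smaller curvature (larger radius of
# curvature R) lengthens the condensing distance.

def focal_length(radius_um: float, n: float = 1.6) -> float:
    """Thin-lens focal length f ~ R / (n - 1), in micrometres."""
    return radius_um / (n - 1.0)

for radius in (1.0, 1.5, 2.0):  # smaller curvature = larger radius
    print(f"R = {radius:.1f} um -> f = {focal_length(radius):.2f} um")
```

As the sketch shows, doubling the radius of curvature roughly doubles the condensing distance, which is why the lenses with the smallest curvature focus deepest.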
  • the imaging device 1 adjusts the amount of light incident on the pixels 100 arranged in the vicinity of the light-shielding region by changing the shape of the on-chip lens. Thereby, a change in sensitivity of the pixel 100 arranged in the vicinity of the light shielding region can be corrected.
  • <Third Embodiment> In the imaging element 1 of the second embodiment described above, the shape of the on-chip lens of the pixel 100b is changed relative to that of the pixel 100a.
  • the imaging device 1 according to the third embodiment of the present disclosure is different from the above-described second embodiment in that the refractive index of the on-chip lens is changed.
  • FIG. 12 is a cross-sectional view illustrating a configuration example of an imaging element according to the third embodiment of the present disclosure.
  • The pixel 100b in the figure differs from the pixel 100b described in FIG. 10 in that on-chip lenses 153b and 153c are provided instead of the on-chip lenses 152b and 152c.
  • The on-chip lenses 153b and 153c are configured with a refractive index different from that of the on-chip lens 151. As will be described later, the on-chip lenses 153b and 153c are configured with a refractive index smaller than that of the on-chip lens 151, and correspond to the on-chip lenses arranged in the green pixel 100 and the blue pixel 100, respectively.
  • FIG. 13 is a top view illustrating a configuration example of a pixel according to the third embodiment of the present disclosure.
  • On-chip lenses 153a, 153b, and 153c in the figure represent on-chip lenses of the red pixel 100, the green pixel 100, and the blue pixel 100, respectively.
  • the on-chip lenses 153a, 153b, and 153c are configured to have a smaller refractive index than the on-chip lens 151.
  • the on-chip lenses 153a, 153b, and 153c are configured to have a small refractive index in this order.
  • By reducing the refractive index, the condensing distance can be lengthened, and the influence of the shift in condensing position caused by the increased thickness of the planarizing film 161 can be reduced.
  • By making the refractive index of the on-chip lens 153c of the blue pixel 100 smaller than those of the red pixel 100 and the green pixel 100, the condensing position for blue light in the blue pixel 100 can be shifted to the photoelectric conversion unit 101 near the surface of the semiconductor substrate 121. Thereby, the amounts of incident light on the red pixel 100, the green pixel 100, and the blue pixel 100 can be adjusted, and the level of the image signal can be corrected.
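The effect of lowering the refractive index can likewise be illustrated with the thin-lens relation f ≈ R / (n − 1): at a fixed curvature, a smaller n lengthens the condensing distance. This is a minimal sketch, not from the patent; the index and radius values are hypothetical.

```python
# Illustrative sketch (assumptions, not the patent's design): at a fixed radius
# of curvature, lowering the refractive index n lengthens the focal length of a
# plano-convex on-chip lens.

def focal_length(radius_um: float, n: float) -> float:
    """Thin-lens focal length f ~ R / (n - 1), in micrometres."""
    return radius_um / (n - 1.0)

R = 1.5  # fixed radius of curvature (um), hypothetical
for n in (1.7, 1.6, 1.5):  # decreasing refractive index
    print(f"n = {n:.2f} -> f = {focal_length(R, n):.2f} um")
```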
  • the on-chip lenses 153a, 153b, and 153c can be made of a material different from that of the on-chip lens 151.
  • the on-chip lenses 153a, 153b, and 153c can also be configured as on-chip lenses having different refractive indexes by changing materials.
  • the on-chip lenses 153a, 153b, and 153c can be formed, for example, by performing the on-chip lens forming process described in FIG. 9g after forming the planarizing film 161, respectively.
  • As described above, the imaging element 1 of the third embodiment adjusts the amount of light incident on the pixels 100 arranged in the vicinity of the light-shielding region by changing the refractive index of the on-chip lens. Thereby, the change in sensitivity of the pixels 100 arranged in the vicinity of the light shielding region can be corrected.
  • <Fourth Embodiment> In the imaging element 1 of the first embodiment described above, the shape of the opening 132 of the light shielding film 131 of the pixel 100b is changed relative to that of the pixel 100a.
  • the imaging device 1 according to the fourth embodiment of the present disclosure is different from the above-described first embodiment in that the shape of the color filter 141 is changed.
  • FIG. 14 is a cross-sectional view illustrating a configuration example of an imaging element according to the fourth embodiment of the present disclosure.
  • the pixel 100b in the figure is different from the pixel 100b described in FIG. 6 in that color filters 142b and 142c are provided instead of the color filter 141.
  • The color filters 142b and 142c are configured in a shape different from that of the color filter 141. Specifically, the color filters 142b and 142c are formed with a smaller film thickness than the color filter 141, and correspond to the color filters arranged in the green pixel 100 and the blue pixel 100, respectively. Although not shown, a color filter 142a having a smaller thickness than the color filter 141 is also arranged in the red pixel 100. The color filters 142a, 142b, and 142c are configured with progressively thinner film thicknesses in this order.
  • By thinning the color filters, the amount of light incident on the pixel 100b can be increased, and the influence of the change in incident light caused by the increased thickness of the planarizing film 161 can be reduced.
  • By making the film thickness of the color filter 142c of the blue pixel 100 thinner than those of the red pixel 100 and the green pixel 100, the amount of light incident on the blue pixel 100 can be increased. Thereby, the amounts of incident light on the red pixel 100, the green pixel 100, and the blue pixel 100 can be adjusted, and the level of the image signal can be corrected.
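The link between filter thickness and incident light quantity follows the Beer-Lambert law: a filter of thickness d and absorption coefficient α transmits T = exp(−αd), so a thinner filter passes more light. This is a minimal sketch, not from the patent; the absorption coefficient and thicknesses are hypothetical.

```python
import math

# Illustrative sketch (assumptions, not the patent's design): Beer-Lambert
# transmittance T = exp(-alpha * d). Thinning the color filter (smaller d)
# raises the light reaching the photoelectric conversion unit.

def transmittance(thickness_um: float, alpha_per_um: float = 1.0) -> float:
    """Fraction of light transmitted through an absorbing filter."""
    return math.exp(-alpha_per_um * thickness_um)

for d in (0.8, 0.6, 0.4):  # progressively thinner filters (um)
    print(f"d = {d:.1f} um -> T = {transmittance(d):.3f}")
```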
  • In addition, the color filters 142a, 142b, and 142c have rounded corners. This makes it possible to prevent so-called vignetting, in which light headed for the adjacent pixel 100b is blocked.
  • the imaging device 1 adjusts the amount of incident light on the pixels 100 arranged in the vicinity of the light shielding region by changing the shape of the color filter 142. Thereby, a change in sensitivity of the pixel 100 arranged in the vicinity of the light shielding region can be corrected.
  • the technology according to the present disclosure can be applied to various products.
  • the present technology may be realized as an imaging element mounted on an imaging device such as a camera.
  • FIG. 15 is a block diagram illustrating a schematic configuration example of a camera that is an example of an imaging apparatus to which the present technology can be applied.
  • The camera 1000 shown in FIG. 15 includes a lens 1001, an imaging element 1002, an imaging control unit 1003, a lens driving unit 1004, an image processing unit 1005, an operation input unit 1006, a frame memory 1007, a display unit 1008, and a recording unit 1009.
  • the lens 1001 is a photographing lens of the camera 1000.
  • the lens 1001 collects light from the subject and makes it incident on an image sensor 1002 described later to form an image of the subject.
  • the imaging element 1002 is a semiconductor element that images light from the subject condensed by the lens 1001.
  • the image sensor 1002 generates an analog image signal corresponding to the irradiated light, converts it into a digital image signal, and outputs it.
  • the imaging control unit 1003 controls imaging in the imaging element 1002.
  • the imaging control unit 1003 controls the imaging element 1002 by generating a control signal and outputting the control signal to the imaging element 1002.
  • the imaging control unit 1003 can perform autofocus in the camera 1000 based on the image signal output from the imaging element 1002.
  • Autofocus is a system that detects the focal position of the lens 1001 and automatically adjusts it.
  • As the autofocus method, image-plane phase-difference autofocus, in which a phase difference detected by phase-difference pixels arranged in the imaging element 1002 is used to detect the focal position, can be used.
  • Alternatively, contrast autofocus, in which the lens position where the contrast of the image is highest is detected as the focal position, can be applied.
  • the imaging control unit 1003 adjusts the position of the lens 1001 via the lens driving unit 1004 based on the detected focal position, and performs autofocus.
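The contrast autofocus method mentioned above can be sketched as a search over lens positions for the frame with the highest sharpness score. This is a minimal illustration, not the camera 1000's actual algorithm; the frames and the gradient-based metric are hypothetical stand-ins for real sensor data.

```python
# Illustrative sketch (assumptions, not the patent's implementation) of
# contrast autofocus: sweep the lens position, score the frame captured at
# each position, and pick the position with the highest contrast.

def sharpness(frame):
    """Sum of squared horizontal differences: higher = more contrast."""
    return sum((row[i + 1] - row[i]) ** 2
               for row in frame for i in range(len(row) - 1))

def contrast_autofocus(frames_by_position):
    """Return the lens position whose frame has the highest contrast."""
    return max(frames_by_position,
               key=lambda pos: sharpness(frames_by_position[pos]))

# Hypothetical frames at three lens positions; position 1 is in focus.
frames = {
    0: [[10, 10, 10, 10]],   # defocused: flat
    1: [[0, 50, 0, 50]],     # in focus: strong edges
    2: [[10, 20, 10, 20]],   # slightly defocused
}
print(contrast_autofocus(frames))  # best-focus lens position
```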
  • the imaging control unit 1003 can be configured by, for example, a DSP (Digital Signal Processor) equipped with firmware.
  • the lens driving unit 1004 drives the lens 1001 based on the control of the imaging control unit 1003.
  • the lens driving unit 1004 can drive the lens 1001 by changing the position of the lens 1001 using a built-in motor.
  • The image processing unit 1005 processes the image signal generated by the imaging element 1002. This processing includes, for example, demosaicing, which generates the missing color signals among the red, green, and blue image signals for each pixel; noise reduction, which removes noise from the image signal; and encoding of the image signal.
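The demosaicing step above can be sketched for one case: each pixel records only one of R, G, B behind its color filter, and a missing color is interpolated from neighbors. Below, the green value at a red site of a Bayer mosaic is estimated as the average of its four green neighbors. This is a minimal illustration with hypothetical data, not the image processing unit 1005's actual algorithm.

```python
# Illustrative sketch (assumptions, not the patent's pipeline) of bilinear
# demosaicing: estimate green at a non-green Bayer site from its four
# edge-adjacent green neighbours (interior pixels only).

def green_at(mosaic, y, x):
    """Average of the four green neighbours of a red/blue site."""
    neighbours = [mosaic[y - 1][x], mosaic[y + 1][x],
                  mosaic[y][x - 1], mosaic[y][x + 1]]
    return sum(neighbours) / 4.0

# Hypothetical 3x3 Bayer patch: R at the centre, G at the four adjacent sites.
patch = [
    [120,  60, 118],
    [ 64, 200,  62],
    [122,  58, 124],
]
print(green_at(patch, 1, 1))  # average of 60, 64, 62, 58
```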
  • the image processing unit 1005 can be configured by, for example, a microcomputer equipped with firmware.
  • the operation input unit 1006 receives an operation input from the user of the camera 1000.
  • As the operation input unit 1006, for example, a push button or a touch panel can be used.
  • the operation input received by the operation input unit 1006 is transmitted to the imaging control unit 1003 and the image processing unit 1005. Thereafter, processing according to the operation input, for example, processing such as imaging of a subject is started.
  • the frame memory 1007 is a memory for storing frames that are image signals for one screen.
  • the frame memory 1007 is controlled by the image processing unit 1005 and holds a frame in the course of image processing.
  • the display unit 1008 displays the image processed by the image processing unit 1005.
  • a liquid crystal panel can be used for the display unit 1008.
  • the recording unit 1009 records the image processed by the image processing unit 1005.
  • For the recording unit 1009, for example, a memory card or a hard disk can be used.
  • The camera to which the present technology can be applied has been described above.
  • the present technology can be applied to the image sensor 1002 among the configurations described above.
  • the image sensor 1 described in FIG. 1 can be applied to the image sensor 1002.
  • a change in sensitivity of the pixel 100 in the vicinity of the light-shielding region can be corrected, and deterioration in image quality of an image generated by the camera 1000 can be prevented.
  • FIG. 16 is a diagram illustrating an example of a schematic configuration of an endoscopic surgery system to which the technology according to the present disclosure can be applied.
  • FIG. 16 shows a state in which an operator (doctor) 11131 is performing an operation on a patient 11132 on a patient bed 11133 using an endoscopic operation system 11000.
  • An endoscopic surgery system 11000 includes an endoscope 11100, other surgical instruments 11110 such as an insufflation tube 11111 and an energy treatment instrument 11112, a support arm device 11120 that supports the endoscope 11100, and a cart 11200 on which various devices for endoscopic surgery are mounted.
  • the endoscope 11100 includes a lens barrel 11101 in which a region having a predetermined length from the distal end is inserted into the body cavity of the patient 11132, and a camera head 11102 connected to the proximal end of the lens barrel 11101.
  • Here, an endoscope 11100 configured as a so-called rigid scope having a rigid lens barrel 11101 is illustrated, but the endoscope 11100 may instead be configured as a so-called flexible scope having a flexible lens barrel.
  • An opening into which the objective lens is fitted is provided at the tip of the lens barrel 11101.
  • A light source device 11203 is connected to the endoscope 11100, and light generated by the light source device 11203 is guided to the tip of the lens barrel by a light guide extending inside the lens barrel 11101, and is irradiated toward the observation target in the body cavity of the patient 11132 through the objective lens.
  • The endoscope 11100 may be a forward-viewing endoscope, an oblique-viewing endoscope, or a side-viewing endoscope.
  • An optical system and an image sensor are provided inside the camera head 11102, and reflected light (observation light) from the observation target is condensed on the image sensor by the optical system. Observation light is photoelectrically converted by the imaging element, and an electrical signal corresponding to the observation light, that is, an image signal corresponding to the observation image is generated.
  • the image signal is transmitted to a camera control unit (CCU: Camera Control Unit) 11201 as RAW data.
  • the CCU 11201 is configured by a CPU (Central Processing Unit), a GPU (Graphics Processing Unit), and the like, and comprehensively controls operations of the endoscope 11100 and the display device 11202. Further, the CCU 11201 receives an image signal from the camera head 11102 and performs various kinds of image processing for displaying an image based on the image signal, such as development processing (demosaic processing), for example.
  • the display device 11202 displays an image based on an image signal subjected to image processing by the CCU 11201 under the control of the CCU 11201.
  • the light source device 11203 is composed of a light source such as an LED (Light Emitting Diode), for example, and supplies irradiation light to the endoscope 11100 when photographing a surgical site or the like.
  • the input device 11204 is an input interface for the endoscopic surgery system 11000.
  • a user can input various information and instructions to the endoscopic surgery system 11000 via the input device 11204.
  • the user inputs an instruction to change the imaging conditions (type of irradiation light, magnification, focal length, etc.) by the endoscope 11100.
  • the treatment instrument control device 11205 controls the drive of the energy treatment instrument 11112 for tissue ablation, incision, blood vessel sealing, or the like.
  • the pneumoperitoneum device 11206 passes gas into the body cavity via the insufflation tube 11111.
  • the recorder 11207 is an apparatus capable of recording various types of information related to surgery.
  • the printer 11208 is a device that can print various types of information related to surgery in various formats such as text, images, or graphs.
  • the light source device 11203 that supplies the irradiation light when the surgical site is imaged to the endoscope 11100 can be configured by, for example, a white light source configured by an LED, a laser light source, or a combination thereof.
  • When a white light source is configured by a combination of RGB laser light sources, the output intensity and output timing of each color (each wavelength) can be controlled with high accuracy, so the white balance of the captured image can be adjusted in the light source device 11203.
  • The driving of the light source device 11203 may be controlled so as to change the intensity of the output light at predetermined intervals. By controlling the driving of the image sensor of the camera head 11102 in synchronization with the timing of the changes in light intensity to acquire images in a time-division manner and then combining those images, an image with a high dynamic range, free of so-called crushed blacks and blown-out highlights, can be generated.
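The time-division high-dynamic-range combination described above can be sketched as merging a short and a long exposure of the same scene: highlights are recovered from the short exposure where the long one saturates. This is a minimal illustration, not the system's actual algorithm; the exposure ratio and pixel values are hypothetical, and real pipelines use more elaborate weighting.

```python
# Illustrative sketch (assumptions, not the patent's implementation) of
# two-frame HDR fusion: prefer the long exposure, fall back to the scaled
# short exposure where the long one is saturated.

FULL_SCALE = 255
RATIO = 4  # long exposure is 4x the short one (assumption)

def fuse(short_px, long_px):
    """Merge one pixel pair onto the long-exposure scale."""
    if long_px < FULL_SCALE:       # long exposure still holds detail
        return long_px
    return short_px * RATIO        # recover highlights from the short frame

short = [10, 40, 70]    # short-exposure pixels
long_ = [40, 160, 255]  # long-exposure pixels (last one blown out)
print([fuse(s, l) for s, l in zip(short, long_)])
```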
  • the light source device 11203 may be configured to be able to supply light of a predetermined wavelength band corresponding to special light observation.
  • In special light observation, for example, so-called narrow band imaging is performed, in which the wavelength dependence of light absorption in body tissue is utilized: by irradiating light in a narrower band than the irradiation light used during normal observation (i.e., white light), predetermined tissue such as blood vessels on the mucosal surface is imaged with high contrast.
  • Alternatively, in special light observation, fluorescence observation may be performed, in which an image is obtained from fluorescence generated by irradiating excitation light.
  • In fluorescence observation, the body tissue can be irradiated with excitation light and the fluorescence from the body tissue observed (autofluorescence observation), or a reagent such as indocyanine green (ICG) can be locally injected into the body tissue and the tissue irradiated with excitation light corresponding to the fluorescence wavelength of the reagent to obtain a fluorescence image.
  • the light source device 11203 can be configured to be able to supply narrowband light and / or excitation light corresponding to such special light observation.
  • FIG. 17 is a block diagram showing an example of the functional configuration of the camera head 11102 and CCU 11201 shown in FIG.
  • the camera head 11102 includes a lens unit 11401, an imaging unit 11402, a drive unit 11403, a communication unit 11404, and a camera head control unit 11405.
  • the CCU 11201 includes a communication unit 11411, an image processing unit 11412, and a control unit 11413.
  • the camera head 11102 and the CCU 11201 are connected to each other by a transmission cable 11400 so that they can communicate with each other.
  • the lens unit 11401 is an optical system provided at a connection portion with the lens barrel 11101. Observation light taken from the tip of the lens barrel 11101 is guided to the camera head 11102 and enters the lens unit 11401.
  • the lens unit 11401 is configured by combining a plurality of lenses including a zoom lens and a focus lens.
  • the imaging unit 11402 includes an imaging element.
  • The imaging unit 11402 may include one imaging element (a so-called single-plate type) or a plurality of imaging elements (a so-called multi-plate type).
  • In the case of the multi-plate type, for example, image signals corresponding to R, G, and B may be generated by the respective imaging elements and combined to obtain a color image.
  • the imaging unit 11402 may be configured to include a pair of imaging elements for acquiring right-eye and left-eye image signals corresponding to 3D (Dimensional) display. By performing the 3D display, the operator 11131 can more accurately grasp the depth of the living tissue in the surgical site.
  • the imaging unit 11402 is not necessarily provided in the camera head 11102.
  • the imaging unit 11402 may be provided inside the lens barrel 11101 immediately after the objective lens.
  • the driving unit 11403 is configured by an actuator, and moves the zoom lens and the focus lens of the lens unit 11401 by a predetermined distance along the optical axis under the control of the camera head control unit 11405. Thereby, the magnification and the focus of the image captured by the imaging unit 11402 can be adjusted as appropriate.
  • the communication unit 11404 is configured by a communication device for transmitting and receiving various types of information to and from the CCU 11201.
  • the communication unit 11404 transmits the image signal obtained from the imaging unit 11402 as RAW data to the CCU 11201 via the transmission cable 11400.
  • the communication unit 11404 receives a control signal for controlling driving of the camera head 11102 from the CCU 11201 and supplies the control signal to the camera head control unit 11405.
  • The control signal includes information about imaging conditions, such as information specifying the frame rate of the captured image, information specifying the exposure value at the time of imaging, and/or information specifying the magnification and focus of the captured image.
  • The imaging conditions such as the frame rate, exposure value, magnification, and focus may be appropriately specified by the user, or may be automatically set by the control unit 11413 of the CCU 11201 based on the acquired image signal. In the latter case, so-called AE (Auto Exposure), AF (Auto Focus), and AWB (Auto White Balance) functions are provided in the endoscope 11100.
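The AE function mentioned above can be sketched as a simple feedback loop: scale the exposure value so that the mean luminance of the image signal approaches a target level. This is a minimal illustration, not the endoscope 11100's actual control law; the target level and pixel values are hypothetical.

```python
# Illustrative sketch (assumptions, not the patent's implementation) of a
# proportional auto-exposure update: scale the current exposure by the ratio
# of the target mean luminance to the measured mean luminance.

TARGET_MEAN = 128.0  # desired mean luminance (assumption)

def next_exposure(current_exposure, pixels):
    """Return the exposure for the next frame."""
    mean = sum(pixels) / len(pixels)
    if mean == 0:
        return current_exposure * 2.0  # scene too dark to measure; open up
    return current_exposure * (TARGET_MEAN / mean)

pixels = [30, 40, 50, 40]  # under-exposed frame, mean = 40
print(next_exposure(1.0, pixels))  # exposure increased toward the target
```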
  • the camera head control unit 11405 controls driving of the camera head 11102 based on a control signal from the CCU 11201 received via the communication unit 11404.
  • the communication unit 11411 is configured by a communication device for transmitting and receiving various types of information to and from the camera head 11102.
  • the communication unit 11411 receives an image signal transmitted from the camera head 11102 via the transmission cable 11400.
  • the communication unit 11411 transmits a control signal for controlling driving of the camera head 11102 to the camera head 11102.
  • the image signal and the control signal can be transmitted by electrical communication, optical communication, or the like.
  • the image processing unit 11412 performs various types of image processing on the image signal that is RAW data transmitted from the camera head 11102.
  • the control unit 11413 performs various types of control related to imaging of the surgical site by the endoscope 11100 and display of a captured image obtained by imaging of the surgical site. For example, the control unit 11413 generates a control signal for controlling driving of the camera head 11102.
  • The control unit 11413 causes the display device 11202 to display a captured image showing the surgical site or the like, based on the image signal that has undergone image processing by the image processing unit 11412.
  • the control unit 11413 may recognize various objects in the captured image using various image recognition techniques.
  • For example, by detecting the shapes, colors, edges, and the like of objects included in the captured image, the control unit 11413 can recognize surgical tools such as forceps, specific body sites, bleeding, mist during use of the energy treatment tool 11112, and so on.
  • The control unit 11413 may use the recognition result to display various types of surgery support information superimposed on the image of the surgical site. By superimposing the surgery support information and presenting it to the operator 11131, the burden on the operator 11131 can be reduced and the operator 11131 can proceed with the surgery reliably.
  • the transmission cable 11400 for connecting the camera head 11102 and the CCU 11201 is an electric signal cable corresponding to electric signal communication, an optical fiber corresponding to optical communication, or a composite cable thereof.
  • communication is performed by wire using the transmission cable 11400.
  • communication between the camera head 11102 and the CCU 11201 may be performed wirelessly.
  • the technology according to the present disclosure can be applied to the imaging unit 11402 among the configurations described above.
  • Specifically, the imaging element 1 in FIG. 1 can be applied to the imaging unit 11402.
  • a high-quality surgical part image can be obtained, so that the surgeon can surely check the surgical part.
  • the technology according to the present disclosure can be applied to various products.
  • The technology according to the present disclosure may be realized as a device mounted on any type of moving body, such as an automobile, an electric vehicle, a hybrid electric vehicle, a motorcycle, a bicycle, a personal mobility device, an airplane, a drone, a ship, or a robot.
  • FIG. 18 is a block diagram illustrating a schematic configuration example of a vehicle control system that is an example of a mobile control system to which the technology according to the present disclosure can be applied.
  • the vehicle control system 12000 includes a plurality of electronic control units connected via a communication network 12001.
  • the vehicle control system 12000 includes a drive system control unit 12010, a body system control unit 12020, a vehicle exterior information detection unit 12030, a vehicle interior information detection unit 12040, and an integrated control unit 12050.
  • As a functional configuration of the integrated control unit 12050, a microcomputer 12051, an audio/image output unit 12052, and an in-vehicle network I/F (interface) 12053 are illustrated.
  • the drive system control unit 12010 controls the operation of the device related to the drive system of the vehicle according to various programs.
  • For example, the drive system control unit 12010 functions as a control device for a driving force generating device for generating the driving force of the vehicle, such as an internal combustion engine or a drive motor; a driving force transmission mechanism for transmitting the driving force to the wheels; a steering mechanism that adjusts the steering angle of the vehicle; and a braking device that generates the braking force of the vehicle.
  • the body system control unit 12020 controls the operation of various devices mounted on the vehicle body according to various programs.
  • the body system control unit 12020 functions as a keyless entry system, a smart key system, a power window device, or a control device for various lamps such as a headlamp, a back lamp, a brake lamp, a blinker, or a fog lamp.
  • Radio waves transmitted from a portable device that substitutes for a key, or signals from various switches, can be input to the body system control unit 12020.
  • the body system control unit 12020 receives input of these radio waves or signals, and controls a door lock device, a power window device, a lamp, and the like of the vehicle.
  • the vehicle outside information detection unit 12030 detects information outside the vehicle on which the vehicle control system 12000 is mounted.
  • the imaging unit 12031 is connected to the vehicle exterior information detection unit 12030.
  • the vehicle exterior information detection unit 12030 causes the imaging unit 12031 to capture an image outside the vehicle and receives the captured image.
  • Based on the received image, the vehicle exterior information detection unit 12030 may perform detection processing for objects such as people, cars, obstacles, signs, or characters on the road surface, or distance detection processing.
  • the imaging unit 12031 is an optical sensor that receives light and outputs an electrical signal corresponding to the amount of received light.
  • the imaging unit 12031 can output an electrical signal as an image, or can output it as distance measurement information. Further, the light received by the imaging unit 12031 may be visible light or invisible light such as infrared rays.
  • the vehicle interior information detection unit 12040 detects vehicle interior information.
  • a driver state detection unit 12041 that detects a driver's state is connected to the in-vehicle information detection unit 12040.
  • The driver state detection unit 12041 includes, for example, a camera that images the driver, and based on the detection information input from the driver state detection unit 12041, the vehicle interior information detection unit 12040 may calculate the degree of fatigue or concentration of the driver, or may determine whether the driver is dozing.
  • The microcomputer 12051 can calculate a control target value for the driving force generating device, the steering mechanism, or the braking device based on the information inside and outside the vehicle acquired by the vehicle exterior information detection unit 12030 or the vehicle interior information detection unit 12040, and output a control command to the drive system control unit 12010.
  • For example, the microcomputer 12051 can perform cooperative control aimed at realizing ADAS (Advanced Driver Assistance System) functions, including vehicle collision avoidance or impact mitigation, following travel based on the inter-vehicle distance, constant-speed travel, vehicle collision warning, and vehicle lane departure warning.
  • The microcomputer 12051 can also perform cooperative control aimed at automated driving, in which the vehicle travels autonomously without depending on the driver's operation, by controlling the driving force generating device, the steering mechanism, the braking device, and the like based on information about the surroundings of the vehicle acquired by the vehicle exterior information detection unit 12030 or the vehicle interior information detection unit 12040.
  • the microcomputer 12051 can output a control command to the body system control unit 12020 based on information outside the vehicle acquired by the vehicle outside information detection unit 12030.
  • For example, the microcomputer 12051 can control the headlamps according to the position of a preceding vehicle or an oncoming vehicle detected by the vehicle exterior information detection unit 12030, and perform cooperative control aimed at anti-glare, such as switching from high beam to low beam.
  • the sound image output unit 12052 transmits an output signal of at least one of sound and image to an output device capable of visually or audibly notifying information to a vehicle occupant or the outside of the vehicle.
  • an audio speaker 12061, a display unit 12062, and an instrument panel 12063 are illustrated as output devices.
  • the display unit 12062 may include at least one of an on-board display and a head-up display, for example.
  • FIG. 19 is a diagram illustrating an example of an installation position of the imaging unit 12031.
  • The vehicle 12100 includes imaging units 12101, 12102, 12103, 12104, and 12105 as the imaging unit 12031.
  • the imaging units 12101, 12102, 12103, 12104, and 12105 are provided, for example, at positions such as a front nose, a side mirror, a rear bumper, a back door, and an upper portion of a windshield in the vehicle interior of the vehicle 12100.
  • the imaging unit 12101 provided in the front nose and the imaging unit 12105 provided in the upper part of the windshield in the vehicle interior mainly acquire an image in front of the vehicle 12100.
  • the imaging units 12102 and 12103 provided in the side mirror mainly acquire an image of the side of the vehicle 12100.
  • the imaging unit 12104 provided in the rear bumper or the back door mainly acquires an image behind the vehicle 12100.
  • the forward images acquired by the imaging units 12101 and 12105 are mainly used for detecting a preceding vehicle or a pedestrian, an obstacle, a traffic light, a traffic sign, a lane, or the like.
  • FIG. 19 shows an example of the shooting range of the imaging units 12101 to 12104.
  • the imaging range 12111 indicates the imaging range of the imaging unit 12101 provided in the front nose, the imaging ranges 12112 and 12113 indicate the imaging ranges of the imaging units 12102 and 12103 provided in the side mirrors, respectively, and the imaging range 12114 indicates the imaging range of the imaging unit 12104 provided in the rear bumper or the back door. For example, by superimposing the image data captured by the imaging units 12101 to 12104, an overhead image of the vehicle 12100 as viewed from above is obtained.
  • At least one of the imaging units 12101 to 12104 may have a function of acquiring distance information.
  • at least one of the imaging units 12101 to 12104 may be a stereo camera including a plurality of imaging elements, or may be an imaging element having pixels for phase difference detection.
  • the microcomputer 12051 obtains, based on the distance information from the imaging units 12101 to 12104, the distance to each three-dimensional object within the imaging ranges 12111 to 12114 and the temporal change in this distance (relative speed with respect to the vehicle 12100). In particular, it can extract, as a preceding vehicle, the nearest three-dimensional object on the traveling path of the vehicle 12100 that is traveling in substantially the same direction at a predetermined speed (for example, 0 km/h or more).
  • the microcomputer 12051 can set an inter-vehicle distance to be maintained to the preceding vehicle in advance, and can perform automatic brake control (including follow-up stop control), automatic acceleration control (including follow-up start control), and the like.
  • in this way, cooperative control can be performed for the purpose of automated driving or the like, in which the vehicle travels autonomously without depending on the driver's operation.
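The inter-vehicle distance control described above (maintaining a preset gap to the preceding vehicle, with follow-up stop and follow-up start) can be illustrated with a minimal sketch. The function name, the proportional gains, and all numeric values are illustrative assumptions, not part of the disclosure.

```python
def acc_command(gap_m, rel_speed_mps, target_gap_m, k_gap=0.5, k_rel=1.0):
    """Toy spacing controller: positive output -> accelerate, negative -> brake.

    gap_m: measured distance to the preceding vehicle (from the imaging units)
    rel_speed_mps: rate of change of the gap (negative while the gap shrinks)
    """
    # Accelerate when the gap exceeds the target, brake when it is too small,
    # and damp the response with the relative-speed term.
    return k_gap * (gap_m - target_gap_m) + k_rel * rel_speed_mps

# Follow-up stop: gap far below target and still shrinking -> strong braking.
brake = acc_command(gap_m=10.0, rel_speed_mps=-2.0, target_gap_m=30.0)
# Follow-up start: the preceding vehicle pulls away -> accelerate again.
accel = acc_command(gap_m=40.0, rel_speed_mps=1.5, target_gap_m=30.0)
```

A production controller would of course add actuator limits and smoothing; the sketch only shows the sign of the decision.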
  • the microcomputer 12051 can classify three-dimensional object data related to three-dimensional objects into two-wheeled vehicles, ordinary vehicles, large vehicles, pedestrians, and other three-dimensional objects such as utility poles based on the distance information obtained from the imaging units 12101 to 12104, extract them, and use them for automatic avoidance of obstacles.
  • the microcomputer 12051 distinguishes obstacles around the vehicle 12100 into obstacles that are visible to the driver of the vehicle 12100 and obstacles that are difficult for the driver to see.
  • the microcomputer 12051 determines a collision risk indicating the degree of danger of collision with each obstacle, and when the collision risk is equal to or higher than a set value and there is a possibility of collision, it outputs an alarm to the driver via the audio speaker 12061 or the display unit 12062 and performs forced deceleration or avoidance steering via the drive system control unit 12010, thereby providing driving assistance for collision avoidance.
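The thresholded collision-risk decision described above (alarm when a set value is exceeded, forced deceleration when collision is likely) can be sketched as follows. The inverse time-to-collision metric and the threshold values are illustrative assumptions, not values from the disclosure.

```python
def collision_risk(distance_m, closing_speed_mps):
    """Toy risk metric: inverse time-to-collision, in 1/s (0 when the gap opens)."""
    if closing_speed_mps <= 0:          # the gap is not shrinking
        return 0.0
    return closing_speed_mps / distance_m

def assist_action(risk, warn_threshold=0.25, brake_threshold=0.5):
    """Escalate from an alarm (speaker/display) to forced deceleration/steering."""
    if risk >= brake_threshold:
        return "forced_deceleration"
    if risk >= warn_threshold:
        return "warn_driver"
    return "none"

# A car 15 m ahead closing at 5 m/s (time to collision 3 s) triggers a warning.
action = assist_action(collision_risk(15.0, 5.0))
```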
  • At least one of the imaging units 12101 to 12104 may be an infrared camera that detects infrared rays.
  • the microcomputer 12051 can recognize a pedestrian by determining whether a pedestrian is present in the captured images of the imaging units 12101 to 12104.
  • such pedestrian recognition is carried out, for example, by a procedure for extracting feature points in the images captured by the imaging units 12101 to 12104 as infrared cameras, and a procedure for performing pattern matching processing on the series of feature points indicating the outline of an object to determine whether or not it is a pedestrian.
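The two-step procedure described above (extract feature points, then pattern-match the outline) can be sketched minimally. The boundary-pixel features and the exact-match comparison below are simplified stand-ins for the actual feature extraction and pattern matching, which the disclosure does not specify.

```python
def extract_outline(mask):
    """Feature points: boundary pixels of a binary object mask (a toy stand-in
    for feature-point extraction in the infrared images)."""
    h, w = len(mask), len(mask[0])
    pts = []
    for y in range(h):
        for x in range(w):
            if mask[y][x] and any(
                yy < 0 or yy >= h or xx < 0 or xx >= w or not mask[yy][xx]
                for yy, xx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1))
            ):
                pts.append((y, x))
    return pts

def matches_template(points, template):
    """Toy pattern matching: the outline matches when the point sets agree."""
    return set(points) == set(template)
```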
  • when a pedestrian is recognized, the sound image output unit 12052 controls the display unit 12062 so that a rectangular contour line for emphasis is superimposed on the recognized pedestrian. The sound image output unit 12052 may also control the display unit 12062 so as to display an icon or the like indicating a pedestrian at a desired position.
  • the technology according to the present disclosure can be applied to the imaging unit 12031 and the like among the configurations described above.
  • the imaging device 1 in FIG. 1 can be applied to the imaging unit 12031.
  • the present technology can also be configured as follows.
  • a photoelectric conversion unit that performs photoelectric conversion based on incident light; a light-shielding film in which an opening is disposed and which blocks the incident light while transmitting it through the opening; a color filter that transmits incident light of a predetermined wavelength; a planarizing film that planarizes the surface of the color filter; and an on-chip lens disposed adjacent to the planarizing film that condenses the incident light on the photoelectric conversion unit through the color filter and the opening of the light-shielding film.
  • the step of forming the light-shielding pixels is a step of forming the light-shielding pixels around a pixel region in which the plurality of pixels to be formed are arranged.
  • the method of manufacturing the image sensor further includes a step of adjusting the amount of light incident on the photoelectric conversion unit in the pixels in the vicinity of the light-shielding region, which is the region where the light-shielding pixels are formed.

Abstract

The purpose of the present invention is to reduce changes in the sensitivity of pixels arranged at the periphery of a pixel region. Provided are a pixel region and a light-blocking region. In the pixel region, pixels are arranged that each include: a photoelectric conversion portion for performing photoelectric conversion based on incident light; a light-blocking film having an opening portion, transmitting the incident light at the opening portion while otherwise blocking it; a color filter transmissive to a predetermined wavelength of the incident light; a planarization film for planarizing the surface of the color filter; and an on-chip lens which is disposed adjacent to the planarization film and focuses the incident light on the photoelectric conversion portion via the color filter and the opening portion of the light-blocking film. In the light-blocking region, which adjoins the pixel region, light-blocking pixels are arranged, i.e., pixels provided with a light-blocking film having no opening portion. The amount of light that enters the photoelectric conversion portion in the pixels adjacent to the light-blocking region is adjusted.

Description

Image sensor and method for manufacturing an image sensor
The present disclosure relates to an image sensor and a method for manufacturing the image sensor. More specifically, it relates to an image sensor having a light-shielding region outside the region in which pixels are arranged, and to a method for manufacturing such an image sensor.
Conventionally, imaging apparatuses such as cameras have been miniaturized by arranging the image sensor close to the photographing lens. In such an imaging apparatus, a phenomenon is known in which the amount of light incident on the pixels at the periphery of the image sensor decreases relative to the amount incident on the pixels at the center, so that the sensitivity of the peripheral pixels decreases. This phenomenon is called shading.
Pupil correction is performed to compensate for this decrease in sensitivity. An on-chip lens is disposed on each of the above-described pixels, and this on-chip lens condenses light from the subject onto the pixel. Since light from the subject is incident perpendicularly on a pixel arranged at the center of the pixel region, the on-chip lens there is placed at the center of the pixel. In contrast, light from the subject is incident obliquely on pixels arranged at the periphery of the image sensor, so the on-chip lens is shifted from the center of the pixel toward the center of the image sensor. This allows the oblique incident light to be condensed on the pixel, correcting the shading. For example, for an image sensor in which an upper-layer film and on-chip lenses are sequentially stacked on a plurality of pixels formed on a semiconductor substrate, an image sensor that varies the correction amount of the on-chip lens position for pupil correction has been proposed (see, for example, Patent Document 1). In this conventional image sensor, the correction amount of the on-chip lens position is varied according to the distance from the center of the region in which the pixels are arranged to the on-chip lens and the thickness of the upper-layer film at the position of the on-chip lens.
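The pupil correction described above can be illustrated with a toy calculation: shift each on-chip lens toward the image center by an amount that grows with its distance from the center and with the upper-layer film thickness at its position. The chief-ray slope constant and the function itself are illustrative assumptions, not design data from Patent Document 1.

```python
import math

def lens_shift_um(pixel_xy_um, center_xy_um, film_thickness_um,
                  chief_ray_slope_per_mm=0.02):
    """Toy pupil correction: returns the (x, y) lens offset in micrometers.

    The chief ray arrives at a larger angle the farther the pixel is from the
    optical center; shifting the lens by (film thickness) * tan(angle) keeps
    the focused spot over the photodiode opening.
    """
    dx = pixel_xy_um[0] - center_xy_um[0]
    dy = pixel_xy_um[1] - center_xy_um[1]
    r_um = math.hypot(dx, dy)
    if r_um == 0.0:
        return (0.0, 0.0)          # center pixel: light arrives vertically
    tan_theta = chief_ray_slope_per_mm * (r_um / 1000.0)  # small-angle model
    shift = film_thickness_um * tan_theta
    # Direct the shift from the pixel back toward the optical center.
    return (-shift * dx / r_um, -shift * dy / r_um)
```

The sketch reproduces the qualitative behavior the disclosure describes: zero shift at the center, and larger shifts toward the center as distance or film thickness grows.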
Patent Document 1: JP 2014-072471 A
An insulating film, a color filter, and a planarizing film are disposed between the on-chip lens and the semiconductor substrate; these constitute the above-mentioned upper-layer film. Among these, the planarizing film is a film that planarizes the surface of the color filter, so it is relatively thick and its thickness varies greatly across the pixel region. Since the conventional technique described above compensates for the decrease in sensitivity by pupil correction, it has the problem that the decrease in sensitivity cannot be sufficiently corrected when the thickness of the planarizing film or the like varies greatly.
The present disclosure has been made in view of the above problems, and aims to reduce the change in sensitivity of pixels arranged at the periphery of the pixel region.
The present disclosure has been made to solve the above problems. Its first aspect is an image sensor comprising: a pixel region in which pixels are arranged, each pixel including a photoelectric conversion unit that performs photoelectric conversion based on incident light, a light-shielding film in which an opening is disposed and which blocks the incident light while transmitting it through the opening, a color filter that transmits incident light of a predetermined wavelength, a planarizing film that planarizes the surface of the color filter, and an on-chip lens that is disposed adjacent to the planarizing film and condenses the incident light on the photoelectric conversion unit through the color filter and the opening of the light-shielding film; and a light-shielding region that adjoins the pixel region and in which light-shielding pixels, i.e., pixels provided with a light-shielding film in which no opening is disposed, are arranged; wherein the amount of light incident on the photoelectric conversion unit in the pixels in the vicinity of the light-shielding region is adjusted.
In this first aspect, the amount of light incident on the photoelectric conversion unit in a pixel in the vicinity of the light-shielding region may be further adjusted according to the color filter of that pixel.
In this first aspect, the amount of light incident on the photoelectric conversion unit in the pixels in the vicinity of the light-shielding region may be adjusted by changing the shape of the light-shielding film.
In this first aspect, the amount of light incident on the photoelectric conversion unit in the pixels in the vicinity of the light-shielding region may be adjusted by changing the shape of the on-chip lens.
In this first aspect, the amount of light incident on the photoelectric conversion unit in the pixels in the vicinity of the light-shielding region may be adjusted by changing the refractive index of the on-chip lens.
In this first aspect, the amount of light incident on the photoelectric conversion unit in the pixels in the vicinity of the light-shielding region may be adjusted by changing the shape of the color filter.
A second aspect of the present disclosure is a method of manufacturing an image sensor, comprising: a step of forming a plurality of pixels, including a photoelectric conversion unit forming step of forming a photoelectric conversion unit that performs photoelectric conversion based on incident light, a first light-shielding film forming step of forming a light-shielding film in which an opening is disposed and which blocks the incident light while transmitting it through the opening, a color filter forming step of forming a color filter that transmits incident light of a predetermined wavelength, a planarizing film forming step of forming a planarizing film that planarizes the surface of the color filter, and an on-chip lens forming step of forming an on-chip lens that is disposed adjacent to the planarizing film and condenses the incident light on the photoelectric conversion unit through the color filter and the opening of the light-shielding film; and a step of forming light-shielding pixels, including the photoelectric conversion unit forming step, a second light-shielding film forming step of forming a light-shielding film in which no opening is disposed, the color filter forming step, the planarizing film forming step, and the on-chip lens forming step. The step of forming the light-shielding pixels is a step of forming the light-shielding pixels around the pixel region in which the plurality of pixels are arranged, and the step of forming the pixels further includes a step of adjusting the amount of light incident on the photoelectric conversion unit in the pixels in the vicinity of the light-shielding region, which is the region in which the light-shielding pixels are formed.
By adopting the above aspects, the amount of light incident on the photoelectric conversion unit of pixels arranged in the vicinity of the boundary between the light-shielding region and the pixel region is adjusted. The incident light amount is assumed to be adjusted according to the change in sensitivity of the pixels near the light-shielding region caused by variations in the thickness of the planarizing film and the like.
According to the present disclosure, there is an excellent effect of reducing the change in sensitivity of the pixels arranged at the periphery of the pixel region.
FIG. 1 is a diagram illustrating a configuration example of an image sensor according to an embodiment of the present disclosure.
FIG. 2 is a diagram illustrating a configuration example of a pixel array unit according to an embodiment of the present disclosure.
FIG. 3 is a cross-sectional view illustrating a configuration example of a pixel according to an embodiment of the present disclosure.
FIG. 4 is a top view illustrating a configuration example of a pixel according to an embodiment of the present disclosure.
FIG. 5 is a diagram illustrating an example of an image signal according to an embodiment of the present disclosure.
FIG. 6 is a cross-sectional view illustrating a configuration example of an image sensor according to a first embodiment of the present disclosure.
FIG. 7 is a top view illustrating a configuration example of a pixel according to the first embodiment of the present disclosure.
FIGS. 8 and 9 are diagrams illustrating an example of a method of manufacturing an image sensor according to the first embodiment of the present disclosure.
FIG. 10 is a cross-sectional view illustrating a configuration example of a pixel according to a second embodiment of the present disclosure.
FIG. 11 is a top view illustrating a configuration example of a pixel according to the second embodiment of the present disclosure.
FIG. 12 is a cross-sectional view illustrating a configuration example of a pixel according to a third embodiment of the present disclosure.
FIG. 13 is a top view illustrating a configuration example of a pixel according to the third embodiment of the present disclosure.
FIG. 14 is a cross-sectional view illustrating a configuration example of a pixel according to a fourth embodiment of the present disclosure.
FIG. 15 is a block diagram illustrating a schematic configuration example of a camera, an example of an imaging apparatus to which the present disclosure can be applied.
FIG. 16 is a diagram illustrating an example of a schematic configuration of an endoscopic surgery system.
FIG. 17 is a block diagram illustrating an example of the functional configuration of a camera head and a CCU.
FIG. 18 is a block diagram illustrating an example of a schematic configuration of a vehicle control system.
FIG. 19 is an explanatory diagram illustrating an example of the installation positions of the vehicle exterior information detection unit and the imaging unit.
Next, modes for carrying out the present disclosure (hereinafter referred to as embodiments) will be described with reference to the drawings. In the following drawings, the same or similar parts are denoted by the same or similar reference numerals. However, the drawings are schematic, and the dimensional ratios of the respective parts do not necessarily match the actual ones. It also goes without saying that portions whose dimensional relationships and ratios differ between drawings are included. After describing the configuration of the image sensor according to the present technology, the embodiments will be described in the following order.
1. First embodiment
2. Second embodiment
3. Third embodiment
4. Fourth embodiment
5. Application example to a camera
6. Application example to an endoscopic surgery system
7. Application examples to moving objects
<Configuration of the image sensor>
FIG. 1 is a diagram illustrating a configuration example of an image sensor according to an embodiment of the present disclosure. The image sensor 1 in the figure includes a pixel array unit 10, a vertical drive unit 20, a column signal processing unit 30, and a control unit 40.
The pixel array unit 10 is configured by arranging pixels 100 in a two-dimensional grid. Here, a pixel 100 generates an image signal corresponding to the light irradiating it. Each pixel 100 has a photoelectric conversion unit that generates charge according to the irradiated light, and further has a pixel circuit. This pixel circuit generates an image signal based on the charge generated by the photoelectric conversion unit; the generation of the image signal is controlled by a control signal generated by the vertical drive unit 20 described later. In the pixel array unit 10, signal lines 21 and 31 are arranged in an XY matrix. The signal lines 21 transmit the control signals for the pixel circuits in the pixels 100; one is arranged for each row of the pixel array unit 10 and wired in common to the pixels 100 arranged in that row. The signal lines 31 transmit the image signals generated by the pixel circuits of the pixels 100; one is arranged for each column of the pixel array unit 10 and wired in common to the pixels 100 arranged in that column. These photoelectric conversion units and pixel circuits are formed on a semiconductor substrate.
The vertical drive unit 20 generates the control signals for the pixel circuits of the pixels 100 and transmits them to the pixels 100 via the signal lines 21 in the figure. The column signal processing unit 30 processes the image signals generated by the pixels 100, which are transmitted from the pixels 100 via the signal lines 31 in the figure. The processing in the column signal processing unit 30 corresponds to, for example, analog-to-digital conversion, which converts the analog image signals generated in the pixels 100 into digital image signals. The image signals processed by the column signal processing unit 30 are output as the image signals of the image sensor 1. The control unit 40 controls the entire image sensor 1 by generating and outputting control signals for the vertical drive unit 20 and the column signal processing unit 30. The control signals generated by the control unit 40 are transmitted to the vertical drive unit 20 and the column signal processing unit 30 through signal lines 41 and 42, respectively.
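The readout flow described above (one row selected at a time via the signal lines 21, all columns transferred in parallel via the signal lines 31, then analog-to-digital conversion in the column signal processing unit 30) can be sketched in software. The single-slope conversion and the 10-bit depth are illustrative assumptions; the disclosure does not specify the ADC architecture.

```python
def quantize(voltage, full_scale=1.0, bits=10):
    """Toy single-slope ADC: map a clamped analog voltage to a digital code."""
    steps = (1 << bits) - 1
    clamped = min(max(voltage, 0.0), full_scale)
    return round(clamped / full_scale * steps)

def read_frame(analog_pixels):
    """Row-wise readout: each row is selected in turn (signal lines 21) and
    every column of that row is converted in parallel (signal lines 31)."""
    return [[quantize(v) for v in row] for row in analog_pixels]
```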
[Configuration of the pixel array unit]
FIG. 2 is a diagram illustrating a configuration example of the pixel array unit according to an embodiment of the present disclosure. The pixel array unit 10 in the figure is composed of a pixel region 110 and a light-shielding region 120. The pixel region 110 is the region in which the pixels 100 described in FIG. 1 are arranged. Pixels 100a and 100b in the figure represent pixels 100 arranged at the center and at the periphery of the pixel region 110, respectively.
The light-shielding region 120 is the region in which light-shielding pixels 200 are arranged. Here, a light-shielding pixel 200 is a pixel in which the light from the subject is blocked, and is used for detecting the black level of the image signal. A plurality of light-shielding pixels 200 are arranged around the pixel region 110 to form the light-shielding region 120.
[Pixel configuration]
FIG. 3 is a cross-sectional view illustrating a configuration example of a pixel according to an embodiment of the present disclosure. The figure is a cross-sectional view illustrating a configuration example of the pixels 100a and 100b and the light-shielding pixel 200, taken along the line A-A' in FIG. 2, and represents the basic configuration of the pixels of the present disclosure.
The pixel 100 and the light-shielding pixel 200 in the figure each include a semiconductor substrate 121, a wiring region composed of wiring layers 124 and an insulating layer 123, a support substrate 125, an insulating film 126, a light-shielding film 131, a color filter 141, a planarizing film 161, and an on-chip lens 151.
The semiconductor substrate 121 is a semiconductor substrate on which the photoelectric conversion units of the pixels 100 and the semiconductor portions of the pixel circuits are formed. In the figure, the photoelectric conversion unit 101 is shown as an example. The photoelectric conversion unit 101 is formed in a p-type well region formed in the semiconductor substrate 121; for convenience, the semiconductor substrate 121 is assumed to constitute the well region. An n-type semiconductor region 122 is formed in this p-type well region, and the pn junction formed between the p-type well region and the n-type semiconductor region 122 constitutes a photodiode serving as the photoelectric conversion unit 101.
The wiring layers 124 are wiring that connects the elements formed on the semiconductor substrate 121, and are also used for transmitting the control signals and image signals of the pixels 100. The signal lines 21 and 31 described with reference to FIG. 1 are constituted by the wiring layers 124, which can be made of, for example, copper (Cu) or tungsten (W). The insulating layer 123 insulates the wiring layers 124 and can be made of, for example, silicon oxide (SiO2). The wiring layers 124 and the insulating layer 123 constitute the wiring region, which in the figure is arranged on the front surface of the semiconductor substrate 121. The support substrate 125 is a substrate disposed adjacent to the wiring region that supports the semiconductor substrate 121; it improves the strength of the device during manufacture of the image sensor 1. The insulating film 126 is a film disposed on the back surface of the semiconductor substrate 121 that insulates the semiconductor substrate 121, and can be made of, for example, SiO2.
The color filter 141 is an optical filter that transmits light of a predetermined wavelength out of the incident light. For example, color filters 141 that transmit red light, green light, and blue light can be used, and any one of these three types is disposed in each pixel 100. The color filters 141 have different thicknesses depending on the wavelength of the light they transmit; for example, the color filter 141 for green light is formed thicker than the color filters 141 for red light and blue light. This is due to the characteristics of the color filters 141, constraints of the manufacturing process, and the like.
The on-chip lens 151 is a lens that condenses the incident light on the photoelectric conversion unit 101. The on-chip lens 151 has a hemispherical shape, condenses the incident light through the color filter 141, and can be made of, for example, an acrylic resin. The image sensor 1 in the figure is a back-illuminated image sensor, in which the color filter 141 and the on-chip lens 151 are disposed on the back surface of the semiconductor substrate and incident light irradiating the back surface of the semiconductor substrate 121 is imaged. In the pixel 100b arranged at the periphery of the pixel region 110, the on-chip lens 151 is shifted from the center of the photoelectric conversion unit 101 toward the center of the pixel region 110 for pupil correction.
The planarizing film 161 is a film that planarizes the surface of the color filter 141. As described above, the surface of the color filter 141 has a different thickness for each corresponding color, so the planarizing film 161 is disposed to planarize the surface on which the on-chip lens 151 is formed. The planarizing film 161 can be made of, for example, the same material as the on-chip lens 151; specifically, when forming the on-chip lens 151, the surface of the color filter 141 can be planarized by applying a thick coat of the on-chip lens material to it.
 The light shielding film 131 is a film that blocks incident light. The light shielding film 131 has different shapes in the pixel 100 and the light-shielded pixel 200. In the pixel 100, a light shielding film 131 having an opening 132 is disposed. Incident light transmitted through the on-chip lens 151 and the color filter 141 irradiates the photoelectric conversion unit 101 through the opening 132. The light shielding film 131 disposed in the pixel 100 blocks light incident obliquely from adjacent pixels 100. Specifically, it prevents light transmitted through the color filter 141 of an adjacent pixel 100 from entering the photoelectric conversion unit 101 of the pixel 100 itself. This prevents the occurrence of crosstalk. The light shielding film 131 can be made of, for example, a metal.
 On the other hand, in the light-shielded pixel 200, a light shielding film 131 without an opening is disposed. For this reason, all light from the subject is blocked in the light-shielded pixel 200. The image signal generated by such a light-shielded pixel 200 corresponds to the black level of the image signal generated by the pixel 100.
 As described later, the light shielding film 131 and the color filter 141 can be formed simultaneously in the pixel region and the light-shielded region. As shown in the figure, in the pixel 100, the color filter 141 is disposed adjacent to the insulating film 126 in the opening 132 of the light shielding film 131; that is, the color filter 141 is embedded in the opening 132 of the light shielding film 131. In contrast, in the light-shielded pixel 200, the color filter 141 is laminated on the surface of the light shielding film 131. For this reason, the height from the semiconductor substrate 121 to the surface of the color filter 141 is greater in the light-shielded pixel 200 than in the pixel 100. A step is thus generated in the surface height of the color filter 141 between the pixel region 110 and the light-shielded region 120.
 When the planarization film 161 is formed on such a pixel array unit 10, the unevenness of the surface of the color filter 141 of each pixel 100 is planarized. On the other hand, the step between the pixel region 110 and the light-shielded region 120 is not planarized, and the planarization film 161 is formed at different heights in the pixel region 110 and the light-shielded region 120. Furthermore, as shown in the figure, in the pixels 100b arranged in the vicinity of the light-shielded region 120, the height of the planarization film 161 gradually decreases from the edge of the pixel region 110 toward its center. For this reason, in the pixel 100b, the distance between the on-chip lens 151 and the photoelectric conversion unit 101 differs from that in the pixel 100a, and the position at which the on-chip lens 151 condenses the incident light differs from that in the pixel 100a.
 [Configuration of on-chip lens and light shielding film]
 FIG. 4 is a top view illustrating a configuration example of the pixel according to the embodiment of the present disclosure. The figure is a top view showing a configuration example of the on-chip lens 151 and the light shielding film 131 of the pixel 100 described in FIG. 3. In the figure, the dotted rectangle represents the pixel 100, the solid rectangle represents the opening 132 of the light shielding film 131, and the dashed circle represents the on-chip lens 151. In addition, "R", "G", and "B" in the figure represent the types of the color filters 141 arranged in the pixels 100. That is, the pixels 100 labeled "R", "G", and "B" represent pixels 100 in which color filters 141 corresponding to red light, green light, and blue light are arranged, respectively. Hereinafter, the pixels 100 in which the color filters 141 corresponding to red light, green light, and blue light are arranged are referred to as the red pixel 100, the green pixel 100, and the blue pixel 100, respectively. The figure shows an example in which the red pixels 100, the green pixels 100, and the blue pixels 100 are arranged in a Bayer array. Here, the Bayer array is a pixel arrangement in which the green pixels 100 are arranged in a checkered pattern and the red pixels 100 and the blue pixels 100 are arranged between the green pixels 100.
 In the figure, a represents the configuration of a pixel 100 (pixel 100a) in the central portion of the pixel region 110, and b represents the configuration of a pixel 100 (pixel 100b) in the peripheral portion of the pixel region 110. In the pixel 100a, the on-chip lens 151 is disposed at the center of the pixel 100a. In contrast, in the pixel 100b, pupil correction is applied and the on-chip lens 151 is shifted to the left side in the figure.
 [Image signal]
 FIG. 5 is a diagram illustrating an example of the image signal according to the embodiment of the present disclosure. The figure shows the relationship between the position of the pixel 100 in the pixel array unit 10 and the image signal, representing the image signal of each pixel 100 arranged along the line A-A' described in FIG. 2. In the figure, a, b, and c represent the image signals of the green pixel 100, the red pixel 100, and the blue pixel 100, respectively. The vertical axis represents the level of the image signal, and the horizontal axis represents the position of the pixel 100 along the line A-A'. The solid line represents the graph of the image signal.
 As shown in the figure, the image signal of the pixels 100 arranged at the center of the pixel array unit 10 has a relatively high level, while the image signal of the pixels 100 arranged at the peripheral portion of the pixel array unit 10 has a lower level. As described above, light from the subject enters the pixels 100 arranged in the central portion of the pixel array unit 10 perpendicularly, whereas it enters the pixels 100 arranged in the peripheral portion obliquely. Because the amount of light per unit area of the opening 132 described in FIG. 3 decreases and the incident light on the pixel 100 is thereby reduced, the level of the image signal of the pixels 100 arranged in the peripheral portion becomes lower. If the above-described pupil correction is not applied, the level of the image signal decreases further. The dotted line in the figure is a graph representing the image signal resulting from the decrease in the amount of incident light. This phenomenon, in which the level of the image signal of the pixels 100 arranged at the peripheral portion of the pixel array unit 10 decreases, is called shading.
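 The peripheral roll-off described above can be illustrated numerically. The following is a minimal sketch assuming the classical cosine-fourth-power illumination falloff of an idealized lens; the patent gives no formula, and the angles and values here are purely illustrative:

```python
import math

def relative_illumination(field_angle_deg):
    """Relative illumination at a given chief-ray angle using the
    classical cos^4 falloff approximation (illustrative only)."""
    theta = math.radians(field_angle_deg)
    return math.cos(theta) ** 4

# Light strikes central pixels perpendicularly but peripheral pixels
# obliquely, so the image-signal level falls toward the array edge.
for angle in (0, 10, 20, 30):
    print(angle, round(relative_illumination(angle), 3))
```

 A pixel reached at a 30-degree chief-ray angle receives only about 56 percent of the on-axis illumination under this approximation, which is the shape of the shading curve shown by the dotted line in FIG. 5.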
 As shown in the figure, the level of the image signal changes in the pixels 100 in the vicinity of the light-shielded region. This is because, as described above, the optical path length in the pixels 100 near the light-shielded region changes as the film thickness of the planarization film 161 increases. Furthermore, the characteristics of the image signal differ among the green pixel 100, the red pixel 100, and the blue pixel 100. Specifically, the green pixel 100 of a in the figure generates an image signal of a relatively high level (region 301). The red pixel 100 of b in the figure generates an image signal of an even higher level (region 302). On the other hand, the blue pixel 100 of c in the figure generates an image signal of a relatively low level (region 303). This is because the refractive index experienced by light passing through the on-chip lens 151 differs depending on the wavelength of the light. In particular, blue light, which has a short wavelength, experiences a higher refractive index and is therefore condensed in the photoelectric conversion unit 101 near the surface of the semiconductor substrate 121. For this reason, the blue pixel 100 is easily affected by the film thickness of the planarization film 161.
 <1. First Embodiment>
 [Configuration of image sensor]
 FIG. 6 is a cross-sectional view illustrating a configuration example of the image sensor according to the first embodiment of the present disclosure. The figure is a cross-sectional view representing a configuration example of the pixel 100a and the pixel 100b. The pixel 100a is described as a comparative example and is the same as the pixel 100a described in FIG. 3. In contrast, the pixel 100b in the figure differs from the pixel 100a in the shape of the light shielding film 131. Specifically, the size of the opening 132 of the light shielding film 131 differs from that of the opening 132 of the pixel 100a. The openings 132b and 132c in the figure correspond to the openings of the light shielding films 131 disposed in the green pixel 100 and the blue pixel 100, respectively.
 [Configuration of on-chip lens and light shielding film]
 FIG. 7 is a top view illustrating a configuration example of the pixel according to the first embodiment of the present disclosure. Like b in FIG. 4, the figure represents the configuration of the on-chip lens 151 and the light shielding film 131 of the pixel 100b. The openings 132a, 132b, and 132c in the figure represent the openings of the red pixel 100, the green pixel 100, and the blue pixel 100, respectively. These openings 132a, 132b, and 132c are configured to be larger than the opening 132 described for a in FIG. 4. Enlarging the opening increases the amount of incident light. In addition, as shown in the figure, the openings 132 are made larger in the order of the red pixel 100, the green pixel 100, and the blue pixel 100, the blue pixel 100 having the largest opening. The amount of incident light of the green pixel 100 and the red pixel 100, whose sensitivity is relatively high, is reduced, and the amount of incident light of the blue pixel 100, whose sensitivity is relatively low, is increased. In this way, the amounts of incident light of the red pixel 100, the green pixel 100, and the blue pixel 100 can be adjusted and the level of the image signal can be corrected.
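 To a first approximation, the light reaching the photoelectric conversion unit scales with the area of the opening in the light shielding film, so per-color enlargement factors translate directly into signal gains. The opening dimensions below are hypothetical, chosen only to illustrate the ordering with the blue pixel largest; they are not taken from the patent:

```python
def light_gain(width_um, height_um, ref_width_um, ref_height_um):
    """Incident-light ratio of an enlarged opening relative to the
    reference opening 132, assuming light scales with opening area."""
    return (width_um * height_um) / (ref_width_um * ref_height_um)

ref = (1.0, 1.0)  # hypothetical baseline opening 132 of pixel 100a
openings = {
    "132a (red)":   (1.05, 1.05),
    "132b (green)": (1.10, 1.10),
    "132c (blue)":  (1.20, 1.20),  # largest: compensates low blue sensitivity
}
gains = {name: light_gain(w, h, *ref) for name, (w, h) in openings.items()}
```

 Under this area model the hypothetical blue opening collects 44 percent more light than the baseline, while the red and green openings receive smaller boosts, which is the per-color balancing the embodiment describes.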
 [Method of manufacturing the image sensor]
 FIGS. 8 and 9 are diagrams illustrating an example of a method of manufacturing the image sensor according to the first embodiment of the present disclosure. FIGS. 8 and 9 represent the manufacturing process of the image sensor 1, specifically the manufacturing processes of the pixels 100a and 100b and the light-shielded pixel 200 described in FIG. 3. First, an n-type semiconductor region 122 is formed in a p-type well region formed in the semiconductor substrate 121 to form the photoelectric conversion unit 101 (a in FIG. 8). This step is an example of the photoelectric conversion unit forming step described in the claims.
 Next, a wiring region (not shown) is formed on the semiconductor substrate 121, and a support substrate 125 (not shown) is bonded. Next, an insulating film 126 is formed on the back surface of the semiconductor substrate 121. This can be performed, for example, by depositing the material of the insulating film 126, such as SiO2, using CVD (Chemical Vapor Deposition) or the like (b in FIG. 8).
 Next, a metal film 401, which serves as the material of the light shielding film 131, and a resist 402 are laminated in this order on the surface of the insulating film 126. Next, openings 403, 403a (not shown), 403b, and 403c are formed in the resist 402. These openings 403 and the like are formed with sizes and positions corresponding to the openings 132, 132a, 132b, and 132c described in FIGS. 6 and 7 (c in FIG. 8). Next, the metal film 401 is etched using the resist 402 as a mask. This can be performed, for example, by dry etching. Through these steps, the light shielding film 131 including the openings 132, 132a (not shown), 132b, and 132c can be formed (d in FIG. 8). The light shielding films 131 of the pixels 100a and 100b and of the light-shielded pixel 200 can thereby be formed simultaneously. Furthermore, by forming the openings of the light shielding film 131 of the pixel 100b as the openings 132a, 132b, and 132c, the amount of light incident on the photoelectric conversion unit 101 of the pixel 100b can be adjusted. This step is an example of the first light shielding film forming step and the second light shielding film forming step described in the claims. This step is also an example of the step of adjusting the amount of incident light described in the claims.
 Next, the color filters 141 are formed on the surfaces of the insulating film 126 and the light shielding film 131. This can be performed for each type of color filter 141. For example, a resin serving as the material of the color filter 141 corresponding to green is applied, openings are formed in the regions where the color filters 141 of the red pixels 100 and the blue pixels 100 are to be disposed, and the resin is cured. Next, resins serving as the materials of the color filters 141 corresponding to red and blue are disposed in these openings (e in FIG. 9). This step is an example of the color filter forming step described in the claims.
 Next, a resin 404 serving as the material of the on-chip lenses is applied to the surface of the color filters 141. At this time, the surface of the color filters 141 is planarized. As described later, the vicinity of the surface of the resin 404 is formed into the on-chip lenses 151. Meanwhile, the resin 404 in the region adjacent to the color filters 141 forms the planarization film 161 (f in FIG. 9). This step is an example of the planarization film forming step described in the claims.
 Next, the surface of the resin 404 is processed into hemispherical shapes to form the on-chip lenses 151. This can be performed, for example, by placing a resist having the same shape as the on-chip lens 151 on the surface of the resin 404 and transferring the shape of the resist to the resin 404 by dry etching (g in FIG. 9). This step is an example of the on-chip lens forming step described in the claims.
 Through the above steps, the pixels 100a and 100b of the pixel region 110 can be formed, and the light-shielded pixel 200 of the light-shielded region can be formed.
 As described above, in the image sensor 1 according to the first embodiment of the present disclosure, the shape of the light shielding film 131 of the pixels 100 arranged in the vicinity of the light-shielded region is changed for each pixel 100 and adjusted according to the type of the color filter 141 of the pixel 100. This makes it possible to adjust the amount of incident light of the pixels 100 arranged in the vicinity of the light-shielded region and to correct the change in sensitivity.
 <2. Second Embodiment>
 In the image sensor 1 of the first embodiment described above, the shape of the opening 132 of the light shielding film 131 of the pixel 100b was changed relative to that of the pixel 100a. In contrast, the image sensor 1 according to the second embodiment of the present disclosure differs from the first embodiment described above in that the shape of the on-chip lens is changed.
 [Configuration of image sensor]
 FIG. 10 is a cross-sectional view illustrating a configuration example of the image sensor according to the second embodiment of the present disclosure. The pixel 100b in the figure differs from the pixel 100b described in FIG. 6 in that it includes on-chip lenses 152b and 152c instead of the on-chip lens 151.
 The on-chip lenses 152b and 152c are on-chip lenses configured in a shape different from that of the on-chip lens 151. As described later, the on-chip lenses 152b and 152c are configured with a smaller curvature than the on-chip lens 151 and correspond to the on-chip lenses disposed in the green pixel 100 and the blue pixel 100, respectively.
 [Configuration of on-chip lens and light shielding film]
 FIG. 11 is a top view illustrating a configuration example of the pixel according to the second embodiment of the present disclosure. The on-chip lenses 152a, 152b, and 152c in the figure represent the on-chip lenses of the red pixel 100, the green pixel 100, and the blue pixel 100, respectively. The on-chip lenses 152a, 152b, and 152c are configured with a smaller curvature than the on-chip lens 151. Reducing the curvature lengthens the condensing distance, making it possible to adjust the condensing distance according to the increase in the film thickness of the planarization film 161. Furthermore, as shown in the figure, the on-chip lenses are made larger in size in the order of the red pixel 100, the green pixel 100, and the blue pixel 100, and smaller in curvature in the same order.
 By making the curvature of the on-chip lens 152c of the blue pixel 100 smaller than those of the red pixel 100 and the green pixel 100, the condensing position of blue light in the blue pixel 100 can be set at the photoelectric conversion unit 101 near the surface of the semiconductor substrate 121. On the other hand, in the red pixel 100 and the green pixel 100, the relatively large curvature allows the condensing position to be set near the end of the photoelectric conversion unit 101. The sensitivities of the red pixel 100, the green pixel 100, and the blue pixel 100 can thereby be adjusted, and the level of the image signal can be corrected.
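 The link between curvature and condensing distance can be sketched with the thin plano-convex lens relation f = R / (n - 1), where R is the radius of curvature (the reciprocal of the curvature) and n the refractive index. The curvature and index values below are hypothetical illustrations, not taken from the patent:

```python
def condensing_distance_um(curvature_per_um, n_lens=1.5):
    """Thin plano-convex lens: f = R / (n - 1), with R = 1 / curvature.
    A smaller curvature (flatter lens) gives a longer condensing distance."""
    radius_um = 1.0 / curvature_per_um
    return radius_um / (n_lens - 1.0)

f_152a = condensing_distance_um(0.50)  # red pixel: largest curvature
f_152b = condensing_distance_um(0.45)  # green pixel
f_152c = condensing_distance_um(0.40)  # blue pixel: smallest curvature,
                                       # focus pushed deepest into the
                                       # photoelectric conversion unit
```

 With these illustrative numbers the blue-pixel lens focuses noticeably farther than the red-pixel lens, which is how a smaller curvature compensates for the thicker planarization film near the light-shielded region.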
 Since the rest of the configuration of the image sensor 1 is the same as that of the image sensor 1 described in the first embodiment of the present disclosure, its description is omitted.
 As described above, the image sensor 1 according to the second embodiment of the present disclosure adjusts the amount of light incident on the pixels 100 arranged in the vicinity of the light-shielded region by changing the shape of the on-chip lens. The change in sensitivity of the pixels 100 arranged in the vicinity of the light-shielded region can thereby be corrected.
 <3. Third Embodiment>
 In the image sensor 1 of the second embodiment described above, the shape of the on-chip lens of the pixel 100b was changed relative to that of the pixel 100a. In contrast, the image sensor 1 according to the third embodiment of the present disclosure differs from the second embodiment described above in that the refractive index of the on-chip lens is changed.
 [Configuration of image sensor]
 FIG. 12 is a cross-sectional view illustrating a configuration example of the image sensor according to the third embodiment of the present disclosure. The pixel 100b in the figure differs from the pixel 100b described in FIG. 10 in that it includes on-chip lenses 153b and 153c instead of the on-chip lenses 152b and 152c.
 The on-chip lenses 153b and 153c are on-chip lenses configured with a refractive index different from that of the on-chip lens 151. As described later, the on-chip lenses 153b and 153c are configured with a smaller refractive index than the on-chip lens 151 and correspond to the on-chip lenses disposed in the green pixel 100 and the blue pixel 100, respectively.
 [Configuration of on-chip lens and light shielding film]
 FIG. 13 is a top view illustrating a configuration example of the pixel according to the third embodiment of the present disclosure. The on-chip lenses 153a, 153b, and 153c in the figure represent the on-chip lenses of the red pixel 100, the green pixel 100, and the blue pixel 100, respectively. The on-chip lenses 153a, 153b, and 153c are configured with a smaller refractive index than the on-chip lens 151, and their refractive indexes become smaller in this order. By making the refractive indexes of the on-chip lenses 153a, 153b, and 153c of the pixel 100b smaller than that of the on-chip lens 151 of the pixel 100a, the condensing distance can be lengthened, and the influence of the change in the condensing position accompanying the increase in the film thickness of the planarization film 161 can be reduced.
 Furthermore, by making the refractive index of the on-chip lens 153c of the blue pixel 100 smaller than those of the red pixel 100 and the green pixel 100, the condensing position of blue light in the blue pixel 100 can be set at the photoelectric conversion unit 101 near the surface of the semiconductor substrate 121. The amounts of incident light of the red pixel 100, the green pixel 100, and the blue pixel 100 can thereby be adjusted, and the level of the image signal can be corrected. The on-chip lenses 153a, 153b, and 153c can be made of a material different from that of the on-chip lens 151. In addition, by using different materials among the on-chip lenses 153a, 153b, and 153c themselves, they can be configured as on-chip lenses with mutually different refractive indexes. The on-chip lenses 153a, 153b, and 153c can be formed, for example, by forming the planarization film 161 and then performing the on-chip lens forming step described for g in FIG. 9 for each of them.
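 The effect of lowering the refractive index can be sketched with the same thin plano-convex lens relation, f = R / (n - 1): with the radius of curvature fixed, a lower-index material lengthens the condensing distance. The index values and the shared radius below are hypothetical, used only to show the trend:

```python
def focal_distance_um(radius_um, n_lens):
    """Thin plano-convex lens: f = R / (n - 1).
    A lower-index lens material gives a longer condensing distance."""
    return radius_um / (n_lens - 1.0)

R_UM = 2.0  # hypothetical radius of curvature shared by all lenses
f_151 = focal_distance_um(R_UM, 1.60)   # baseline lens 151
f_153a = focal_distance_um(R_UM, 1.55)  # red pixel
f_153b = focal_distance_um(R_UM, 1.50)  # green pixel
f_153c = focal_distance_um(R_UM, 1.45)  # blue pixel: lowest index,
                                        # longest condensing distance
```

 Lowering the index rather than flattening the lens achieves the same focal lengthening without changing the lens footprint, which is the design choice that distinguishes this embodiment from the second.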
 Since the rest of the configuration of the image sensor 1 is the same as that of the image sensor 1 described in the second embodiment of the present disclosure, its description is omitted.
 As described above, the image sensor 1 according to the third embodiment of the present disclosure adjusts the amount of light incident on the pixels 100 arranged in the vicinity of the light-shielded region by changing the refractive index of the on-chip lens 151. The change in sensitivity of the pixels 100 arranged in the vicinity of the light-shielded region can thereby be corrected.
 <4. Fourth Embodiment>
 In the image sensor 1 of the first embodiment described above, the shape of the opening 132 of the light shielding film 131 of the pixel 100b was changed relative to that of the pixel 100a. In contrast, the image sensor 1 according to the fourth embodiment of the present disclosure differs from the first embodiment described above in that the shape of the color filter is changed.
 [Configuration of image sensor]
 FIG. 14 is a cross-sectional view illustrating a configuration example of the image sensor according to the fourth embodiment of the present disclosure. The pixel 100b in the figure differs from the pixel 100b described in FIG. 6 in that it includes color filters 142b and 142c instead of the color filters 141.
 The color filters 142b and 142c are color filters configured in a shape different from that of the color filter 141. Specifically, the color filters 142b and 142c are formed with a film thickness smaller than that of the color filter 141 and correspond to the color filters disposed in the green pixel 100 and the blue pixel 100, respectively. Although not shown, a color filter 142a with a film thickness smaller than that of the color filter 141 is also disposed in the red pixel 100. The color filters 142a, 142b, and 142c are made thinner in this order. By making the film thicknesses of the color filters 142a, 142b, and 142c smaller than that of the color filter 141, the amount of incident light of the pixel 100b can be increased, and the influence of the change in the amount of incident light accompanying the increase in the film thickness of the planarization film 161 can be reduced.
 Furthermore, by making the film thickness of the color filter 142c of the blue pixel 100 smaller than those of the red pixel 100 and the green pixel 100, the amount of incident light of the blue pixel 100 can be increased. The amounts of incident light of the red pixel 100, the green pixel 100, and the blue pixel 100 can thereby be adjusted, and the level of the image signal can be corrected.
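 The gain obtained by thinning a filter can be illustrated with the Beer-Lambert law, T = exp(-alpha * d), under which transmission rises as the absorbing layer gets thinner. The absorption coefficient and thicknesses below are hypothetical, chosen only to show the trend, and real color-filter resins have wavelength-dependent absorption that this single-coefficient sketch ignores:

```python
import math

def transmittance(alpha_per_um, thickness_um):
    """Beer-Lambert law: T = exp(-alpha * d). A thinner color filter
    absorbs less, so more light reaches the photoelectric conversion unit."""
    return math.exp(-alpha_per_um * thickness_um)

ALPHA = 0.8          # hypothetical absorption coefficient of the filter resin
t_141 = transmittance(ALPHA, 0.80)   # baseline filter 141
t_142c = transmittance(ALPHA, 0.55)  # thinner blue-pixel filter 142c
gain = t_142c / t_141                # relative increase in incident light
```

 Under these illustrative values the thinner blue-pixel filter passes roughly 20 percent more light than the baseline, the kind of per-color boost this embodiment uses to offset the blue pixel's lower sensitivity.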
 The color filters 142a, 142b, and 142c preferably have rounded corners. This prevents so-called vignetting, in which the filters obstruct light incident on an adjacent pixel 100b.
 The remaining configuration of the imaging element 1 is the same as that described in the first embodiment of the present disclosure, and its description is therefore omitted.
 As described above, the imaging element 1 according to the fourth embodiment of the present disclosure adjusts the amount of light incident on the pixels 100 arranged in the vicinity of the light-shielding region by changing the shape of the color filters 142. This makes it possible to correct the change in sensitivity of the pixels 100 arranged in the vicinity of the light-shielding region.
 <5. Application example to a camera>
 The technology according to the present disclosure (the present technology) can be applied to various products. For example, the present technology may be realized as an imaging element mounted on an imaging apparatus such as a camera.
 FIG. 15 is a block diagram illustrating a schematic configuration example of a camera, which is an example of an imaging apparatus to which the present technology can be applied. The camera 1000 in the figure includes a lens 1001, an imaging element 1002, an imaging control unit 1003, a lens driving unit 1004, an image processing unit 1005, an operation input unit 1006, a frame memory 1007, a display unit 1008, and a recording unit 1009.
 The lens 1001 is the imaging lens of the camera 1000. The lens 1001 condenses light from a subject and causes it to enter the imaging element 1002 described later, thereby forming an image of the subject.
 The imaging element 1002 is a semiconductor element that captures the light from the subject condensed by the lens 1001. The imaging element 1002 generates an analog image signal corresponding to the incident light, converts it into a digital image signal, and outputs the digital image signal.
 The imaging control unit 1003 controls imaging by the imaging element 1002. The imaging control unit 1003 controls the imaging element 1002 by generating a control signal and outputting it to the imaging element 1002. The imaging control unit 1003 can also perform autofocus in the camera 1000 based on the image signal output from the imaging element 1002. Here, autofocus is a system that detects the focal position of the lens 1001 and adjusts it automatically. As this autofocus, a method that detects the focal position from an image-plane phase difference detected by phase-difference pixels arranged in the imaging element 1002 (image-plane phase-difference autofocus) can be used. Alternatively, a method that detects, as the focal position, the lens position at which the contrast of the image is highest (contrast autofocus) can be applied. The imaging control unit 1003 adjusts the position of the lens 1001 via the lens driving unit 1004 based on the detected focal position, thereby performing autofocus. Note that the imaging control unit 1003 can be configured by, for example, a DSP (Digital Signal Processor) running firmware.
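The contrast autofocus mentioned above can be illustrated with a minimal sketch. This is a simplified model, not the implementation of the imaging control unit 1003: `capture_at` stands in for reading a frame from the imaging element at a given lens position, the focus measure is a plain mean-squared-gradient metric, and the in-focus position 3 is a hypothetical value.

```python
import numpy as np

def contrast_metric(image):
    """Focus measure: mean squared gradient. Sharper frames score higher."""
    gy, gx = np.gradient(image.astype(float))
    return float(np.mean(gx ** 2 + gy ** 2))

def contrast_autofocus(capture_at, lens_positions):
    """Capture a frame at each candidate lens position and return the
    position whose frame maximizes the contrast metric."""
    return max(lens_positions, key=lambda pos: contrast_metric(capture_at(pos)))

# Toy sensor model: the same random scene, blurred more the farther the
# lens position is from the (assumed) in-focus position 3.
def fake_capture(pos, best=3):
    img = np.random.default_rng(0).random((32, 32))
    for _ in range(abs(pos - best)):  # each pass is a crude low-pass filter
        img = (img + np.roll(img, 1, 0) + np.roll(img, -1, 0)
                   + np.roll(img, 1, 1) + np.roll(img, -1, 1)) / 5.0
    return img

print(contrast_autofocus(fake_capture, range(7)))  # selects position 3
```

A real implementation would sweep the lens via the lens driving unit 1004 and typically use a hill-climbing search rather than an exhaustive scan.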
 The lens driving unit 1004 drives the lens 1001 under the control of the imaging control unit 1003. The lens driving unit 1004 can drive the lens 1001 by changing its position using a built-in motor.
 The image processing unit 1005 processes the image signal generated by the imaging element 1002. This processing includes, for example, demosaicing, which generates the image signals of the missing colors among the red, green, and blue image signals for each pixel; noise reduction, which removes noise from the image signal; and encoding of the image signal. The image processing unit 1005 can be configured by, for example, a microcomputer running firmware.
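As a concrete illustration of the demosaicing step, the following is a minimal bilinear demosaic for an RGGB Bayer pattern. It is a toy sketch under assumed conventions, not the actual algorithm of the image processing unit 1005; border pixels are handled crudely via wrap-around `np.roll`.

```python
import numpy as np

def _interp(raw, mask):
    """Average the available samples (mask == 1) in each 3x3 neighborhood."""
    num = np.zeros_like(raw)
    den = np.zeros_like(mask)
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            num += np.roll(np.roll(raw * mask, dy, 0), dx, 1)
            den += np.roll(np.roll(mask, dy, 0), dx, 1)
    return num / np.maximum(den, 1e-9)

def demosaic_rggb(raw):
    """Bilinear demosaic of an RGGB Bayer mosaic into an (H, W, 3) image."""
    raw = raw.astype(float)
    h, w = raw.shape
    r_mask = np.zeros((h, w)); g_mask = np.zeros((h, w)); b_mask = np.zeros((h, w))
    r_mask[0::2, 0::2] = 1                           # R on even rows, even cols
    g_mask[0::2, 1::2] = 1; g_mask[1::2, 0::2] = 1   # G on the two diagonals
    b_mask[1::2, 1::2] = 1                           # B on odd rows, odd cols
    return np.dstack([_interp(raw, m) for m in (r_mask, g_mask, b_mask)])

# A uniform gray mosaic should demosaic back to uniform gray.
rgb = demosaic_rggb(np.full((8, 8), 0.5))
```

Production demosaicing uses edge-aware interpolation to avoid color fringing, but the missing-color reconstruction per pixel is the same idea.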
 The operation input unit 1006 receives operation inputs from the user of the camera 1000. The operation input unit 1006 can be, for example, a push button or a touch panel. An operation input received by the operation input unit 1006 is transmitted to the imaging control unit 1003 and the image processing unit 1005. Processing corresponding to the operation input, for example, imaging of the subject, is then started.
 The frame memory 1007 is a memory that stores frames, each of which is the image signal for one screen. The frame memory 1007 is controlled by the image processing unit 1005 and holds frames during the course of image processing.
 The display unit 1008 displays the image processed by the image processing unit 1005. For example, a liquid crystal panel can be used as the display unit 1008.
 The recording unit 1009 records the image processed by the image processing unit 1005. For example, a memory card or a hard disk can be used as the recording unit 1009.
 A camera to which the present invention can be applied has been described above. Among the configurations described above, the present technology can be applied to the imaging element 1002. Specifically, the imaging element 1 described with reference to FIG. 1 can be applied to the imaging element 1002. Applying the imaging element 1 to the imaging element 1002 makes it possible to correct the change in sensitivity of the pixels 100 in the vicinity of the light-shielding region, and to prevent degradation of the image quality of images generated by the camera 1000.
 Although a camera has been described here as an example, the technology according to the present invention may also be applied to other apparatuses, for example, monitoring apparatuses.
 <6. Application example to an endoscopic surgery system>
 The technology according to the present disclosure can be applied to various products. For example, the technology according to the present disclosure may be applied to an endoscopic surgery system.
 FIG. 16 is a diagram illustrating an example of a schematic configuration of an endoscopic surgery system to which the technology according to the present disclosure can be applied.
 FIG. 16 illustrates an operator (surgeon) 11131 performing surgery on a patient 11132 on a patient bed 11133 using the endoscopic surgery system 11000. As illustrated, the endoscopic surgery system 11000 includes an endoscope 11100, other surgical instruments 11110 such as an insufflation tube 11111 and an energy treatment instrument 11112, a support arm device 11120 that supports the endoscope 11100, and a cart 11200 on which various devices for endoscopic surgery are mounted.
 The endoscope 11100 includes a lens barrel 11101, a region of a predetermined length from the distal end of which is inserted into a body cavity of the patient 11132, and a camera head 11102 connected to the proximal end of the lens barrel 11101. In the illustrated example, the endoscope 11100 is configured as a so-called rigid endoscope having a rigid lens barrel 11101, but the endoscope 11100 may instead be configured as a so-called flexible endoscope having a flexible lens barrel.
 An opening into which an objective lens is fitted is provided at the distal end of the lens barrel 11101. A light source device 11203 is connected to the endoscope 11100; light generated by the light source device 11203 is guided to the distal end of the lens barrel by a light guide extending inside the lens barrel 11101 and is emitted through the objective lens toward an observation target in the body cavity of the patient 11132. Note that the endoscope 11100 may be a forward-viewing endoscope, an oblique-viewing endoscope, or a side-viewing endoscope.
 An optical system and an imaging element are provided inside the camera head 11102, and reflected light (observation light) from the observation target is condensed onto the imaging element by the optical system. The observation light is photoelectrically converted by the imaging element, and an electric signal corresponding to the observation light, that is, an image signal corresponding to the observation image, is generated. The image signal is transmitted as RAW data to a camera control unit (CCU) 11201.
 The CCU 11201 includes a CPU (Central Processing Unit), a GPU (Graphics Processing Unit), and the like, and comprehensively controls the operations of the endoscope 11100 and a display device 11202. Furthermore, the CCU 11201 receives the image signal from the camera head 11102 and applies to it various kinds of image processing for displaying an image based on the image signal, such as development processing (demosaic processing).
 The display device 11202 displays, under the control of the CCU 11201, an image based on the image signal processed by the CCU 11201.
 The light source device 11203 includes a light source such as an LED (Light Emitting Diode) and supplies the endoscope 11100 with irradiation light for imaging the operative site or the like.
 The input device 11204 is an input interface for the endoscopic surgery system 11000. Via the input device 11204, a user can input various kinds of information and instructions to the endoscopic surgery system 11000. For example, the user inputs an instruction to change the imaging conditions of the endoscope 11100 (type of irradiation light, magnification, focal length, and the like).
 The treatment instrument control device 11205 controls the driving of the energy treatment instrument 11112 for cauterizing or incising tissue, sealing blood vessels, and the like. The insufflation device 11206 sends gas into the body cavity of the patient 11132 via the insufflation tube 11111 in order to inflate the body cavity, for the purposes of securing the field of view of the endoscope 11100 and securing working space for the operator. The recorder 11207 is a device capable of recording various kinds of information related to the surgery. The printer 11208 is a device capable of printing various kinds of information related to the surgery in various formats such as text, images, and graphs.
 Note that the light source device 11203, which supplies the endoscope 11100 with irradiation light for imaging the operative site, can be configured by, for example, a white light source composed of an LED, a laser light source, or a combination thereof. When the white light source is composed of a combination of RGB laser light sources, the output intensity and output timing of each color (each wavelength) can be controlled with high accuracy, so the white balance of the captured image can be adjusted in the light source device 11203. In this case, it is also possible to capture images corresponding to R, G, and B in a time-division manner by irradiating the observation target with laser light from each of the RGB laser light sources in a time-division manner and controlling the driving of the imaging element of the camera head 11102 in synchronization with the irradiation timing. According to this method, a color image can be obtained without providing a color filter on the imaging element.
 Further, the driving of the light source device 11203 may be controlled so that the intensity of the output light is changed at predetermined time intervals. By controlling the driving of the imaging element of the camera head 11102 in synchronization with the timing of the changes in light intensity so as to acquire images in a time-division manner, and then combining those images, a high-dynamic-range image free of so-called blocked-up shadows and blown-out highlights can be generated.
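The combining of time-divided exposures into a high-dynamic-range image can be sketched as follows. This is a simplified two-exposure merge under assumed values, not the device's actual processing: the long exposure is trusted except where it clips, and the short exposure, scaled by the exposure ratio, fills in the clipped highlights.

```python
import numpy as np

def fuse_hdr(short_exp, long_exp, ratio, sat=0.95):
    """Merge two time-divided exposures into one radiance estimate:
    use the long exposure where it is below saturation, otherwise the
    short exposure scaled up by the exposure ratio."""
    return np.where(long_exp < sat, long_exp, short_exp * ratio)

# Hypothetical scene radiance 0..2; values above 1.0 clip the long exposure.
radiance = np.linspace(0.0, 2.0, 9)
long_exp = np.clip(radiance, 0.0, 1.0)          # blown-out highlights
short_exp = np.clip(radiance / 4.0, 0.0, 1.0)   # 1/4 exposure, no clipping
recovered = fuse_hdr(short_exp, long_exp, ratio=4.0)
```

In this toy case the merged result reproduces the full radiance range, including the highlights that the long exposure alone would lose; practical pipelines blend smoothly near the saturation threshold and also weight against noise in the shadows.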
 The light source device 11203 may also be configured to be able to supply light in a predetermined wavelength band suitable for special-light observation. In special-light observation, for example, so-called narrow band imaging is performed, in which the wavelength dependence of light absorption in body tissue is exploited: by irradiating light in a narrower band than the irradiation light used during normal observation (that is, white light), predetermined tissue such as blood vessels in the mucosal surface layer is imaged with high contrast. Alternatively, in special-light observation, fluorescence observation may be performed, in which an image is obtained from fluorescence generated by irradiation with excitation light. In fluorescence observation, body tissue can be irradiated with excitation light and the fluorescence from the body tissue observed (autofluorescence observation), or a reagent such as indocyanine green (ICG) can be locally injected into body tissue and the tissue irradiated with excitation light corresponding to the fluorescence wavelength of the reagent to obtain a fluorescence image. The light source device 11203 can be configured to be able to supply narrow-band light and/or excitation light suitable for such special-light observation.
 FIG. 17 is a block diagram illustrating an example of the functional configurations of the camera head 11102 and the CCU 11201 shown in FIG. 16.
 The camera head 11102 includes a lens unit 11401, an imaging unit 11402, a driving unit 11403, a communication unit 11404, and a camera head control unit 11405. The CCU 11201 includes a communication unit 11411, an image processing unit 11412, and a control unit 11413. The camera head 11102 and the CCU 11201 are communicably connected to each other by a transmission cable 11400.
 The lens unit 11401 is an optical system provided at the connection portion with the lens barrel 11101. Observation light taken in from the distal end of the lens barrel 11101 is guided to the camera head 11102 and enters the lens unit 11401. The lens unit 11401 is configured by combining a plurality of lenses including a zoom lens and a focus lens.
 The imaging unit 11402 includes an imaging element. The imaging unit 11402 may include one imaging element (a so-called single-plate type) or a plurality of imaging elements (a so-called multi-plate type). When the imaging unit 11402 is of the multi-plate type, for example, image signals corresponding to R, G, and B may be generated by the respective imaging elements and combined to obtain a color image. Alternatively, the imaging unit 11402 may include a pair of imaging elements for acquiring right-eye and left-eye image signals for 3D (dimensional) display. 3D display enables the operator 11131 to grasp the depth of living tissue in the operative site more accurately. Note that when the imaging unit 11402 is of the multi-plate type, a plurality of lens units 11401 may be provided, one for each imaging element.
 The imaging unit 11402 does not necessarily have to be provided in the camera head 11102. For example, the imaging unit 11402 may be provided inside the lens barrel 11101, immediately behind the objective lens.
 The driving unit 11403 includes an actuator and, under the control of the camera head control unit 11405, moves the zoom lens and the focus lens of the lens unit 11401 by predetermined distances along the optical axis. The magnification and focus of the image captured by the imaging unit 11402 can thereby be adjusted as appropriate.
 The communication unit 11404 includes a communication device for transmitting and receiving various kinds of information to and from the CCU 11201. The communication unit 11404 transmits the image signal obtained from the imaging unit 11402 as RAW data to the CCU 11201 via the transmission cable 11400.
 The communication unit 11404 also receives, from the CCU 11201, a control signal for controlling the driving of the camera head 11102 and supplies it to the camera head control unit 11405. The control signal includes information on imaging conditions, for example, information specifying the frame rate of the captured image, information specifying the exposure value at the time of imaging, and/or information specifying the magnification and focus of the captured image.
 Note that the above imaging conditions such as the frame rate, exposure value, magnification, and focus may be specified as appropriate by the user, or may be set automatically by the control unit 11413 of the CCU 11201 based on the acquired image signal. In the latter case, so-called AE (Auto Exposure), AF (Auto Focus), and AWB (Auto White Balance) functions are mounted on the endoscope 11100.
 The camera head control unit 11405 controls the driving of the camera head 11102 based on the control signal from the CCU 11201 received via the communication unit 11404.
 The communication unit 11411 includes a communication device for transmitting and receiving various kinds of information to and from the camera head 11102. The communication unit 11411 receives the image signal transmitted from the camera head 11102 via the transmission cable 11400.
 The communication unit 11411 also transmits, to the camera head 11102, a control signal for controlling the driving of the camera head 11102. The image signal and the control signal can be transmitted by electrical communication, optical communication, or the like.
 The image processing unit 11412 applies various kinds of image processing to the image signal, which is RAW data transmitted from the camera head 11102.
 The control unit 11413 performs various kinds of control related to imaging of the operative site or the like by the endoscope 11100 and to display of the captured image obtained by that imaging. For example, the control unit 11413 generates the control signal for controlling the driving of the camera head 11102.
 The control unit 11413 also causes the display device 11202 to display the captured image showing the operative site or the like, based on the image signal processed by the image processing unit 11412. In doing so, the control unit 11413 may recognize various objects in the captured image using various image recognition techniques. For example, by detecting the edge shapes, colors, and the like of objects included in the captured image, the control unit 11413 can recognize surgical instruments such as forceps, specific body parts, bleeding, mist during use of the energy treatment instrument 11112, and the like. When causing the display device 11202 to display the captured image, the control unit 11413 may use the recognition results to superimpose various kinds of surgery support information on the image of the operative site. Superimposing surgery support information and presenting it to the operator 11131 makes it possible to reduce the burden on the operator 11131 and enables the operator 11131 to proceed with the surgery reliably.
 The transmission cable 11400 connecting the camera head 11102 and the CCU 11201 is an electric signal cable compatible with electric signal communication, an optical fiber compatible with optical communication, or a composite cable thereof.
 Here, in the illustrated example, communication is performed by wire using the transmission cable 11400, but the communication between the camera head 11102 and the CCU 11201 may be performed wirelessly.
 An example of an endoscopic surgery system to which the technology according to the present disclosure can be applied has been described above. Among the configurations described above, the technology according to the present disclosure can be applied to the imaging unit 11402. Specifically, the imaging element 1 of FIG. 1 can be applied to the imaging unit 11402. Applying the technology according to the present disclosure to the imaging unit 11402 makes it possible to obtain high-quality images of the operative site, enabling the operator to check the operative site reliably.
 Although an endoscopic surgery system has been described here as an example, the technology according to the present disclosure may also be applied to, for example, a microscopic surgery system or the like.
 <7. Application example to mobile objects>
 The technology according to the present disclosure can be applied to various products. For example, the technology according to the present disclosure may be realized as a device mounted on any type of mobile object such as an automobile, an electric vehicle, a hybrid electric vehicle, a motorcycle, a bicycle, a personal mobility device, an airplane, a drone, a ship, or a robot.
 FIG. 18 is a block diagram illustrating a schematic configuration example of a vehicle control system, which is an example of a mobile object control system to which the technology according to the present disclosure can be applied.
 The vehicle control system 12000 includes a plurality of electronic control units connected via a communication network 12001. In the example shown in FIG. 18, the vehicle control system 12000 includes a drive system control unit 12010, a body system control unit 12020, a vehicle exterior information detection unit 12030, a vehicle interior information detection unit 12040, and an integrated control unit 12050. As the functional configuration of the integrated control unit 12050, a microcomputer 12051, an audio/image output unit 12052, and an in-vehicle network I/F (interface) 12053 are illustrated.
 The drive system control unit 12010 controls the operation of devices related to the drive system of the vehicle in accordance with various programs. For example, the drive system control unit 12010 functions as a control device for a driving force generation device for generating the driving force of the vehicle, such as an internal combustion engine or a driving motor, a driving force transmission mechanism for transmitting the driving force to the wheels, a steering mechanism for adjusting the steering angle of the vehicle, a braking device for generating the braking force of the vehicle, and the like.
 The body system control unit 12020 controls the operation of various devices mounted on the vehicle body in accordance with various programs. For example, the body system control unit 12020 functions as a control device for a keyless entry system, a smart key system, a power window device, or various lamps such as headlamps, back lamps, brake lamps, turn signals, and fog lamps. In this case, radio waves transmitted from a portable device substituting for a key, or signals from various switches, can be input to the body system control unit 12020. The body system control unit 12020 accepts the input of these radio waves or signals and controls the door lock device, power window device, lamps, and the like of the vehicle.
 The vehicle exterior information detection unit 12030 detects information outside the vehicle on which the vehicle control system 12000 is mounted. For example, an imaging unit 12031 is connected to the vehicle exterior information detection unit 12030. The vehicle exterior information detection unit 12030 causes the imaging unit 12031 to capture an image outside the vehicle and receives the captured image. Based on the received image, the vehicle exterior information detection unit 12030 may perform processing to detect objects such as people, vehicles, obstacles, signs, and characters on the road surface, or processing to detect distances.
 The imaging unit 12031 is an optical sensor that receives light and outputs an electric signal corresponding to the amount of light received. The imaging unit 12031 can output the electric signal as an image or as distance measurement information. The light received by the imaging unit 12031 may be visible light or invisible light such as infrared light.
 The vehicle interior information detection unit 12040 detects information about the interior of the vehicle. For example, a driver state detection unit 12041 that detects the state of the driver is connected to the vehicle interior information detection unit 12040. The driver state detection unit 12041 includes, for example, a camera that images the driver, and based on the detection information input from the driver state detection unit 12041, the vehicle interior information detection unit 12040 may calculate the driver's degree of fatigue or degree of concentration, or may determine whether the driver is dozing off.
 The microcomputer 12051 can calculate control target values for the driving force generation device, the steering mechanism, or the braking device based on the information about the inside and outside of the vehicle acquired by the vehicle exterior information detection unit 12030 or the vehicle interior information detection unit 12040, and can output control commands to the drive system control unit 12010. For example, the microcomputer 12051 can perform cooperative control aimed at realizing the functions of an ADAS (Advanced Driver Assistance System), including collision avoidance or impact mitigation for the vehicle, following travel based on inter-vehicle distance, constant-speed travel, vehicle collision warning, and vehicle lane departure warning.
 The microcomputer 12051 can also perform cooperative control aimed at automated driving, in which the vehicle travels autonomously without depending on the driver's operation, by controlling the driving force generation device, the steering mechanism, the braking device, and so on based on information about the surroundings of the vehicle acquired by the vehicle exterior information detection unit 12030 or the vehicle interior information detection unit 12040.
 The microcomputer 12051 can also output control commands to the body system control unit 12020 based on the information about the outside of the vehicle acquired by the vehicle exterior information detection unit 12030. For example, the microcomputer 12051 can perform cooperative control aimed at preventing glare, such as controlling the headlamps according to the position of a preceding vehicle or an oncoming vehicle detected by the vehicle exterior information detection unit 12030 and switching from high beam to low beam.
 The audio/image output unit 12052 transmits an output signal of at least one of audio and image to an output device capable of visually or audibly notifying information to an occupant of the vehicle or to the outside of the vehicle. In the example of FIG. 18, an audio speaker 12061, a display unit 12062, and an instrument panel 12063 are illustrated as output devices. The display unit 12062 may include, for example, at least one of an on-board display and a head-up display.
 FIG. 19 is a diagram illustrating an example of the installation positions of the imaging unit 12031.
 In FIG. 19, the vehicle 12100 includes imaging units 12101, 12102, 12103, 12104, and 12105 as the imaging unit 12031.
 The imaging units 12101, 12102, 12103, 12104, and 12105 are provided, for example, at positions such as the front nose, the side mirrors, the rear bumper, the back door, and the upper part of the windshield in the vehicle interior of the vehicle 12100. The imaging unit 12101 provided on the front nose and the imaging unit 12105 provided at the upper part of the windshield in the vehicle interior mainly acquire images of the area ahead of the vehicle 12100. The imaging units 12102 and 12103 provided on the side mirrors mainly acquire images of the areas to the sides of the vehicle 12100. The imaging unit 12104 provided on the rear bumper or the back door mainly acquires images of the area behind the vehicle 12100. The forward images acquired by the imaging units 12101 and 12105 are mainly used for detecting preceding vehicles, pedestrians, obstacles, traffic lights, traffic signs, lanes, and the like.
 FIG. 19 also shows an example of the imaging ranges of the imaging units 12101 to 12104. The imaging range 12111 indicates the imaging range of the imaging unit 12101 provided on the front nose, the imaging ranges 12112 and 12113 indicate the imaging ranges of the imaging units 12102 and 12103 provided on the side mirrors, respectively, and the imaging range 12114 indicates the imaging range of the imaging unit 12104 provided on the rear bumper or the back door. For example, by superimposing the image data captured by the imaging units 12101 to 12104, an overhead image of the vehicle 12100 as viewed from above is obtained.
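By way of illustration only (this is not part of the disclosed embodiments), an overhead image of the kind mentioned above is commonly produced by projecting each camera image onto the ground plane and then compositing the projections. The sketch below assumes a given 3x3 inverse homography per camera, obtained from calibration; the function names and the simple max-compositing rule are assumptions for the sketch, not details from the patent.

```python
import numpy as np

def warp_to_ground(image, H_inv, out_shape):
    """Inverse-warp one camera image into the ground-plane (overhead) frame
    using nearest-neighbor sampling. H_inv maps overhead pixel coordinates
    back to source pixel coordinates (homogeneous 3x3), assumed known from
    camera calibration."""
    h_out, w_out = out_shape
    ys, xs = np.mgrid[0:h_out, 0:w_out]
    ones = np.ones_like(xs)
    pts = np.stack([xs, ys, ones]).reshape(3, -1).astype(float)
    src = H_inv @ pts
    src /= src[2]                     # back to inhomogeneous coordinates
    sx = np.rint(src[0]).astype(int)
    sy = np.rint(src[1]).astype(int)
    out = np.zeros(out_shape, dtype=image.dtype)
    valid = (0 <= sx) & (sx < image.shape[1]) & (0 <= sy) & (sy < image.shape[0])
    out.reshape(-1)[valid] = image[sy[valid], sx[valid]]
    return out

def overhead_composite(warped_images):
    """Superimpose the per-camera warped images; where cameras overlap,
    simply keep the brightest sample (one of many possible blending rules)."""
    return np.maximum.reduce(warped_images)
```

With the identity homography the warp reproduces the input, which makes the sampling logic easy to sanity-check before plugging in real calibration matrices.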
 At least one of the imaging units 12101 to 12104 may have a function of acquiring distance information. For example, at least one of the imaging units 12101 to 12104 may be a stereo camera composed of a plurality of imaging elements, or may be an imaging element having pixels for phase difference detection.
 For example, based on the distance information obtained from the imaging units 12101 to 12104, the microcomputer 12051 determines the distance to each three-dimensional object within the imaging ranges 12111 to 12114 and the temporal change of this distance (the relative speed with respect to the vehicle 12100), and can thereby extract, as a preceding vehicle, in particular the closest three-dimensional object on the traveling path of the vehicle 12100 that is traveling at a predetermined speed (for example, 0 km/h or more) in substantially the same direction as the vehicle 12100. Furthermore, the microcomputer 12051 can set an inter-vehicle distance to be maintained from the preceding vehicle, and can perform automatic brake control (including following stop control), automatic acceleration control (including following start control), and the like. In this way, it is possible to perform cooperative control aimed at automated driving in which the vehicle travels autonomously without depending on the driver's operation.
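The preceding-vehicle selection described above (closest object on the travel path, moving in roughly the same direction at or above a threshold speed) can be sketched as follows. The `TrackedObject` representation, the heading tolerance, and the function names are illustrative assumptions, not structures defined in this disclosure.

```python
from dataclasses import dataclass

@dataclass
class TrackedObject:
    distance_m: float          # current distance from the host vehicle
    relative_speed_kmh: float  # temporal change of distance (relative speed)
    on_travel_path: bool       # lies on the host vehicle's traveling path
    heading_deg: float         # heading relative to the host vehicle's heading

def extract_preceding_vehicle(objects, host_speed_kmh,
                              min_speed_kmh=0.0, max_heading_dev_deg=10.0):
    """Return the closest object on the travel path that moves in roughly
    the same direction at or above min_speed_kmh (e.g. 0 km/h), or None."""
    candidates = [
        o for o in objects
        if o.on_travel_path
        and abs(o.heading_deg) <= max_heading_dev_deg
        # absolute speed = host speed + relative speed (rate of gap change)
        and (host_speed_kmh + o.relative_speed_kmh) >= min_speed_kmh
    ]
    return min(candidates, key=lambda o: o.distance_m) if candidates else None
```

Once a preceding vehicle is selected, the following-distance controller would act on its `distance_m` and `relative_speed_kmh`.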
 For example, based on the distance information obtained from the imaging units 12101 to 12104, the microcomputer 12051 can classify three-dimensional object data relating to three-dimensional objects into two-wheeled vehicles, ordinary vehicles, large vehicles, pedestrians, utility poles, and other three-dimensional objects, extract them, and use them for automatic avoidance of obstacles. For example, the microcomputer 12051 distinguishes obstacles around the vehicle 12100 into obstacles that are visible to the driver of the vehicle 12100 and obstacles that are difficult for the driver to see. The microcomputer 12051 then determines a collision risk indicating the degree of risk of collision with each obstacle, and when the collision risk is equal to or higher than a set value and there is a possibility of collision, it can provide driving assistance for collision avoidance by outputting a warning to the driver via the audio speaker 12061 or the display unit 12062, or by performing forced deceleration or avoidance steering via the drive system control unit 12010.
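The warning/intervention decision in this paragraph amounts to comparing a per-obstacle risk value against set thresholds. As an illustration only, the sketch below uses inverse time-to-collision as the risk metric; the metric and the threshold values are assumptions, since the disclosure does not specify how the collision risk is computed.

```python
def collision_risk(distance_m, closing_speed_ms):
    """Illustrative risk metric: inverse time-to-collision (1/s).
    Larger means more urgent; zero when the gap is not closing."""
    if closing_speed_ms <= 0.0:
        return 0.0
    return closing_speed_ms / distance_m

def assistance_action(distance_m, closing_speed_ms,
                      warn_threshold=0.25, brake_threshold=0.5):
    """Map the risk to the actions named in the text: warn the driver via
    the speaker/display, or apply forced deceleration / avoidance steering."""
    risk = collision_risk(distance_m, closing_speed_ms)
    if risk >= brake_threshold:
        return "forced_deceleration_or_avoidance_steering"
    if risk >= warn_threshold:
        return "warn_driver"
    return "none"
```

For example, a 20 m gap closing at 8 m/s (risk 0.4) would trigger a warning, while the same closing speed at 10 m (risk 0.8) would trigger intervention.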
 At least one of the imaging units 12101 to 12104 may be an infrared camera that detects infrared light. For example, the microcomputer 12051 can recognize a pedestrian by determining whether a pedestrian is present in the images captured by the imaging units 12101 to 12104. Such pedestrian recognition is performed, for example, by a procedure of extracting feature points from the images captured by the imaging units 12101 to 12104 serving as infrared cameras, and a procedure of performing pattern matching on the series of feature points representing the contour of an object to determine whether it is a pedestrian. When the microcomputer 12051 determines that a pedestrian is present in the images captured by the imaging units 12101 to 12104 and recognizes the pedestrian, the audio/image output unit 12052 controls the display unit 12062 so that a rectangular contour line for emphasis is superimposed on the recognized pedestrian. The audio/image output unit 12052 may also control the display unit 12062 so that an icon or the like indicating the pedestrian is displayed at a desired position.
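The pattern-matching step over contour feature points can be illustrated, in a deliberately minimal form, as comparing a normalized contour point sequence against a pedestrian template. This is only the core idea; real systems resample contours and search over poses, and the normalization scheme and threshold below are assumptions for the sketch.

```python
import numpy as np

def match_contour(points, template, threshold=0.1):
    """Crude pattern-matching step: normalize both point sequences for
    position and scale, then compare the mean point-wise distance against
    a threshold. Returns True when the shapes match closely enough."""
    def normalize(p):
        p = np.asarray(p, dtype=float)
        p -= p.mean(axis=0)                    # translation invariance
        scale = np.linalg.norm(p, axis=1).max()
        return p / scale if scale > 0 else p   # scale invariance
    a, b = normalize(points), normalize(template)
    if a.shape != b.shape:
        return False
    return float(np.mean(np.linalg.norm(a - b, axis=1))) < threshold
```

A translated and scaled copy of the template matches, while a clearly different contour (for instance a straight line of points) does not.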
 An example of a vehicle control system to which the technology according to the present disclosure can be applied has been described above. Among the configurations described above, the technology according to the present disclosure can be applied to the imaging unit 12031 and the like. Specifically, the imaging element 1 of FIG. 1 can be applied to the imaging unit 12031. By applying the technology according to the present disclosure to the imaging unit 12031, a high-quality captured image can be obtained, which makes it possible to reduce driver fatigue.
 Finally, the description of each embodiment above is merely an example of the present disclosure, and the present disclosure is not limited to the embodiments described above. Accordingly, various modifications other than the embodiments described above are of course possible depending on the design and the like, as long as they do not depart from the technical idea of the present disclosure.
 The present technology can also be configured as follows.
(1) An imaging element comprising: a pixel region in which pixels are arranged, each pixel comprising a photoelectric conversion unit that performs photoelectric conversion based on incident light, a light-shielding film in which an opening is arranged and which blocks the incident light while transmitting the incident light through the opening, a color filter that transmits incident light of a predetermined wavelength out of the incident light, a planarizing film that planarizes the surface of the color filter, and an on-chip lens that is arranged adjacent to the planarizing film and condenses the incident light onto the photoelectric conversion unit through the color filter and the opening of the light-shielding film; and a light-shielded region that is adjacent to the pixel region and in which light-shielded pixels are arranged, each light-shielded pixel being the pixel comprising the light-shielding film in which the opening is not arranged, wherein the amount of light incident on the photoelectric conversion unit in the pixels in the vicinity of the light-shielded region is adjusted.
(2) The imaging element according to (1), wherein the amount of light incident on the photoelectric conversion unit in a pixel in the vicinity of the light-shielded region is further adjusted according to the color filter of that pixel.
(3) The imaging element according to (1) or (2), wherein the amount of light incident on the photoelectric conversion unit in the pixels in the vicinity of the light-shielded region is adjusted by changing the shape of the light-shielding film.
(4) The imaging element according to (1) or (2), wherein the amount of light incident on the photoelectric conversion unit in the pixels in the vicinity of the light-shielded region is adjusted by changing the shape of the on-chip lens.
(5) The imaging element according to (1) or (2), wherein the amount of light incident on the photoelectric conversion unit in the pixels in the vicinity of the light-shielded region is adjusted by changing the refractive index of the on-chip lens.
(6) The imaging element according to (1) or (2), wherein the amount of light incident on the photoelectric conversion unit in the pixels in the vicinity of the light-shielded region is adjusted by changing the shape of the color filter.
(7) A method of manufacturing an imaging element, comprising: a step of forming a plurality of pixels, comprising a photoelectric conversion unit forming step of forming a photoelectric conversion unit that performs photoelectric conversion based on incident light, a first light-shielding film forming step of forming a light-shielding film in which an opening is arranged and which blocks the incident light while transmitting the incident light through the opening, a color filter forming step of forming a color filter that transmits incident light of a predetermined wavelength out of the incident light, a planarizing film forming step of forming a planarizing film that planarizes the surface of the color filter, and an on-chip lens forming step of forming an on-chip lens that is arranged adjacent to the planarizing film and condenses the incident light onto the photoelectric conversion unit through the color filter and the opening of the light-shielding film; and a step of forming light-shielded pixels, comprising the photoelectric conversion unit forming step, a second light-shielding film forming step of forming a light-shielding film in which the opening is not arranged, the color filter forming step, the planarizing film forming step, and the on-chip lens forming step, wherein the light-shielded pixel forming step forms the light-shielded pixels around the pixel region in which the plurality of pixels are arranged, and the pixel forming step further comprises a step of adjusting the amount of light incident on the photoelectric conversion unit in the pixels in the vicinity of a light-shielded region, the light-shielded region being the region in which the light-shielded pixels are formed.
DESCRIPTION OF SYMBOLS
 1 imaging element
 10 pixel array unit
 100, 100a, 100b pixel
 101 photoelectric conversion unit
 110 pixel region
 120 light-shielded region
 121 semiconductor substrate
 131 light-shielding film
 132, 132a, 132b opening
 141, 142a, 142b, 142c color filter
 151, 152a, 152b, 152c, 153a, 153b, 153c on-chip lens
 161 planarizing film
 200 light-shielded pixel
 1002 imaging element
 10402, 12031, 12101 to 12105 imaging unit

Claims (7)

  1.  An imaging element comprising:
     a pixel region in which pixels are arranged, each pixel comprising: a photoelectric conversion unit that performs photoelectric conversion based on incident light; a light-shielding film in which an opening is arranged, the light-shielding film blocking the incident light while transmitting the incident light through the opening; a color filter that transmits incident light of a predetermined wavelength out of the incident light; a planarizing film that planarizes a surface of the color filter; and an on-chip lens that is arranged adjacent to the planarizing film and condenses the incident light onto the photoelectric conversion unit through the color filter and the opening of the light-shielding film; and
     a light-shielded region that is adjacent to the pixel region and in which light-shielded pixels are arranged, each light-shielded pixel being the pixel comprising the light-shielding film in which the opening is not arranged,
     wherein an amount of light incident on the photoelectric conversion unit in a pixel in a vicinity of the light-shielded region is adjusted.
  2.  The imaging element according to claim 1, wherein the amount of light incident on the photoelectric conversion unit in the pixel in the vicinity of the light-shielded region is further adjusted according to the color filter of that pixel.
  3.  The imaging element according to claim 1, wherein the amount of light incident on the photoelectric conversion unit in the pixel in the vicinity of the light-shielded region is adjusted by changing a shape of the light-shielding film.
  4.  The imaging element according to claim 1, wherein the amount of light incident on the photoelectric conversion unit in the pixel in the vicinity of the light-shielded region is adjusted by changing a shape of the on-chip lens.
  5.  The imaging element according to claim 1, wherein the amount of light incident on the photoelectric conversion unit in the pixel in the vicinity of the light-shielded region is adjusted by changing a refractive index of the on-chip lens.
  6.  The imaging element according to claim 1, wherein the amount of light incident on the photoelectric conversion unit in the pixel in the vicinity of the light-shielded region is adjusted by changing a shape of the color filter.
  7.  A method of manufacturing an imaging element, comprising:
     a step of forming a plurality of pixels, comprising: a photoelectric conversion unit forming step of forming a photoelectric conversion unit that performs photoelectric conversion based on incident light; a first light-shielding film forming step of forming a light-shielding film in which an opening is arranged, the light-shielding film blocking the incident light while transmitting the incident light through the opening; a color filter forming step of forming a color filter that transmits incident light of a predetermined wavelength out of the incident light; a planarizing film forming step of forming a planarizing film that planarizes a surface of the color filter; and an on-chip lens forming step of forming an on-chip lens that is arranged adjacent to the planarizing film and condenses the incident light onto the photoelectric conversion unit through the color filter and the opening of the light-shielding film; and
     a step of forming light-shielded pixels, comprising the photoelectric conversion unit forming step, a second light-shielding film forming step of forming a light-shielding film in which the opening is not arranged, the color filter forming step, the planarizing film forming step, and the on-chip lens forming step,
     wherein the light-shielded pixel forming step forms the light-shielded pixels around a pixel region in which the plurality of pixels are arranged, and
     the pixel forming step further comprises a step of adjusting an amount of light incident on the photoelectric conversion unit in a pixel in a vicinity of a light-shielded region, the light-shielded region being a region in which the light-shielded pixels are formed.
PCT/JP2019/009673 2018-04-26 2019-03-11 Image capture element and method of manufacturing image capture element WO2019207978A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018084688A JP2019192802A (en) 2018-04-26 2018-04-26 Imaging device and manufacturing method of imaging device
JP2018-084688 2018-04-26

Publications (1)

Publication Number Publication Date
WO2019207978A1

Family

ID=68293847

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/009673 WO2019207978A1 (en) 2018-04-26 2019-03-11 Image capture element and method of manufacturing image capture element

Country Status (2)

Country Link
JP (1) JP2019192802A (en)
WO (1) WO2019207978A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022118657A1 (en) * 2020-12-04 2022-06-09 ソニーセミコンダクタソリューションズ株式会社 Solid-state imaging element and electronic apparatus
WO2023068172A1 (en) * 2021-10-20 2023-04-27 ソニーセミコンダクタソリューションズ株式会社 Imaging device

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2021175048A (en) * 2020-04-22 2021-11-01 ソニーセミコンダクタソリューションズ株式会社 Electronic apparatus
WO2024048488A1 (en) * 2022-08-31 2024-03-07 ソニーセミコンダクタソリューションズ株式会社 Solid-state imaging device

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH07176708A (en) * 1993-12-21 1995-07-14 Matsushita Electron Corp Solid-state image pickup device
JP2009164247A (en) * 2007-12-28 2009-07-23 Sony Corp Solid-state imaging apparatus, method of producing the same, camera, and electronic apparatus
JP2010199668A (en) * 2009-02-23 2010-09-09 Sony Corp Solid-state imaging device and electronic apparatus
JP2011176325A (en) * 2011-03-22 2011-09-08 Sony Corp Solid-state imaging apparatus and electronic apparatus
JP2012064924A (en) * 2010-08-17 2012-03-29 Canon Inc Microlens array manufacturing method, solid state image pickup device manufacturing range, and solid state image pickup device
JP2012124377A (en) * 2010-12-09 2012-06-28 Sony Corp Solid state imaging device and method of manufacturing the same, and electronic apparatus
JP2017059589A (en) * 2015-09-14 2017-03-23 キヤノン株式会社 Solid-state image pickup element and imaging device



Also Published As

Publication number Publication date
JP2019192802A (en) 2019-10-31


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19793021

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19793021

Country of ref document: EP

Kind code of ref document: A1