WO2021111904A1 - Imaging element and imaging device - Google Patents

Imaging element and imaging device

Info

Publication number
WO2021111904A1
Authority
WO
WIPO (PCT)
Prior art keywords
pixel
photoelectric conversion
incident light
light
image pickup
Prior art date
Application number
PCT/JP2020/043422
Other languages
French (fr)
Japanese (ja)
Inventor
界斗 横地 (Kaito Yokochi)
純次 成瀬 (Junji Naruse)
Original Assignee
ソニーセミコンダクタソリューションズ株式会社 (Sony Semiconductor Solutions Corporation)
Priority date
Filing date
Publication date
Application filed by Sony Semiconductor Solutions Corporation
Publication of WO2021111904A1 publication Critical patent/WO2021111904A1/en

Classifications

    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 5/00 Optical elements other than lenses
    • G02B 5/18 Diffraction gratings
    • H ELECTRICITY
    • H01 ELECTRIC ELEMENTS
    • H01L SEMICONDUCTOR DEVICES NOT COVERED BY CLASS H10
    • H01L 27/00 Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate
    • H01L 27/14 Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate including semiconductor components sensitive to infrared radiation, light, electromagnetic radiation of shorter wavelength or corpuscular radiation and specially adapted either for the conversion of the energy of such radiation into electrical energy or for the control of electrical energy by such radiation
    • H01L 27/144 Devices controlled by radiation
    • H01L 27/146 Imager structures
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 25/00 Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N 25/70 SSIS architectures; Circuits associated therewith

Definitions

  • The present technology relates to an imaging element and an imaging device, for example, an imaging element in which incident light enters from the back surface side of a semiconductor substrate, and an imaging device using the imaging element.
  • Back-illuminated image sensors are used, in which incident light is applied to the back surface side of a semiconductor substrate on which a photoelectric conversion unit, such as a photodiode that photoelectrically converts the incident light, is formed. Since the incident light reaches the photoelectric conversion unit without passing through the wiring region formed on the front surface of the semiconductor substrate, the sensitivity can be improved.
  • As such an image pickup device, for example, one is used in which a photodiode or the like is formed in the silicon layer of an SOI (Silicon on Insulator) substrate, formed by sequentially laminating an intermediate layer and a silicon layer on a silicon substrate.
  • a wiring portion is arranged on the surface of a silicon layer on which a light receiving sensor portion such as a photodiode is formed. After the support substrate is adhered to this wiring region, the silicon substrate and the intermediate layer are removed.
  • As the silicon layer, thin-film silicon having a thickness of 10 µm or less can be used. Since a process of thinning the semiconductor substrate by grinding or the like is not required, a silicon layer having a stable thickness can be manufactured with a high yield.
  • However, incident light that is not absorbed by the semiconductor substrate reaches the wiring region, is reflected there, and may re-enter the imaging element.
  • The present technology was made in view of such a situation, and makes it possible to improve the sensitivity by also utilizing the light reflected inside the imaging element.
  • The first imaging element according to one aspect of the present technology includes: an on-chip lens that collects incident light; a photoelectric conversion unit that photoelectrically converts the incident light; a waveguide that has an opening of substantially the same size as the condensed size of the incident light and guides the incident light to the photoelectric conversion unit; a reflective film that reflects the incident light that has passed through the photoelectric conversion unit; and an uneven region having a plurality of irregularities on the side of the photoelectric conversion unit on which the incident light is incident.
  • The first imaging device according to one aspect of the present technology includes: an imaging element having an on-chip lens that collects incident light, a photoelectric conversion unit that photoelectrically converts the incident light, a waveguide that has an opening of substantially the same size as the condensed size of the incident light and guides the incident light to the photoelectric conversion unit, a reflective film that reflects the incident light that has passed through the photoelectric conversion unit, and an uneven region having a plurality of irregularities on the side of the photoelectric conversion unit on which the incident light is incident; and a processing unit that processes a signal from the imaging element.
  • The second imaging element according to one aspect of the present technology includes: an on-chip lens that collects incident light; a photoelectric conversion unit that photoelectrically converts the incident light; an opening having substantially the same size as the condensed size of the incident light; a reflective film that reflects the incident light transmitted through the photoelectric conversion unit and has an inclined surface; and an uneven region having a plurality of irregularities on the side of the photoelectric conversion unit on which the incident light is incident.
  • The second imaging device according to one aspect of the present technology includes: an imaging element having an on-chip lens that collects incident light, a photoelectric conversion unit that photoelectrically converts the incident light, an opening having substantially the same size as the condensed size of the incident light, a reflective film that reflects the incident light transmitted through the photoelectric conversion unit and has an inclined surface, and an uneven region having a plurality of irregularities on the side of the photoelectric conversion unit on which the incident light is incident; and a processing unit that processes a signal from the imaging element.
  • an on-chip lens that collects incident light, a photoelectric conversion unit that performs photoelectric conversion of the incident light, and an opening having a size substantially the same as the condensed size of the incident light.
  • a waveguide that guides the incident light to the photoelectric conversion unit
  • a reflective film that reflects the incident light that has passed through the photoelectric conversion unit
  • an uneven region that has a plurality of irregularities on the side where the incident light of the photoelectric conversion unit is incident.
  • The first imaging device according to one aspect of the present technology is provided with the first imaging element.
  • an on-chip lens that collects incident light, a photoelectric conversion unit that performs photoelectric conversion of the incident light, and an opening having a size substantially the same as the condensed size of the incident light.
  • a reflective film that reflects the incident light transmitted through the photoelectric conversion unit and has an inclined surface, and an uneven region having a plurality of irregularities on the side of the photoelectric conversion unit on which the incident light is incident are provided.
  • The second imaging device according to one aspect of the present technology is provided with the second imaging element.
  • the imaging device may be an independent device or an internal block constituting one device.
  • FIG. 1 is a diagram showing a configuration example of an imaging device according to an embodiment of the present technology.
  • The imaging device 1 in the figure includes a pixel array unit 10, a vertical drive unit 11, a column signal processing unit 12, and a control unit 13.
  • the pixel array unit 10 is configured by arranging the pixels 30 in a two-dimensional grid pattern.
  • the pixel 30 generates an image signal according to the irradiated light.
  • the pixel 30 has a photoelectric conversion unit that generates an electric charge according to the irradiated light.
  • the pixel 30 further has a pixel circuit. This pixel circuit generates an image signal based on the electric charge generated by the photoelectric conversion unit. The generation of the image signal is controlled by the control signal generated by the vertical drive unit 11.
  • a signal line 14 and a signal line 15 are arranged in an XY matrix in the pixel array unit 10.
  • the signal line 14 is a signal line that transmits a control signal of the pixel circuit in the pixel 30, is arranged for each line of the pixel array unit 10, and is commonly wired to the pixel 30 arranged in each line.
  • the signal line 15 is a signal line for transmitting an image signal generated by the pixel circuit of the pixel 30, is arranged for each row of the pixel array unit 10, and is commonly wired to the pixels 30 arranged in each row.
  • These photoelectric conversion units and pixel circuits are formed on a semiconductor substrate.
  • the vertical drive unit 11 generates a control signal for the pixel circuit of the pixel 30.
  • the vertical drive unit 11 transmits the generated control signal to the pixel 30 via the signal line 14 in the figure.
  • the column signal processing unit 12 processes the image signal generated by the pixel 30.
  • the column signal processing unit 12 processes the image signal transmitted from the pixel 30 via the signal line 15 in the figure.
  • The processing in the column signal processing unit 12 includes, for example, analog-to-digital conversion that converts the analog image signal generated in the pixel 30 into a digital image signal.
  • The image signal processed by the column signal processing unit 12 is output as an image signal of the imaging device 1.
  • The control unit 13 controls the entire imaging device 1.
  • The control unit 13 controls the imaging device 1 by generating and outputting control signals for the vertical drive unit 11 and the column signal processing unit 12.
  • The control signals generated by the control unit 13 are transmitted to the vertical drive unit 11 and the column signal processing unit 12 via the signal lines 16 and 17, respectively.
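The row-select and column-ADC flow described above can be sketched as a minimal Python model. This is an illustrative sketch only, not circuitry from the patent; the array contents, bit depth, and the `adc` helper are assumptions.

```python
# Illustrative model of the readout flow: the vertical drive unit selects
# one row at a time via signal line 14, each pixel in that row drives its
# column signal line 15, and the column signal processing unit digitizes
# every column of the selected row.

def adc(analog, full_scale=1.0, bits=10):
    """Column ADC: convert an analog pixel level to a digital code."""
    max_code = (1 << bits) - 1
    code = round(analog / full_scale * max_code)
    return max(0, min(code, max_code))

def read_frame(pixel_array, bits=10):
    """Row-by-row readout: one row-select control signal per row."""
    frame = []
    for row in pixel_array:                              # vertical drive unit 11
        frame.append([adc(v, bits=bits) for v in row])   # column ADCs (unit 12)
    return frame

# Hypothetical 2x2 array of analog levels (fractions of full scale).
print(read_frame([[0.0, 0.5], [1.0, 0.25]]))  # -> [[0, 512], [1023, 256]]
```

The model mirrors the rolling, row-at-a-time access implied by the shared row and column signal lines: pixels in one row are digitized together, rows sequentially.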
  • FIG. 2 is a cross-sectional view showing a configuration example of a pixel according to the first embodiment of the present technology.
  • the pixel 30 includes a semiconductor substrate 31, a semiconductor region 32, a wiring region 33, a front surface side reflective film 34, a back surface side reflective film 35, a protective film 36, and an on-chip lens 37.
  • the semiconductor substrate 31 is a semiconductor substrate on which the semiconductor region (diffusion region) of the elements constituting the photoelectric conversion unit and the pixel circuit described above is formed.
  • the semiconductor substrate 31 can be made of silicon (Si).
  • a semiconductor element such as a photoelectric conversion unit is arranged in a well region formed on the semiconductor substrate 31. For convenience, it is assumed that the semiconductor substrate 31 in the figure constitutes a p-type well region.
  • an n-type semiconductor region 32 constituting a photoelectric conversion unit is described as an example of an element.
  • The photodiode formed by the pn junction at the interface between the n-type semiconductor region 32 and the surrounding p-type well region corresponds to the photoelectric conversion unit. When incident light is applied, photoelectric conversion occurs, and the electric charge generated by this photoelectric conversion is accumulated in the n-type semiconductor region 32.
  • An image signal is generated by a pixel circuit (not shown) based on the accumulated charge.
  • the separation region 38 can be arranged at the boundary of the pixels 30 in the semiconductor substrate 31 in the figure.
  • the separation region 38 optically separates the pixels 30. Specifically, by arranging a film that reflects the incident light between the pixels 30 as the separation region 38, leakage of the incident light to the adjacent pixels 30 is prevented. This makes it possible to prevent crosstalk between the pixels 30.
  • the separation region 38 can be made of, for example, a metal such as tungsten (W).
  • a fixed charge film and an insulating film can be arranged between the separation region 38 and the semiconductor substrate 31.
  • the fixed charge film is a film that is arranged at the interface of the semiconductor substrate 31 and pins the surface level of the semiconductor substrate 31.
  • the insulating film is a film that is arranged between the fixed charge film and the separation region 38 to insulate the separation region 38.
  • Such a separation region 38 can be formed by forming a fixed charge film and an insulating film on the surface of the groove formed in the semiconductor substrate 31 and embedding a metal such as tungsten (W). By arranging the separation region 38 provided with such an insulating film, the pixels 30 can be electrically separated.
  • the wiring area 33 is an area that is arranged adjacent to the surface of the semiconductor substrate 31 and on which wiring for transmitting signals is formed.
  • the wiring region 33 in the figure includes a wiring layer 42 and an insulating layer 41.
  • the wiring layer 42 is a conductor that transmits a signal to the elements of the semiconductor substrate 31.
  • the wiring layer 42 can be made of a metal such as copper (Cu) or tungsten (W).
  • the insulating layer 41 insulates the wiring layer 42.
  • the insulating layer 41 can be made of, for example, silicon oxide (SiO2).
  • the wiring layer 42 and the insulating layer 41 can be configured in multiple layers.
  • the figure shows an example of wiring configured in two layers. Wiring layers 42 arranged in different layers can be connected to each other by a via plug (not shown).
  • The imaging device 1 in the figure corresponds to a back-illuminated image sensor in which the photoelectric conversion unit is irradiated with incident light from the back surface side of the semiconductor substrate 31.
  • the incident light from the subject incident on the semiconductor substrate 31 via the on-chip lens 37 and the back surface side reflective film 35, which will be described later, is absorbed by the semiconductor substrate 31 and photoelectrically converted.
  • the incident light that is not absorbed by the semiconductor substrate 31 passes through the semiconductor substrate 31 and becomes transmitted light, and is incident on the wiring region 33.
  • the surface-side reflective film 34 is arranged on the surface side of the semiconductor substrate 31 and reflects transmitted light.
  • the surface-side reflective film 34 in the figure is arranged in the wiring region 33, and is arranged adjacent to the semiconductor substrate 31 via the insulating layer 41.
  • the surface-side reflective film 34 is configured to cover the surface side of the semiconductor substrate 31 of the pixel 30.
  • By arranging the surface-side reflective film 34, the transmitted light that has passed through the semiconductor substrate 31 can be reflected back toward the semiconductor substrate 31. This increases the amount of incident light that contributes to photoelectric conversion, so the conversion efficiency of the pixel 30 can be improved.
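The benefit of this second pass can be estimated with a simple Beer-Lambert model. This is an illustrative sketch, not an analysis from the patent: the absorption coefficient and layer thickness below are assumed values, and the reflective film is treated as lossless.

```python
import math

def absorbed_fraction(alpha_per_um, thickness_um, passes=1):
    """Fraction of light absorbed after traversing the silicon layer
    `passes` times (Beer-Lambert law, ideal lossless reflector)."""
    return 1.0 - math.exp(-alpha_per_um * thickness_um * passes)

alpha = 0.05  # assumed NIR absorption coefficient for Si, 1/um
d = 3.0       # assumed thin silicon layer thickness, um

single = absorbed_fraction(alpha, d, passes=1)   # no reflective film
double = absorbed_fraction(alpha, d, passes=2)   # with surface-side film 34
print(f"single pass: {single:.3f}, double pass: {double:.3f}")
# -> single pass: 0.139, double pass: 0.259
```

For a thin layer that is weakly absorbing (e.g. near-infrared light), folding the optical path back nearly doubles the absorbed fraction, which is the conversion-efficiency gain the paragraph describes.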
  • the surface-side reflective film 34 can be made of a metal such as W or Cu. Further, the surface side reflective film 34 can be formed by the wiring layer 42. In this case, the surface-side reflective film 34 can be formed at the same time as the wiring layer 42.
  • The back surface side reflective film 35 is arranged on the back surface side of the semiconductor substrate 31; it transmits the incident light from the subject through its opening and reflects the reflected light back.
  • the back surface side reflective film 35 in the figure is arranged adjacent to the semiconductor substrate 31 via the protective film 36.
  • the back surface side reflective film 35 is provided with an opening 52 in the central portion, and the incident light collected by the on-chip lens 37 described later is transmitted through the opening 52. Further, the back surface side reflective film 35 reflects the above-mentioned reflected light again and causes it to enter the semiconductor substrate 31 to reduce leakage of the reflected light to the outside of the pixel 30.
  • the back surface side reflective film 35 can be made of a metal such as W or Cu, like the front surface side reflective film 34 and the separation region 38.
  • the back surface side reflective film 35 can be formed at the same time as the separation region 38. Specifically, when the metal used as the material of the separation region 38 is embedded in the groove formed in the semiconductor substrate 31, a material film is also formed on the back surface of the semiconductor substrate 31. By forming the opening 52 in the formed material film, the back surface side reflective film 35 can be manufactured.
  • the opening 52 can be configured to have substantially the same size as the condensed size of the incident light by the on-chip lens 37.
  • the protective film 36 is a film that insulates and protects the back surface side of the semiconductor substrate 31.
  • the protective film 36 in the figure is configured to cover the back surface side reflective film 35, and further flattens the back surface side of the semiconductor substrate 31 on which the back surface side reflective film 35 is arranged.
  • the protective film 36 can be made of, for example, SiO2.
  • the above-mentioned fixed charge film can be arranged on the portion of the protective film 36 adjacent to the surface of the semiconductor substrate 31.
  • As the fixed charge film, oxides of metals such as hafnium (Hf), aluminum (Al), and tantalum (Ta) can be used.
  • the on-chip lens 37 is a lens that is arranged for each pixel 30 and collects incident light from a subject on a photoelectric conversion unit of a semiconductor substrate 31.
  • the on-chip lens 37 is configured in a convex lens shape and collects incident light.
  • the on-chip lens 37 in the figure collects incident light on the photoelectric conversion unit through the opening 52 of the back surface side reflective film 35 described above.
  • the on-chip lens 37 can be made of, for example, an organic material such as resin or an inorganic material such as silicon nitride (SiN).
  • the incident light is focused by the on-chip lens 37, and a focal point is formed in the region of the semiconductor substrate 31.
  • the light incident on the on-chip lens 37 is gradually narrowed down from the on-chip lens 37 to the semiconductor substrate 31, and the focused size, which is the irradiation range of the incident light in the horizontal direction, is narrowed.
  • By configuring the opening 52 of the back surface side reflective film 35 to have substantially the same size as the focused size of the incident light, the incident light collected by the on-chip lens 37 can be prevented from being shielded (vignetted) by the back surface side reflective film 35. In addition, since the opening 52 is small, leakage of reflected light through the opening 52 can be reduced.
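How small the opening 52 can be made follows from how tightly an ideal lens can focus light. As a rough estimate only (the patent gives no lens figures; the wavelength and f-number below are assumptions), the diffraction-limited spot diameter is about 2.44·λ·N:

```python
def airy_spot_diameter_um(wavelength_um, f_number):
    """Diameter of the first Airy zero for an ideal lens: 2.44 * lambda * N."""
    return 2.44 * wavelength_um * f_number

# Hypothetical on-chip lens figures (not given in the patent).
spot = airy_spot_diameter_um(0.55, 1.4)   # green light, f/1.4
print(f"focused spot ~ {spot:.2f} um")    # opening 52 would be sized near this
```

An opening of roughly this order passes the focused cone without vignetting while keeping the rest of the back surface mirrored, which is the trade-off the paragraph describes.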
  • The pixel 30 constituting the imaging device 1 shown in FIG. 2 includes a substrate back surface scattering portion 51. A part of the reflected light in the pixel 30 may leak from the opening 52 of the back surface side reflective film 35; by providing the substrate back surface scattering portion 51, the reflected light that would leak from the opening 52 is scattered and can be returned to the semiconductor region 32.
  • the substrate back surface scattering portion 51 is formed on the back surface of the semiconductor substrate 31 and scatters incident light and reflected light.
  • the substrate back surface scattering portion 51 can be formed by irregularities formed on the back surface of the semiconductor substrate 31.
  • the substrate back surface scattering portion 51 is a region having a plurality of concave portions and convex portions. Since the substrate back surface scattering portion 51 is a region having irregularities, it can be configured to scatter incident light.
  • the substrate back surface scattering portion 51 in the figure is arranged in the vicinity of the opening 52 of the back surface side reflective film 35. Further, as shown in the figure, the substrate back surface scattering portion 51 is provided on the back surface side between the separation regions 38.
  • the substrate back surface scattering portion 51 can be formed, for example, by partially etching the back surface of the semiconductor substrate 31.
  • The substrate back surface scattering portion 51 can be formed by performing anisotropic etching on the back surface of the semiconductor substrate 31 to form the plurality of V-shaped recesses shown in the figure.
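Two well-known optics numbers, neither stated in the patent, help explain why such V-shaped recesses confine light: once scattered light meets the silicon/protective-film interface beyond the critical angle, it is totally internally reflected, and anisotropic etching of (100) silicon produces facets steeply inclined at about 54.7°. The refractive indices below are approximate assumptions for near-infrared light.

```python
import math

# Critical angle for total internal reflection at a Si / SiO2 interface:
# theta_c = asin(n2 / n1). Light scattered past this angle stays inside
# the semiconductor substrate.
n_si, n_sio2 = 3.5, 1.45          # approximate NIR refractive indices
theta_c = math.degrees(math.asin(n_sio2 / n_si))
print(f"critical angle ~ {theta_c:.1f} deg")      # ~24.5 deg

# Anisotropic (e.g. KOH) etching of (100) silicon exposes {111} facets
# inclined at atan(sqrt(2)) ~ 54.7 deg, well beyond the critical angle.
facet = math.degrees(math.atan(math.sqrt(2)))
print(f"facet angle ~ {facet:.1f} deg")           # ~54.7 deg
```

Under these assumed indices, light redirected by the facets easily exceeds the critical angle, so much of it cannot escape through the flat back surface and is returned toward the photoelectric conversion unit.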
  • As described above, in the pixel 30 of the imaging device 1 of the first embodiment, the substrate back surface scattering portion 51 arranged on the back surface side of the semiconductor substrate 31 scatters reflected light that would otherwise leak to the outside of the pixel 30. Thereby, the image quality can be improved.
  • With reference to FIG. 3, it will now be explained how leakage of the reflected light is reduced by reducing the opening 52, and how the reflected light is returned to the semiconductor region 32 by the substrate back surface scattering portion 51.
  • the solid arrow in FIG. 3 represents the incident light
  • the dotted arrow represents the reflected light.
  • The incident light 71 in the figure shows an example in which light that has passed through the semiconductor substrate 31 is repeatedly reflected between the front surface side reflective film 34 and the back surface side reflective film 35.
  • the reflected light undergoes photoelectric conversion as the reflection is repeated, is gradually attenuated, and is absorbed without leaking to the outside of the pixel 30.
  • In this way, the incident light can be confined inside the semiconductor substrate 31, and the sensitivity can be improved. Further, by narrowing the opening 52 of the back surface side reflective film 35, passage of light reflected by the front surface side reflective film 34 through the opening 52 can be reduced, so leakage of reflected light to the outside of the pixel 30 is reduced. If reflected light leaks out of the pixel 30 and re-enters a nearby pixel 30, it causes noise such as flare; by narrowing the opening 52, such noise can be reduced and deterioration of image quality can be prevented.
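The confinement described above can be modeled as a geometric series of Beer-Lambert passes between the two reflective films. This is an illustrative estimate, not the patent's analysis: it ignores leakage through the opening 52, and the absorption coefficient, thickness, and reflectivities are assumptions.

```python
import math

def confined_absorption(alpha_per_um, d_um, r_front, r_back):
    """Total absorbed fraction for light bouncing between the front surface
    side reflective film (reflectivity r_front) and the back surface side
    reflective film (r_back). Each traversal absorbs a Beer-Lambert
    fraction a; summing the geometric series over bounces gives
        A = a * (1 + t * r_front) / (1 - t**2 * r_front * r_back)
    where t = 1 - a is the per-pass transmission."""
    a = 1.0 - math.exp(-alpha_per_um * d_um)
    t = 1.0 - a
    return a * (1.0 + t * r_front) / (1.0 - t * t * r_front * r_back)

single = 1.0 - math.exp(-0.05 * 3.0)                   # one pass, no mirrors
total = confined_absorption(0.05, 3.0, r_front=0.9, r_back=0.9)
print(f"single pass: {single:.3f}, confined: {total:.3f}")
# -> single pass: 0.139, confined: 0.618
```

Even with imperfect (90%) reflectors, repeated bouncing raises the absorbed fraction severalfold over a single pass, matching the sensitivity improvement attributed to confining light in the substrate.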
  • the reflected light can be scattered even in the substrate back surface scattering portion 51 and returned inside the semiconductor substrate 31. Therefore, the reflected light can be confined in the semiconductor substrate 31 by the substrate back surface scattering portion 51 as well. Therefore, the sensitivity can be improved.
  • FIGS. 2 and 3 show a configuration in which the substrate back surface scattering portion 51 is provided not only in the region of the opening 52 but also below the back surface side reflective film 35 (on the semiconductor substrate 31 side). Since the back surface side reflective film 35 itself can return reflected light to the semiconductor substrate 31, the substrate back surface scattering portion 51 need not be provided under the back surface side reflective film 35. In other words, the substrate back surface scattering portion 51 may be provided only at the opening 52.
  • the back surface side reflective film 35 may have a two-layer structure as shown in FIG.
  • the pixel 30 shown in FIG. 4 includes an absorbing film 91 on the back surface side reflective film 35.
  • the absorption film 91 is arranged on the back surface of the semiconductor substrate 31 to absorb the incident light from the subject.
  • the absorption film 91 is provided with an opening 52 in the central portion, and the incident light collected by the on-chip lens 37 is transmitted through the opening 52.
  • the light incident on the portion other than the opening 52, that is, the absorption film 91 is absorbed by the absorption film 91.
  • the absorbing film 91 is also provided to absorb the reflected light and reduce the leakage of the reflected light to the outside of the pixel 30.
  • the absorption film 91 in the figure is arranged between the on-chip lens 37 and the back surface side reflection film 35, and is provided where the back surface side reflection film 35 is provided.
  • the absorption film 91 can be composed of, for example, a film in which an absorption member that absorbs incident light is dispersed.
  • a pigment that absorbs light such as carbon black or titanium oxide can be used as an absorbing member, and the absorbing film 91 can be formed by a film in which this pigment is dispersed in a resin or the like.
  • Such an absorption film 91 can be manufactured by forming a resin film in which the pigment is dispersed adjacent to the back surface side reflective film 35 and then forming the opening 52 in it.
  • the opening 52 can be formed by dry etching or wet etching using a chemical solution.
  • An absorption film 91 having a dye-based absorption member such as an infrared light absorber can also be used.
  • By arranging the absorption film 91, reflected light, or incident light that obliquely passes through the opening 52 of the back surface side reflective film 35, can be made incident on the wall surface of the absorption film 91 in the opening 52 and absorbed there.
  • In the following description, the pixel 30 provided with the absorbing film 91 will be described as an example. The pixel 30 shown in FIG. 4 is the pixel 30 according to the first embodiment, and is referred to as the pixel 30a in order to distinguish it from the pixels 30 according to the other embodiments.
  • FIG. 5 is a cross-sectional view showing a configuration example of the pixel 30b according to the second embodiment of the present technology.
  • the same parts as those of the pixel 30a according to the first embodiment shown in FIG. 4 are designated by the same reference numerals, and the description thereof will be omitted as appropriate.
  • The pixel 30b according to the second embodiment shown in FIG. 5 differs from the pixel 30a according to the first embodiment in the shape of the surface-side reflective film: it includes a surface-side reflective film 101 instead of the surface-side reflective film 34.
  • the surface-side reflective film 101 in the pixel 30b is formed in a shape (mountain shape) in which the central portion protrudes.
  • The surface-side reflective film 101 has a shape like a combination of a quadrangle and a triangle: with the quadrangle as a base, a triangle is placed on the central portion of the base.
  • Although FIG. 5 shows a case where the surface-side reflective film 101 has a shape in which a quadrangle and a triangle are combined, the shape may be a simple triangle. Further, the vertices of the triangle may be rounded.
  • the surface-side reflective film 101 may have a shape having an inclined surface on the semiconductor region 32 side.
  • In other words, the surface-side reflective film 101 is configured so that the facing sides of the semiconductor region 32 and the surface-side reflective film 101 are not parallel to each other; at least one side of the surface-side reflective film 101 is formed as a slope, and in the example shown in FIG. 5 two sides are formed as slopes.
  • FIG. 3 is a diagram showing incident light and reflected light in the pixel 30a according to the first embodiment.
  • the incident light 71 is repeatedly reflected by the front surface side reflection film 34 and the back surface side reflection film 35, and is confined inside the semiconductor substrate 31.
  • As shown by the incident light 72 in FIG. 3, the incident light includes vertically incident light (light called a 0th-order component or the like), and such light may be reflected by the surface-side reflective film 34 and come out of the opening 52.
  • Since the pixel 30a shown in FIG. 3 includes the substrate back surface scattering portion 51, both the incident light and the reflected light can be scattered, which reduces the amount of light that is reflected by the surface-side reflective film 34 and escapes from the opening 52. To reduce this amount further, the surface-side reflective film is formed in the shape of the surface-side reflective film 101 shown in FIG. 5.
  • the surface-side reflective film 101 included in the pixel 30b shown in FIG. 5 is formed in a triangular shape as described above.
  • An example of incident light and reflected light is indicated by an arrow in the pixel 30b shown on the left side of FIG.
  • As indicated by the arrows, the incident light hits one slope of the triangular surface-side reflective film 101, is reflected, and travels toward the side surface of the semiconductor region 32. Light can thus be prevented from being reflected by the surface-side reflective film 101 and exiting through the opening 52.
  • As a result, the amount of light returned to the semiconductor region 32 can be increased, and the sensitivity of the pixel 30b can be improved.
  • FIG. 5 shows an example in which the surface-side reflective film 101 is arranged near the center of the pixel 30b, but this position is an example.
  • the surface-side reflective film 101 is arranged at a position avoiding a region where a gate of a transfer transistor (not shown) for reading the electric charge accumulated in the semiconductor region 32 is formed, and is displaced from the vicinity of the center of the pixel 30b. It may be arranged in. The same applies to other embodiments.
  • FIG. 6 is a cross-sectional view showing a configuration example of the pixel 30c according to the third embodiment of the present technology.
  • the same parts as those of the pixel 30b according to the second embodiment shown in FIG. 5 are designated by the same reference numerals, and the description thereof will be omitted as appropriate.
  • The pixel 30c according to the third embodiment shown in FIG. 6 differs from the pixel 30b according to the second embodiment in that a reflecting portion 111 is further provided in the semiconductor region 32, and is similar in other respects.
  • the reflection unit 111 is provided near the center of the pixel 30c.
  • The reflecting portion 111 is formed in a rod shape in cross section, extending from the substrate back surface scattering portion 51 toward the front surface side reflective film 34.
  • the reflecting portion 111 may be formed as a surface having a predetermined thickness in the semiconductor region 32, or may have a shape such as a cylinder or a polygonal prism.
  • The incident light is reflected by the surface-side reflective film 101, is reflected again at the side surface of the semiconductor region 32, travels toward the reflecting portion 111, and is further reflected by the reflecting portion 111.
  • The light reflected by the reflecting portion 111 is further scattered by the substrate back surface scattering portion 51 or reflected by the back surface side reflective film 35, and is returned to the semiconductor region 32. Therefore, light reflected by the surface-side reflective film 101 can be prevented from escaping through the opening 52.
  • By providing a slope on at least one side of the surface-side reflective film 101 so that the incident light strikes and is reflected by the slope, and by further providing the reflecting portion 111 in the semiconductor region 32, the amount of light that can be returned can be increased, and the sensitivity of the pixel 30c can be improved.
  • FIG. 7 is a cross-sectional view showing a configuration example of the pixel 30d according to the fourth embodiment of the present technology.
  • the same parts as those of the pixel 30b according to the second embodiment shown in FIG. 5 are designated by the same reference numerals, and the description thereof will be omitted as appropriate.
  • The pixel 30d according to the fourth embodiment shown in FIG. 7 differs from the pixel 30b according to the second embodiment in that it further includes a reflecting portion 112 and a reflecting portion 113 in the semiconductor region 32; it is the same in other respects.
  • The pixel 30c according to the third embodiment shown in FIG. 6 includes one reflecting portion 111, whereas the pixel 30d according to the fourth embodiment shown in FIG. 7 includes two, the reflecting portion 112 and the reflecting portion 113. In this way, a plurality of reflecting portions may be provided in the semiconductor region 32.
  • the pixel 30d shown in FIG. 7 includes a reflecting portion 112 on the left side of the opening 52 in the drawing, and a reflecting portion 113 on the right side of the opening 52 in the drawing.
  • the reflecting portion 112 and the reflecting portion 113 are arranged at positions where the opening region of the opening 52 is not reduced.
  • The reflecting portion 112 and the reflecting portion 113 can each have the same configuration as the reflecting portion 111 (FIG. 6). That is, each is formed in a rod shape in cross section, extending from the substrate back surface scattering portion 51 toward the front surface side reflective film 34, and may be formed as a surface having a predetermined thickness in the semiconductor region 32, or may be shaped as a cylinder or a polygonal prism.
  • The incident light is reflected by the slope of the surface-side reflective film 101, is reflected again at the side surface of the semiconductor region 32, travels toward the reflecting portion 112, and is further reflected by the reflecting portion 112.
  • The light reflected by the reflecting portion 112 is further scattered by the substrate back surface scattering portion 51 or reflected by the back surface side reflective film 35, and is returned to the semiconductor region 32. Therefore, light reflected by the surface-side reflective film 101 can be prevented from escaping through the opening 52.
  • the light can be confined in the region between the reflecting portion 112 and the side surface of the semiconductor region 32.
  • the reflecting portion 113 side also has a configuration in which light can be confined in the region between the reflecting portion 113 and the side surface of the semiconductor region 32.
  • By providing a slope on at least one side of the surface-side reflective film 101 so that the incident light strikes and is reflected by the slope, and by further providing the reflecting portions 112 and 113, the amount of light that can be returned to the semiconductor region 32 can be increased, and the sensitivity of the pixel 30d can be improved.
  • FIG. 8 is a cross-sectional view showing a configuration example of the pixel 30e according to the fifth embodiment of the present technology.
  • the same parts as those of the pixel 30a according to the first embodiment shown in FIG. 2 are designated by the same reference numerals, and the description thereof will be omitted as appropriate.
  • The pixel 30e according to the fifth embodiment shown in FIG. 8 differs from the pixel 30a according to the first embodiment in that it includes a scattering portion on the front surface side of the semiconductor region 32; the other points are similar.
  • the pixel 30e according to the fifth embodiment shown in FIG. 8 includes a substrate back surface scattering portion 51 on the incident surface side and a substrate surface scattering portion 131 on the wiring region 33 side. That is, the pixel 30e is configured to have scattering portions above and below the semiconductor region 32, respectively.
  • the substrate surface scattering portion 131 is formed as a region having irregularities, similarly to the substrate back surface scattering portion 51.
  • The incident light is scattered by the substrate back surface scattering portion 51 and enters the semiconductor region 32. Of the light incident on the semiconductor region 32, the light that reaches the substrate surface scattering portion 131 is scattered by the substrate surface scattering portion 131 and returned to the semiconductor region 32.
  • the structure can prevent (reduce) the reflected light from escaping from the opening 52.
  • The amount of light that can be retained in the semiconductor region 32 can thus be increased, and the sensitivity of the pixel 30e can be improved.
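The sensitivity gain from scattering at both faces follows from the longer optical path inside the absorber. The Beer-Lambert sketch below is illustrative only; the absorption coefficient and path lengths are assumed round numbers (roughly the order of silicon in the near-infrared), not values from the patent.

```python
import math

def absorbed_fraction(alpha_per_um, path_um):
    """Beer-Lambert law: fraction of light absorbed over a given optical path."""
    return 1.0 - math.exp(-alpha_per_um * path_um)

# Illustrative absorption coefficient (assumed, order of magnitude for
# silicon in the near-infrared): 0.01 per micrometre.
alpha = 0.01

single_pass = absorbed_fraction(alpha, 3.0)  # one traversal of a 3 um region
multi_pass = absorbed_fraction(alpha, 9.0)   # three traversals via scattering
```

Tripling the effective path nearly triples the absorbed fraction in this weakly absorbing regime, which is the mechanism behind the sensitivity improvement attributed to the scattering portions.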
  • FIG. 8 shows an example in which the substrate surface scattering portion 131 is formed from one side surface of the semiconductor region 32 to the other (that is, between the separation regions 38), but this can be changed according to the pixel configuration; for example, the scattering portion may be omitted in the region where the gate of a transfer transistor (not shown) for reading the charge accumulated in the semiconductor region 32 is formed. The same applies to the other embodiments.
  • FIG. 9 is a cross-sectional view showing a configuration example of the pixel 30f according to the sixth embodiment of the present technology.
  • The pixel 30f according to the sixth embodiment shown in FIG. 9 combines the configuration of the pixel 30b according to the second embodiment shown in FIG. 5 with that of the pixel 30e according to the fifth embodiment shown in FIG. 8. That is, the pixel 30f according to the sixth embodiment shown in FIG. 9 includes the surface-side reflective film 101 and, on the front surface side, the substrate surface scattering portion 131.
  • Since the pixel 30f according to the sixth embodiment includes the surface-side reflective film 101 having a slope like the pixel 30b according to the second embodiment, the escape of the 0th-order reflected component through the opening 52 can be reduced. Further, since the pixel 30f according to the sixth embodiment includes the substrate surface scattering portion 131 like the pixel 30e according to the fifth embodiment, light reaching the front surface side of the semiconductor region 32 is scattered within the semiconductor region 32 and returned into it.
  • By providing the surface-side reflective film 101 and the substrate surface scattering portion 131 in this way, the possibility that reflected light escapes through the opening 52 can be reduced, and the amount of light that can be retained in the semiconductor region 32 can be increased. Therefore, the sensitivity of the pixel 30f can be improved.
  • FIG. 10 is a cross-sectional view showing a configuration example of the pixel 30g according to the seventh embodiment of the present technology.
  • the pixel 30g according to the seventh embodiment has a configuration in which the pixel 30d according to the fourth embodiment and the pixel 30f according to the sixth embodiment are combined. That is, the pixel 30g according to the seventh embodiment includes the reflection unit 112 and the reflection unit 113, like the pixel 30d according to the fourth embodiment. Further, the pixel 30g according to the seventh embodiment includes the substrate surface scattering portion 131 like the pixel 30f according to the sixth embodiment.
  • The surface-side reflective film 101 has a shape including a triangle, and the reflecting portion 114 is provided at the apex of the triangle.
  • the reflecting portion 114 is formed in the same shape as the reflecting portion 112 in the direction extending from the surface side reflecting film 101 toward the semiconductor region 32 side (from the lower side to the upper side in the drawing).
  • By providing the reflecting portions 112, 113, and 114 in this way, the amount of light that can be returned to (confined in) the semiconductor region 32 can be increased. Further, by providing the substrate surface scattering portion 131, the amount of light that can be retained in the semiconductor region 32 can be increased. Therefore, the sensitivity of the pixel 30g can be improved.
  • FIG. 11 is a cross-sectional view showing a configuration example of the pixel 30h according to the eighth embodiment of the present technology.
  • the pixel 30h according to the eighth embodiment is provided with a substrate surface scattering portion 131 and a reflecting portion 114 on the surface side (wiring region 33 side), similarly to the pixel 30g according to the seventh embodiment.
  • FIG. 11 shows a configuration in which the reflecting portion 114 is provided on the rectangular surface-side reflective film 34, but the reflecting portion 114 may instead be provided on the surface-side reflective film 101 having a shape including a triangle.
  • The pixel 30h according to the eighth embodiment differs from the pixels 30 according to the other embodiments chiefly in the shape of the substrate back surface scattering portion 151 (the substrate back surface scattering portion 51 in the other embodiments).
  • The substrate back surface scattering portion 151 shown in FIG. 11 is not formed at the opening 52. Further, the semiconductor region 32 at the opening 52 is carved into a tapered shape.
  • the opening 52 is filled with, for example, the same material as the protective film 36.
  • a part of the semiconductor region 32 is filled with the same material as the protective film 36.
  • This part of the semiconductor region 32 is, for example, a region located on the opening 52 side and having a triangular cross section, and is referred to here as the light refraction structure portion 152.
  • The light refraction structure portion 152 can also be described as a large recess: the inside of the recess is filled with the same material as the protective film 36, and the outside of the recess is the semiconductor region 32.
  • The incident light is refracted at the boundary between the light refraction structure portion 152 and the semiconductor region 32 and, in the example shown in FIG. 11, proceeds toward the side surface of the semiconductor region 32.
  • By providing the light refraction structure portion 152 in this way, the incident light can be refracted.
  • As a result, the incident light that directly reaches the surface-side reflective film 34 is reduced, and the 0th-order reflected component can be reduced. Therefore, the light that escapes through the opening 52 can be reduced.
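The refraction at the boundary between the filled recess and the silicon follows Snell's law, n1·sin(θ1) = n2·sin(θ2). The sketch below is illustrative only; the refractive indices (about 1.5 for a protective-film-like fill and about 3.6 for silicon in the near-infrared) are assumed values, not taken from the patent. Because the interface of the triangular recess is inclined, even a modest change of angle at the boundary redirects near-vertical rays sideways, away from a direct path to the surface-side reflective film 34.

```python
import math

def refraction_angle_deg(n1, n2, theta1_deg):
    """Snell's law: n1*sin(theta1) = n2*sin(theta2), angles measured from the
    interface normal. Returns None when total internal reflection occurs."""
    s = n1 * math.sin(math.radians(theta1_deg)) / n2
    if abs(s) > 1.0:
        return None  # total internal reflection: no transmitted ray
    return math.degrees(math.asin(s))

# Assumed illustrative indices: protective-film-like fill (~1.5) into
# silicon (~3.6 in the near-infrared). A ray hitting the inclined
# boundary 30 degrees off its normal bends toward the normal:
theta_in_si = refraction_angle_deg(1.5, 3.6, 30.0)
```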
  • Further, by providing the reflecting portion 114, the light reflected (scattered) by the front surface side reflective film 34 and the substrate back surface scattering portion 51 strikes the reflecting portion 114 and can be prevented from exiting through the opening 52.
  • the structure is such that the amount of light that can be retained in the semiconductor region 32 can be increased. Therefore, the sensitivity of the pixel 30h can be improved.
  • FIG. 12 is a cross-sectional view showing a configuration example of the pixel 30i according to the ninth embodiment of the present technology.
  • The pixel 30i according to the ninth embodiment differs from the pixel 30h according to the eighth embodiment shown in FIG. 11 in the shape of the reflecting portion (the reflecting portion 114 in the eighth embodiment), and is the same in other respects.
  • the reflective portion 115 of the pixel 30i according to the ninth embodiment is formed in a T shape.
  • The horizontal bar of the reflecting portion 115 is provided directly below the opening 52 and is formed to the same width as the opening 52. Further, the side of the reflecting portion 115 near the opening 52 is formed at a position away from the triangular apex of the light refraction structure portion 152.
  • By providing the light refraction structure portion 152 and the reflecting portion 115 in this way, the incident light can be refracted, the incident light that directly reaches the surface-side reflective film 34 can be reduced, the 0th-order reflected component can be reduced, and the light that escapes through the opening 52 can be reduced. Therefore, the sensitivity of the pixel 30i can be improved.
  • FIG. 13 is a cross-sectional view showing a configuration example of the pixel 30j according to the tenth embodiment of the present technology.
  • the pixel 30j according to the tenth embodiment has a configuration in which the waveguide 201 is added to the pixel 30b according to the second embodiment shown in FIG.
  • the waveguide 201 is configured to be able to guide the incident light incident through the opening 52 to the semiconductor region 32.
  • Although the pixel 30j according to the tenth embodiment shown in FIG. 13 includes the inclined surface-side reflective film 101, it may instead include the non-inclined surface-side reflective film 34.
  • The waveguide 201 can be, for example, a core/clad type waveguide, in which the waveguide 201 corresponds to the core and the on-chip lens 37 corresponds to the clad.
  • Materials are chosen so that the difference in refractive index between the material of the waveguide 201 and the material of the on-chip lens 37 causes total internal reflection. Specifically, a material whose refractive index is higher than that of the material forming the on-chip lens 37 is used as the material forming the waveguide 201.
  • As the material of the waveguide 201, for example, SiN (silicon nitride), Ta2O5 (tantalum oxide), or TiO2 (titanium oxide) can be used.
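For a core/clad guide, light inside the higher-index core is totally reflected at the side wall whenever it strikes at more than the critical angle θc = asin(n_clad / n_core). The sketch below is illustrative only; the index values (about 2.0 for a SiN core and about 1.55 for a resin on-chip-lens material as clad) are assumed, not taken from the patent.

```python
import math

def critical_angle_deg(n_core, n_clad):
    """Critical angle for total internal reflection at a core/clad boundary,
    measured from the boundary normal. None if no TIR is possible."""
    if n_core <= n_clad:
        return None  # total internal reflection requires n_core > n_clad
    return math.degrees(math.asin(n_clad / n_core))

# Assumed illustrative indices: SiN core ~2.0, resin lens material as clad ~1.55.
theta_c = critical_angle_deg(2.0, 1.55)
# rays striking the core side wall at more than theta_c from the normal
# are totally reflected and stay guided toward the opening 52.
```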
  • The waveguide 201 is formed in a trapezoidal shape in cross section; when the upper side is on the on-chip lens 37 side and the lower side is on the opening 52 side, the upper side can be configured to have a length similar to the diameter of the on-chip lens 37, and the lower side a length similar to the width of the opening 52.
  • the size of the opening 52 is set so that the following equation (1) is satisfied, and the size of the waveguide 201 is set according to the size of the opening 52.
  • By setting the inclination angle of the surface-side reflective film 101 to an angle θ that satisfies equation (1), light that strikes the surface-side reflective film 101 is reflected toward a region other than the opening 52, in this case toward the region where the substrate back surface scattering portion 51 and the back surface side reflective film 35 are located.
  • D is the length of the portion opened as the opening 52.
  • the length D can be the length of the lower side of the waveguide 201.
  • T is the length from the opening 52 to the surface-side reflective film 101.
  • More precisely, T is defined as the length between the position of the opening 52 on the boundary line between the back surface side reflective film 35 and the absorption film 91 and the position of the apex of the triangle included in the surface-side reflective film 101.
  • the length T can be rephrased as the length of the semiconductor region 32 in the vertical direction (depth direction) and the length of the photodiode in the depth direction.
  • Based on a length D that satisfies equation (1), the width of the opening 52 and the length of the lower side of the waveguide 201 are set.
  • the equation (1) is an example, and the scope of application of the present technology is not limited to the length D based on the equation (1).
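The body of equation (1) does not survive in this excerpt, so the exact relation among D, T, and θ is not known here. The sketch below therefore assumes one plausible geometric reading of the surrounding text: a film inclined at θ deflects a vertical ray by 2θ off vertical, and the reflected ray avoids the opening when its lateral offset over the return depth T exceeds half the opening width D, i.e. D ≤ 2·T·tan(2θ). The function names and this condition are assumptions of the sketch, not the patent's equation (1).

```python
import math

def reflected_lateral_offset(theta_deg, depth_t):
    """Lateral travel of an initially vertical ray after reflection by a film
    inclined at theta_deg, over a return depth depth_t (deflection is 2*theta)."""
    return depth_t * math.tan(math.radians(2.0 * theta_deg))

def misses_opening(d, depth_t, theta_deg):
    """Assumed reading of equation (1): the reflected ray clears an opening of
    width d centred above the film when its lateral offset exceeds d/2."""
    return reflected_lateral_offset(theta_deg, depth_t) >= d / 2.0
```

Under these assumptions, for a given depth T and angle θ there is a maximum opening width D for which the 0th-order reflection is steered onto the back surface side reflective film 35 instead of escaping.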
  • the waveguide 201 is arranged as shown in FIG. 15 when the pixel 30j is viewed from above.
  • In FIG. 15, the outer quadrangle represents one pixel 30j, and the hatched inner quadrangle represents the lower surface (the surface on the opening 52 side) of the waveguide 201. Since one pixel 30j is delimited by the separation region 38, the outer quadrangle in the figure represents the separation region 38, and the description continues on the assumption that the region surrounded by the separation region 38 is the one-pixel region.
  • FIG. 15A shows a case where the quadrangle of the pixel 30j and the quadrangle of the waveguide 201 are oriented in the same direction.
  • the waveguide 201 has the same shape as the pixel 30j and is formed at a position where the sides are arranged in parallel.
  • FIG. 15B shows a case where the quadrangle of the waveguide 201 is arranged at a position inclined by 45 degrees with respect to the quadrangle of the pixel 30j.
  • the waveguide 201 has the same shape (quadrangle) as the pixel 30j, but is arranged at a position where the sides intersect at 45 degrees.
  • In the case of FIG. 15A, as indicated by the arrow in the figure, light incident on one side of the waveguide 201 strikes the side of the pixel 30j (the side surface of the semiconductor region 32) arranged parallel to that side. The light that strikes the side of the pixel 30j is then reflected by the side surface of the semiconductor region 32 and returned to the inside of the semiconductor region 32.
  • In the case of FIG. 15B, as indicated by the arrow in the figure, light incident on one side of the waveguide 201 strikes the side of the pixel 30j (the side surface of the semiconductor region 32) arranged at a position intersecting that side at 45 degrees, and is likewise returned to the semiconductor region 32.
  • The position of the waveguide 201 with respect to the pixel 30j may be other than the examples shown in FIG. 15. Further, the shape of the waveguide 201 viewed from above does not have to be quadrangular; it may be a polygon other than a quadrangle, or circular.
  • In the following, the case where the waveguide 201 is used is described as an example, but a member having the same function as the waveguide may be used instead of the waveguide 201.
  • a diffraction grating may be used instead of the waveguide 201.
  • the diffraction grating is composed of a member having a periodic structure, for example, using a material having a refractive index lower than that of the on-chip lens 37.
  • a filter called a plasmon filter or the like may be used as the diffraction grating.
  • The plasmon filter is a filter that uses plasmon resonance.
  • the plasmon filter is used as a filter that transmits light of a predetermined wavelength.
  • the plasmon filter is sometimes used as a color filter.
  • Although a color filter is not shown, a pixel 30 provided with a color filter can of course be used.
  • a diffraction grating-shaped filter such as a plasmon filter that selectively transmits light having a predetermined frequency can also be used.
  • A pixel 30 in which the diffraction grating used as such a color filter is provided as the waveguide 201 may also be used.
  • a color filter may be provided on the waveguide 201.
  • A color filter layer 221 is provided between the on-chip lens 37 and the waveguide 201. The color filter layer 221 may be provided in this way.
  • It is also possible to form the material constituting the color filter layer 221 into the shape of the waveguide 201, thereby giving the color filter layer 221 the function of the waveguide 201, or in other words, giving the waveguide 201 the function of the color filter layer 221. That is, in the pixel 30j shown in FIG. 13, the waveguide 201 may be configured to also function as a color filter. This also applies to the embodiments described below.
  • the configuration in which the color filter layer 221 is provided can be applied to the first to ninth embodiments described above, and can also be applied to each embodiment described below.
  • By providing the waveguide 201, the incident light can be guided to the semiconductor region 32 through the opening 52, and more incident light can be made incident on the semiconductor region 32. Further, as in the above-described embodiments, the amount of reflected light emitted from the opening 52 can be reduced, and the amount of light confined in the semiconductor region 32 can be increased. Therefore, the sensitivity of the pixel 30j can be improved.
  • FIG. 17 is a cross-sectional view showing a configuration example of the pixel 30k according to the eleventh embodiment of the present technology.
  • the pixel 30k according to the eleventh embodiment has a configuration in which the waveguide 201 is added to the pixel 30c according to the third embodiment shown in FIG.
  • The waveguide 201, like the waveguide 201 of the pixel 30j according to the tenth embodiment shown in FIG. 13, is configured to guide incident light entering through the opening 52 to the semiconductor region 32.
  • the incident light guided to the opening 52 by the waveguide 201 is scattered by the substrate back surface scattering portion 51 and incident on the semiconductor region 32, and is reflected by the reflection portion 111 and stays in the semiconductor region 32. In addition, some light is reflected by the surface-side reflective film 101 and returns to the semiconductor region 32, so that the incident light can be retained in the semiconductor region 32.
  • By providing the waveguide 201 in this way, more incident light can be guided to the semiconductor region 32 through the opening 52. Further, as in the above-described embodiments, the amount of reflected light emitted from the opening 52 can be reduced, and the amount of light confined in the semiconductor region 32 can be increased. Therefore, the sensitivity of the pixel 30k can be improved.
  • FIG. 18 is a cross-sectional view showing a configuration example of the pixel 30m according to the twelfth embodiment of the present technology.
  • the pixel 30m according to the twelfth embodiment has a configuration in which the waveguide 201 is added to the pixel 30d according to the fourth embodiment shown in FIG.
  • The waveguide 201, like the waveguide 201 of the pixel 30j according to the tenth embodiment shown in FIG. 13, is configured to guide incident light entering through the opening 52 to the semiconductor region 32.
  • The incident light guided to the opening 52 by the waveguide 201 is scattered by the substrate back surface scattering portion 51 and enters the semiconductor region 32, and is reflected by the reflecting portion 112 and the reflecting portion 113 so as to stay in the semiconductor region 32. In addition, some light is reflected by the surface-side reflective film 101 and returns to the semiconductor region 32, so that the incident light can be retained in the semiconductor region 32.
  • By providing the waveguide 201 in this way, more incident light can be guided to the semiconductor region 32 through the opening 52. Further, as in the above-described embodiments, the amount of reflected light emitted from the opening 52 can be reduced, and the amount of light confined in the semiconductor region 32 can be increased. Therefore, the sensitivity of the pixel 30m can be improved.
  • FIG. 19 is a cross-sectional view showing a configuration example of the pixel 30n according to the thirteenth embodiment of the present technology.
  • the pixel 30n according to the thirteenth embodiment has a configuration in which the waveguide 201 is added to the pixel 30e according to the fifth embodiment shown in FIG.
  • The waveguide 201, like the waveguide 201 of the pixel 30j according to the tenth embodiment shown in FIG. 13, is configured to guide incident light entering through the opening 52 to the semiconductor region 32.
  • The incident light guided to the opening 52 by the waveguide 201 is scattered by the substrate back surface scattering portion 51 and enters the semiconductor region 32, and the light that reaches the front surface side is returned to the semiconductor region 32 by being scattered by the substrate surface scattering portion 131 or reflected by the surface-side reflective film 34. Therefore, the incident light can be retained in the semiconductor region 32.
  • By providing the waveguide 201 in this way, more incident light can be guided to the semiconductor region 32 through the opening 52. Further, as in the above-described embodiments, the amount of reflected light emitted from the opening 52 can be reduced, and the amount of light confined in the semiconductor region 32 can be increased. Therefore, the sensitivity of the pixel 30n can be improved.
  • FIG. 20 is a cross-sectional view showing a configuration example of the pixel 30p according to the 14th embodiment of the present technology.
  • the pixel 30p according to the fourteenth embodiment has a configuration in which the waveguide 201 is added to the pixel 30f according to the sixth embodiment shown in FIG.
  • The waveguide 201, like the waveguide 201 of the pixel 30j according to the tenth embodiment shown in FIG. 13, is configured to guide incident light entering through the opening 52 to the semiconductor region 32.
  • The incident light guided to the opening 52 by the waveguide 201 is scattered by the substrate back surface scattering portion 51 and enters the semiconductor region 32, and the light that reaches the front surface side is returned to the semiconductor region 32 by being scattered by the substrate surface scattering portion 131 or reflected by the surface-side reflective film 101. Therefore, the incident light can be retained in the semiconductor region 32.
  • By providing the waveguide 201 in this way, more incident light can be guided to the semiconductor region 32 through the opening 52. Further, as in the above-described embodiments, the amount of reflected light emitted from the opening 52 can be reduced, and the amount of light confined in the semiconductor region 32 can be increased. Therefore, the sensitivity of the pixel 30p can be improved.
  • FIG. 21 is a cross-sectional view showing a configuration example of the pixel 30q according to the fifteenth embodiment of the present technology.
  • the pixel 30q according to the fifteenth embodiment has a configuration in which the waveguide 201 is added to the pixel 30g according to the seventh embodiment shown in FIG.
  • The waveguide 201, like the waveguide 201 of the pixel 30j according to the tenth embodiment shown in FIG. 13, is configured to guide incident light entering through the opening 52 to the semiconductor region 32.
  • The incident light guided to the opening 52 by the waveguide 201 is scattered by the substrate back surface scattering portion 51 and enters the semiconductor region 32, and the light that reaches the front surface side is returned to the semiconductor region 32 by being scattered by the substrate surface scattering portion 131 or reflected by the surface-side reflective film 101.
  • the light incident or reflected in the semiconductor region 32 is reflected by the reflecting portion 112, the reflecting portion 113, and the reflecting portion 114, and stays in the semiconductor region 32.
  • Since the surface-side reflective film 101 has a shape including a slope, the 0th-order reflected component can be reduced, and the light that escapes through the opening 52 can be reduced. As a result, the incident light can be kept longer in the semiconductor region 32.
  • By providing the waveguide 201 in this way, more incident light can be guided to the semiconductor region 32 through the opening 52. Further, as in the above-described embodiments, the amount of reflected light emitted from the opening 52 can be reduced, and the amount of light confined in the semiconductor region 32 can be increased. Therefore, the sensitivity of the pixel 30q can be improved.
  • FIG. 22 is a cross-sectional view showing a configuration example of the pixel 30r according to the 16th embodiment of the present technology.
  • the pixel 30r according to the sixteenth embodiment has a configuration in which the waveguide 201 is added to the pixel 30h according to the eighth embodiment shown in FIG.
  • The waveguide 201, like the waveguide 201 of the pixel 30j according to the tenth embodiment shown in FIG. 13, is configured to guide incident light entering through the opening 52 to the semiconductor region 32.
  • The incident light guided to the opening 52 by the waveguide 201 is refracted at the boundary between the light refraction structure portion 152 and the semiconductor region 32, and is incident on the semiconductor region 32.
  • the light incident on the semiconductor region 32 and reaching the surface side is returned to the semiconductor region 32 by being scattered by the substrate surface scattering portion 131 or reflected by the surface side reflective film 34.
  • some light is reflected by the reflecting unit 114, and the structure is such that the light easily stays in the semiconductor region 32.
  • the pixel 30r according to the sixteenth embodiment also has a structure capable of retaining the incident light in the semiconductor region 32.
  • By providing the waveguide 201 in this way, more incident light can be guided to the semiconductor region 32 through the opening 52. Further, as in the above-described embodiments, the amount of reflected light emitted from the opening 52 can be reduced, and the amount of light confined in the semiconductor region 32 can be increased. Therefore, the sensitivity of the pixel 30r can be improved.
  • FIG. 23 is a cross-sectional view showing a configuration example of the pixel 30s according to the seventeenth embodiment of the present technology.
  • the pixel 30s according to the seventeenth embodiment has a configuration in which the waveguide 201 is added to the pixel 30i according to the ninth embodiment shown in FIG.
  • The waveguide 201, like the waveguide 201 of the pixel 30j according to the tenth embodiment shown in FIG. 13, is configured to guide incident light entering through the opening 52 to the semiconductor region 32.
  • the incident light guided to the opening 52 by the waveguide 201 is refracted at the boundary portion between the light refraction structure portion 152 and the semiconductor region 32, and is incident on the semiconductor region 32. ..
  • the light incident on the semiconductor region 32 and reaching the surface side is returned to the semiconductor region 32 by being scattered by the substrate surface scattering portion 131 or reflected by the surface side reflective film 34.
  • some light is reflected by the reflecting unit 115, and the structure is such that the light easily stays in the semiconductor region 32.
  • the pixel 30s according to the seventeenth embodiment also has a structure capable of retaining the incident light in the semiconductor region 32.
  • the waveguide 201 By providing the waveguide 201 in this way, more incident light can be guided to the semiconductor region 32 through the opening 52. Further, as in the above-described embodiment, the amount of reflected light emitted from the opening 52 can be reduced, and the amount of light confined in the semiconductor region 32 can be increased. Therefore, the sensitivity of the pixels 30s can be improved.
  • FIG. 24 is a cross-sectional view showing a configuration example of the pixel 30t according to the eighteenth embodiment of the present technology.
• The pixel 30t according to the eighteenth embodiment includes a waveguide 301 corresponding to the waveguide 201 of the pixel 30j according to the tenth embodiment shown in FIG. 13, but differs in the shape of the waveguide and the position where it is formed.
• In that the waveguide 301 is configured to guide incident light entering through the opening 52 to the semiconductor region 32, it is the same as the waveguide 201 of the pixel 30j according to the tenth embodiment shown in FIG. 13.
• The waveguide 301 has a parallelogram cross section; taking the upper side as the on-chip lens 37 side and the lower side as the opening 52 side, the upper side and the lower side can each be configured to be about the same length as the width of the opening 52.
• The central position P2 of the upper side of the waveguide 301 and the central position P1 of the on-chip lens 37 are offset from each other.
• The central position P2 of the upper side of the waveguide 301 is located at a position shifted to the left of the central position P1 of the on-chip lens 37.
• The central position P3 of the lower side of the waveguide 301 and the central position P1 of the on-chip lens 37 are likewise offset from each other.
• The central position P3 of the lower side of the waveguide 301 is located at a position shifted to the right of the central position P1 of the on-chip lens 37.
• The central position of the opening 52 is also shifted to the right of the central position P1 of the on-chip lens 37.
• The waveguide 301 is thus formed in a shape inclined in an oblique direction.
• In this example, it is inclined in the diagonally upper-left direction.
• When the waveguide 301 is formed in such an inclined state, the incident light travels as shown by the arrow in the pixel 30t on the left side of FIG. 24. The incident light entering the waveguide 301 strikes the right side surface of the semiconductor region 32 in the drawing and is reflected. As shown in FIG. 24, when the waveguide 301 is inclined in the diagonally upper-left direction, more light travels toward the right side surface of the semiconductor region 32 in the drawing.
• By configuring the waveguide 301 so that light travels toward the right or left side surface of the semiconductor region 32, the amount of light that strikes the surface-side reflective film 34 perpendicularly, that is, the so-called zero-order component, can be reduced. Therefore, the amount of light that is reflected by the surface-side reflective film 34 and escapes from the opening 52 can also be reduced.
• By providing the waveguide 301 in this way, more incident light can be guided to the semiconductor region 32 through the opening 52, and the amount of reflected light escaping from the opening 52 can be reduced. Therefore, the amount of light confined in the semiconductor region 32 can be increased, and the sensitivity of the pixel 30t can be improved.
• The inclination angle of the waveguide 301 may differ depending on the position of the pixel 30t in the pixel array unit 10. For example, when the pixel array unit 10 is divided into a central region and a peripheral region, the inclination angle of the pixels 30t arranged in the central region may be larger than that of the pixels 30t arranged in the peripheral region.
• The pixel array unit 10 may also be divided into right, left, upper, and lower peripheral regions, with the direction of inclination differing according to each position.
• A configuration that does not include the waveguide 301 is also possible.
• For example, the waveguide 301 may be formed in the pixels 30t located in the central region of the pixel array unit 10 and omitted from the pixels 30t located in the peripheral region of the pixel array unit 10.
• Alternatively, the inclination angle of the waveguide 301 may be reduced from the central region toward the peripheral region, and the waveguide 301 may be omitted in regions where the inclination angle is at (or close to) 90 degrees.
  • the pixel 30t shown in FIG. 24 has a configuration in which the substrate back surface scattering portion 109 is not provided.
• The substrate back surface scattering portion 51 may also be provided.
• A surface-side reflective film 101 having a sloped shape may also be provided.
• The configuration may also include the substrate surface scattering portion 131.
• In other words, the pixel 30t according to the eighteenth embodiment can be configured in combination with the pixel 30 according to another embodiment.
• For example, the pixel 30t according to the eighteenth embodiment can be combined with the pixels 30a to 30i according to the first to ninth embodiments.
• Any of the pixels 30a to 30t according to the first to eighteenth embodiments can be applied.
  • FIG. 25 is a cross-sectional view showing a configuration example of the pixel 30u according to the 19th embodiment of the present technology.
• The pixel 30u according to the nineteenth embodiment differs from the pixel 30t according to the eighteenth embodiment shown in FIG. 24 in that a reflecting portion 311 is added; the other points are the same.
  • the reflection portion 311 is formed on the left side of the opening 52 (lower side of the waveguide 301) in the drawing.
  • the reflection unit 311 is formed at a position that does not interfere with the path of the incident light input via the waveguide 301.
• The reflecting portion 311 is formed in a rod shape in cross section, for example, like the reflecting portion 111 provided in the pixel 30c according to the third embodiment shown in FIG.
• The reflecting portion 311 may be formed as a plane having a predetermined thickness in the semiconductor region 32, or may have a shape such as a cylinder or a polygonal prism.
• In the pixel 30u, the incident light strikes the right side surface of the semiconductor region 32, is reflected, and travels toward the surface-side reflective film 34. It is then reflected by the surface-side reflective film 34, reflected by the side surface of the semiconductor region 32, strikes the reflecting portion 311, and is reflected again.
• The light reflected by the reflecting portion 311 travels back toward the side surface of the semiconductor region 32 and is reflected once more. In this way, light can be reflected multiple times between the side surface of the semiconductor region 32 and the reflecting portion 311.
• By providing the waveguide 301 in this way, more incident light can be guided to the semiconductor region 32 through the opening 52. Further, by providing the reflecting portion 311, the amount of light retained in the semiconductor region 32 can be increased, and the sensitivity of the pixel 30u can be improved.
  • FIG. 26 is a cross-sectional view showing a configuration example of the pixel 30v according to the twentieth embodiment of the present technology.
• The pixel 30v according to the twentieth embodiment differs from the pixel 30t according to the eighteenth embodiment shown in FIG. 24 in that a reflecting portion 312 and a reflecting portion 313 are added; the other points are the same.
• The reflecting portion 312 is formed in the semiconductor region 32 from the back surface side toward the front surface side (from top to bottom in the figure), and the reflecting portion 313 is formed from the front surface side toward the back surface side (from bottom to top in the figure). The reflecting portion 313 is formed at a position between the reflecting portion 312 and the side surface of the semiconductor region 32. Here, this side surface of the semiconductor region is the surface facing the side surface toward which more light is directed via the waveguide 301 (the right side surface of the semiconductor region 32 in FIG. 26).
• As shown in the pixel 30v on the left side of FIG. 26, the incident light travels toward the right side surface of the semiconductor region 32, is reflected, and proceeds toward the surface-side reflective film 34. It is then reflected by the surface-side reflective film 34, reflected by the reflecting portion 313, and reflected by the reflecting portion 312.
• The light reflected by the reflecting portion 312 is reflected by the back-surface-side reflective film 35, reflected by the side surface of the semiconductor region 32, and strikes the reflecting portion 313. In this way, light can be reflected multiple times between the side surface of the semiconductor region 32 and the reflecting portion 312, between the reflecting portion 312 and the reflecting portion 313, and between the reflecting portion 313 and the side surface of the semiconductor region 32.
• By providing the waveguide 301 in this way, more incident light can be guided to the semiconductor region 32 through the opening 52. Further, by providing the reflecting portion 312 and the reflecting portion 313, the amount of light retained in the semiconductor region 32 can be increased, and the sensitivity of the pixel 30v can be improved.
  • FIG. 27 is a cross-sectional view showing a configuration example of the pixel 30w according to the 21st embodiment of the present technology.
• The pixel 30w according to the 21st embodiment differs from the pixel 30u according to the 19th embodiment shown in FIG. 25 in that a substrate back surface scattering portion 321 and a substrate surface scattering portion 322 are added; the other points are the same.
• The substrate back surface scattering portion 321 is formed between the reflecting portion 311 and the side surface of the semiconductor region 32, in a region other than where the waveguide 301 is located. The substrate surface scattering portion 322 is on the front surface side (wiring region 33 side) and, like the substrate back surface scattering portion 321, is formed between the reflecting portion 311 (a position to which it is assumed to extend) and the side surface of the semiconductor region 32.
• With this configuration, light can be confined in the semiconductor region 32.
• By providing the waveguide 301 in this way, more incident light can be guided to the semiconductor region 32 through the opening 52. Further, by providing the reflecting portion 311, the substrate back surface scattering portion 321, and the substrate surface scattering portion 322, the incident light can be returned to the semiconductor region 32 and the amount of light retained can be increased. Therefore, the sensitivity of the pixel 30w can be improved.
• The substrate back surface scattering portion 321 and the substrate surface scattering portion 322 may also be provided between one side surface of the semiconductor region 32 and the other, for example, as in the pixel 30a according to the first embodiment shown in FIG.
  • the first to 21st embodiments can be applied in combination.
  • the pixel array unit 10 may be configured such that among the pixels 30a to 30w according to the first to 21st embodiments, the pixels 30 according to different embodiments are mixed.
• The pixels 30a to 30w according to the first to 21st embodiments described above have been described by way of example as applied to pixels for imaging a subject (hereinafter referred to as normal pixels as appropriate).
• However, the present technology is not limited to the normal pixels of an image pickup device;
• it can also be applied to pixels such as those described below.
  • FIG. 28 shows a configuration when the pixel 30j according to the tenth embodiment is applied to the distance measuring pixel.
• The distance measuring pixel shown in FIG. 28 has, for one photodiode PD, two transfer transistors TRG1 and TRG2 as transfer gates and two floating diffusion regions FD1 and FD2 as charge storage portions, and thus has a so-called two-tap pixel structure in which the charge generated by the photodiode PD is distributed to the two floating diffusion regions FD1 and FD2.
• A surface-side reflective film 101 is formed between the two transfer transistors TRG1 and TRG2. In plan view, as shown in FIG. 29, the surface-side reflective film 101 is formed near the center of the photodiode PD.
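How the two-tap structure can yield a distance can be sketched as follows. This is an illustrative sketch under stated assumptions, not part of the patent: it assumes a pulsed indirect time-of-flight scheme in which the tap connected to FD1 integrates in phase with the emitted light pulse and the tap connected to FD2 integrates in the immediately following window; the function and variable names are hypothetical.

```python
C = 299_792_458.0  # speed of light, m/s

def two_tap_distance(q1: float, q2: float, pulse_width_s: float) -> float:
    """Estimate distance from the charges collected at the two taps (FD1, FD2).

    The echo delay shifts charge out of the in-phase window (q1) into the
    following window (q2), so q2 / (q1 + q2) is proportional to the
    round-trip time of flight.
    """
    total = q1 + q2
    if total <= 0.0:
        raise ValueError("no signal collected")
    delay = pulse_width_s * (q2 / total)  # round-trip time of flight
    return C * delay / 2.0                # one-way distance

# Example: equal charge in both taps means the echo is delayed by half the
# pulse width; with a 30 ns pulse that corresponds to about 2.25 m.
print(round(two_tap_distance(q1=1000.0, q2=1000.0, pulse_width_s=30e-9), 2))  # → 2.25
```

The depth precision of such a scheme depends directly on how cleanly the photo-generated charge is collected and split between the two taps, which is why the light-confinement structures described above matter for distance measuring pixels.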
  • FIG. 29 is a plan view showing an arrangement example of the transistor of the pixel 30j shown in FIG. 28.
  • the photodiode PD is formed in the N-type semiconductor region 32 in the central region of the rectangular pixel 30j.
  • a surface-side reflective film 101 is formed in the center of the photodiode PD.
  • the transfer transistor TRG1, the switching transistor FDG1, the reset transistor RST1, the amplification transistor AMP1, and the selection transistor SEL1 are linearly arranged along a predetermined side of four sides of the rectangular pixel 30j outside the photodiode PD.
  • the transfer transistor TRG2, the switching transistor FDG2, the reset transistor RST2, the amplification transistor AMP2, and the selection transistor SEL2 are arranged linearly along the other side of the four sides of the rectangular pixel 30.
  • the charge discharge transistor OFG is arranged on a side different from the two sides of the pixel 30 on which the transfer transistor TRG, the switching transistor FDG, the reset transistor RST, the amplification transistor AMP, and the selection transistor SEL are formed.
• The arrangement of the pixel circuits shown in FIG. 29 is not limited to this example, and other arrangements may be used. Here, the configuration in which the pixel 30j according to the tenth embodiment is applied to a distance measuring pixel is shown, but such an arrangement can also be applied when another embodiment is used as the distance measuring pixel.
• FIG. 30 shows a configuration in which the pixel 30w according to the 21st embodiment is applied to a distance measuring pixel. Since the distance measuring pixel shown in FIG. 30 also performs distance measurement by the two-tap method, it has, for one photodiode PD, two transfer transistors TRG1 and TRG2 as transfer gates and two floating diffusion regions FD1 and FD2 as charge storage portions.
• In this case, the reflecting portion 311 is provided at a position avoiding the opening 52, as shown in FIG. 31.
• Here, the configuration in which the pixel 30w according to the 21st embodiment is applied to a distance measuring pixel is shown, but such an arrangement can also be applied when another embodiment is used as the distance measuring pixel.
  • FIG. 32 shows a configuration when the pixel 30j according to the tenth embodiment is applied to a normal pixel having a floating diffusion region FD.
  • the pixel 30j shown in FIG. 32 is provided with a floating diffusion region FD on the lower right side in the drawing, and is provided with a transfer gate TRG for transferring charges from the photodiode PD to the floating diffusion region FD.
• A surface-side reflective film 101 is formed with an inclination such that the floating diffusion region FD side is higher.
• The surface-side reflective film 101 having such an inclination is formed near the center of the photodiode PD in plan view, as in the case shown in FIG. 29, for example.
• The inclined surface of the surface-side reflective film 101 is formed so as not to face the floating diffusion region FD side. Therefore, as shown in FIG. 32, the incident light strikes the inclined surface of the surface-side reflective film 101 and travels toward the side surface of the semiconductor region 32 opposite to the side on which the floating diffusion region FD is formed.
• By forming the surface-side reflective film 101 so that its inclined surface does not face the floating diffusion region FD, the amount of light traveling toward the floating diffusion region FD can be reduced.
  • FIG. 33 shows a configuration when the pixel 30t according to the eighteenth embodiment of the present technology is applied to a normal pixel having a floating diffusion region FD.
  • the pixel 30t shown in FIG. 33 is provided with a floating diffusion region FD on the lower left side in the drawing, and is provided with a transfer gate TRG for transferring charges from the photodiode PD to the floating diffusion region FD.
• A reflecting portion 351 is added on the floating diffusion region FD side (near it).
• The reflecting portion 351 is formed so as to extend in the vertical direction from the surface-side reflective film 34.
• The reflecting portion 351 and the surface-side reflective film 34 may be formed integrally or separately, and they need not be in contact with each other.
• The surface-side reflective film 34 having such a reflecting portion 351 is formed near the center of the photodiode PD, for example, as in the case shown in FIG. 29 (the portion described as the surface-side reflective film 101 in FIG. 30).
  • the reflection unit 351 may be configured to surround the floating diffusion region FD.
• In the pixel 30t shown in FIG. 33, the incident light is guided by the waveguide 301 toward the right side surface of the semiconductor region 32 in the drawing, in other words, toward the side facing the side on which the floating diffusion region FD is formed. The incident light is then reflected by the side surface of the semiconductor region 32, strikes the surface-side reflective film 34, is reflected, and strikes the reflecting portion 351. Without the reflecting portion 351, this reflected light would travel to the floating diffusion region FD; by providing the reflecting portion 351, the reflected light can be prevented from reaching the floating diffusion region FD.
• FIG. 34 is a diagram showing the configuration of a pixel 30x in which the present technology is applied to a distance measuring pixel, that is, a pixel that has a memory and performs distance measurement. The pixel 30x illustrates a case where the distance measuring pixel uses the two-tap method.
  • the pixel 30x shown in FIG. 34 includes a waveguide 201 and a light refraction structure portion 152, similarly to the pixel 30r shown in FIG.
• The semiconductor substrate 31 on the wiring region 33 side of the pixel 30x is provided with a floating diffusion region FD1, a memory region Mem1, a transfer gate TRG1, and a transfer gate TRG1', as well as a floating diffusion region FD2, a memory region Mem2, a transfer gate TRG2, and a transfer gate TRG2'.
  • the electric charge from the photodiode PD is transferred to the memory area Mem1 by the transfer gate TRG1.
  • the electric charge transferred to the memory area Mem1 is transferred to the floating diffusion area FD1 by the transfer gate TRG1'.
  • the charge from the photodiode PD is transferred to the memory area Mem2 by the transfer gate TRG2.
  • the electric charge transferred to the memory area Mem2 is transferred to the floating diffusion area FD2 by the transfer gate TRG2'.
• The transfer gate TRG1 and the transfer gate TRG2 are each formed as the gate of a vertical transistor.
• The gate of the vertical transistor has vertical wiring extending in the vertical direction in the drawing and wiring provided close to the semiconductor region 32 constituting the photodiode PD.
• Although FIG. 34 shows an example in which this wiring is provided close to the semiconductor region 32, it may instead be provided in the semiconductor region 32.
• A reflecting portion 361 is formed over the memory region Mem and the floating diffusion region FD so that stray light components do not enter the memory region Mem and the floating diffusion region FD.
• The reflecting portion 361 is formed at the boundary between the semiconductor region 32 and the semiconductor substrate 31. Further, as shown in FIG. 35, in plan view the reflecting portion 361 is formed in the semiconductor region 32 in a region avoiding the gates TRG1 and TRG2 of the vertical transistors.
  • the incident light is refracted by the light refracting structure portion 152, travels toward the side surface of the semiconductor region 32, and is reflected.
  • the reflected light reflected on the side surface of the semiconductor region 32 hits the reflecting portion 361 and is reflected. Since the light is reflected by the reflection unit 361, it is possible to prevent the light from entering the memory area Mem or the floating diffusion area FD arranged under the reflection unit 361.
• FIG. 36 is a diagram showing a configuration example of a pixel 30y in which the present technology is applied to a distance measuring pixel that, like the pixel 30x shown in FIG. 34, has a memory and performs distance measurement. The pixel 30y shown in FIG. 36 also illustrates a two-tap distance measuring pixel, like the pixel 30x.
• The pixel 30y shown in FIG. 36 includes a waveguide 301 and a reflecting portion 311, like the pixel 30w shown in FIG. 27. Further, like the pixel 30x, the pixel 30y is provided, on the semiconductor substrate 31 on the wiring region 33 side, with the floating diffusion region FD1, memory region Mem1, transfer gate TRG1, and transfer gate TRG1', as well as the floating diffusion region FD2, memory region Mem2, transfer gate TRG2, and transfer gate TRG2'.
• When the waveguide 301 is provided, as in the case where the waveguide 201 is provided (as in the pixel 30x), a reflecting portion 361 is formed over the memory region Mem and the floating diffusion region FD so that stray light components do not enter them. Further, as shown in FIG. 35, in plan view the reflecting portion 361 is formed in the semiconductor region 32 in a region avoiding the gates TRG1 and TRG2 of the vertical transistors.
  • the incident light is guided to the waveguide 301, travels in the side surface direction of the semiconductor region 32, and is reflected.
  • the reflected light reflected on the side surface of the semiconductor region 32 hits the reflecting portion 361 and is reflected. Since the light is reflected by the reflection unit 361, it is possible to prevent the light from entering the memory area Mem or the floating diffusion area FD arranged under the reflection unit 361.
• The present technology can also be applied to a pixel using a SPAD (single photon avalanche diode).
• By using the waveguide, the structure can be configured so that light passes through the opening even when the height is low, so the height of the pixel can be reduced and the pixel size can be reduced.
  • the technology according to the present disclosure can be applied to various products.
  • the present technology may be realized as an image pickup device mounted on an image pickup device such as a camera.
  • FIG. 37 is a block diagram showing a schematic configuration example of a camera which is an example of an imaging device to which the present technology can be applied.
• The camera 1000 in the figure includes a lens 1001, an image pickup element 1002, an image pickup control unit 1003, a lens drive unit 1004, an image processing unit 1005, an operation input unit 1006, a frame memory 1007, a display unit 1008, and a recording unit 1009.
  • the lens 1001 is a photographing lens of the camera 1000.
  • the lens 1001 collects light from the subject and causes the light to be incident on the image pickup device 1002 described later to form an image of the subject.
  • the image sensor 1002 is a semiconductor element that captures light from a subject focused by the lens 1001.
  • the image sensor 1002 generates an analog image signal according to the irradiated light, converts it into a digital image signal, and outputs the signal.
  • the image pickup control unit 1003 controls the image pickup in the image pickup device 1002.
  • the image pickup control unit 1003 controls the image pickup device 1002 by generating a control signal and outputting the control signal to the image pickup device 1002. Further, the image pickup control unit 1003 can perform autofocus on the camera 1000 based on the image signal output from the image pickup device 1002.
• Autofocus is a system that detects the focal position of the lens 1001 and automatically adjusts it.
• As the autofocus method, image plane phase difference autofocus can be used, in which the focal position is detected from the image plane phase difference measured by phase difference pixels arranged in the image sensor 1002.
• The image pickup control unit 1003 adjusts the position of the lens 1001 via the lens drive unit 1004 based on the detected focal position, and thereby performs autofocus.
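The core of image plane phase difference detection is estimating the lateral shift between the signals of the left-aperture and right-aperture phase difference pixels; the sign and size of that shift indicate in which direction and by how much the lens is out of focus. The sketch below is only illustrative (the SAD-based search and all names are assumptions, not from this document).

```python
# Illustrative sketch of image plane phase difference detection: estimate the
# shift between left- and right-aperture phase pixel signals by minimizing the
# mean sum of absolute differences (SAD) over candidate shifts.

def phase_shift(left: list[float], right: list[float], max_shift: int = 4) -> int:
    """Return the shift (in pixels) that best aligns the two signals.

    max_shift must stay smaller than the structured part of the signal;
    otherwise flat (e.g. all-zero) edge overlaps can alias as a match.
    """
    best_shift, best_sad = 0, float("inf")
    n = len(left)
    for s in range(-max_shift, max_shift + 1):
        pairs = [(left[i], right[i + s]) for i in range(n) if 0 <= i + s < n]
        if not pairs:
            continue
        sad = sum(abs(a - b) for a, b in pairs) / len(pairs)
        if sad < best_sad:
            best_shift, best_sad = s, sad
    return best_shift

# Example: the right-aperture signal is the left-aperture signal shifted by
# 3 pixels, i.e. the image is defocused by a known amount and direction.
left = [0, 0, 1, 5, 9, 5, 1, 0, 0, 0, 0, 0]
right = [0, 0, 0, 0, 0, 1, 5, 9, 5, 1, 0, 0]
print(phase_shift(left, right))  # → 3
```

A controller like the image pickup control unit 1003 would then convert such a shift into a lens displacement command for the lens drive unit 1004.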
  • the image pickup control unit 1003 can be configured by, for example, a DSP (Digital Signal Processor) equipped with firmware.
  • the lens driving unit 1004 drives the lens 1001 based on the control of the imaging control unit 1003.
  • the lens driving unit 1004 can drive the lens 1001 by changing the position of the lens 1001 using a built-in motor.
• The image processing unit 1005 processes the image signal generated by the image sensor 1002. This processing includes, for example, demosaicing, which generates the image signal of the missing colors among the signals corresponding to red, green, and blue for each pixel, noise reduction, which removes noise from the image signal, and encoding of the image signal.
  • the image processing unit 1005 can be configured by, for example, a microcomputer equipped with firmware.
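The demosaicing step mentioned above can be sketched minimally as follows. This is an illustrative sketch, not the patent's method: it assumes an RGGB Bayer layout (even rows R G R G ..., odd rows G B G B ...) and fills in only the green channel by bilinear interpolation; all names are hypothetical.

```python
def demosaic_green(mosaic: list[list[float]]) -> list[list[float]]:
    """Fill in the missing green samples of an RGGB Bayer mosaic by
    averaging the available 4-neighbors (bilinear interpolation).

    With the assumed layout, green is present where (y + x) is odd and
    missing at the red/blue sites where (y + x) is even.
    """
    h, w = len(mosaic), len(mosaic[0])
    green = [row[:] for row in mosaic]
    for y in range(h):
        for x in range(w):
            if (y + x) % 2 == 0:  # red or blue site: green is missing here
                neighbors = [mosaic[ny][nx]
                             for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1))
                             if 0 <= ny < h and 0 <= nx < w]
                green[y][x] = sum(neighbors) / len(neighbors)
    return green

# Example: a uniform scene where every green sample reads 10; the red/blue
# sites (99) get green interpolated from their green neighbors.
mosaic = [[10 if (y + x) % 2 == 1 else 99 for x in range(4)] for y in range(4)]
print(demosaic_green(mosaic)[0][0])  # → 10.0
```

Real pipelines use edge-aware interpolation rather than this plain average, but the structure of the problem is the same.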
  • the operation input unit 1006 receives the operation input from the user of the camera 1000.
  • a push button or a touch panel can be used for the operation input unit 1006.
  • the operation input received by the operation input unit 1006 is transmitted to the image pickup control unit 1003 and the image processing unit 1005. After that, processing according to the operation input, for example, processing such as imaging of the subject is activated.
  • the frame memory 1007 is a memory that stores a frame that is an image signal for one screen.
  • the frame memory 1007 is controlled by the image processing unit 1005 and holds frames in the process of image processing.
  • the display unit 1008 displays the image processed by the image processing unit 1005.
  • a liquid crystal panel can be used.
  • the recording unit 1009 records the image processed by the image processing unit 1005.
  • a memory card or a hard disk can be used for the recording unit 1009.
• The camera to which the present disclosure can be applied has been described above.
  • the present technology can be applied to the image pickup device 1002 among the configurations described above.
  • the image pickup device 1 described with reference to FIG. 1 can be applied to the image pickup device 1002.
  • the reflected light is reduced, and it is possible to prevent deterioration of the image quality of the image generated by the camera 1000.
  • the technology according to the present disclosure may be applied to other devices such as a distance sensor.
  • the present disclosure can be applied to a semiconductor device in the form of a semiconductor module in addition to an electronic device such as a camera.
  • the technique according to the present disclosure can be applied to an image pickup module which is a semiconductor module in which the image pickup device 1002 and the image pickup control unit 1003 of FIG. 37 are enclosed in one package.
  • FIG. 38 is a diagram showing an example of a schematic configuration of an endoscopic surgery system to which the technique according to the present disclosure (the present technique) can be applied.
  • FIG. 38 shows a surgeon (doctor) 11131 performing surgery on patient 11132 on patient bed 11133 using the endoscopic surgery system 11000.
• The endoscopic surgery system 11000 includes an endoscope 11100, other surgical tools 11110 such as a pneumoperitoneum tube 11111 and an energy treatment tool 11112, a support arm device 11120 that supports the endoscope 11100, and a cart 11200 on which various devices for endoscopic surgery are mounted.
  • the endoscope 11100 is composed of a lens barrel 11101 in which a region having a predetermined length from the tip is inserted into the body cavity of the patient 11132, and a camera head 11102 connected to the base end of the lens barrel 11101.
• In the illustrated example, the endoscope 11100 is configured as a so-called rigid scope having a rigid lens barrel 11101, but the endoscope 11100 may be configured as a so-called flexible scope having a flexible lens barrel.
  • An opening in which an objective lens is fitted is provided at the tip of the lens barrel 11101.
• A light source device 11203 is connected to the endoscope 11100, and the light generated by the light source device 11203 is guided to the tip of the lens barrel by a light guide extending inside the lens barrel 11101, and is irradiated through the objective lens toward the observation target in the body cavity of the patient 11132.
• The endoscope 11100 may be a forward-viewing endoscope, an oblique-viewing endoscope, or a side-viewing endoscope.
  • An optical system and an image pickup element are provided inside the camera head 11102, and the reflected light (observation light) from the observation target is focused on the image pickup element by the optical system.
  • the observation light is photoelectrically converted by the image pickup device, and an electric signal corresponding to the observation light, that is, an image signal corresponding to the observation image is generated.
  • the image signal is transmitted as RAW data to the camera control unit (CCU: Camera Control Unit) 11201.
• The CCU 11201 is composed of a CPU (Central Processing Unit), a GPU (Graphics Processing Unit), and the like, and comprehensively controls the operations of the endoscope 11100 and the display device 11202. Further, the CCU 11201 receives an image signal from the camera head 11102 and performs various kinds of image processing on it for displaying an image based on the image signal, such as development processing (demosaic processing).
  • the display device 11202 displays an image based on the image signal processed by the CCU 11201 under the control of the CCU 11201.
  • the light source device 11203 is composed of, for example, a light source such as an LED (Light Emitting Diode), and supplies irradiation light to the endoscope 11100 when imaging the surgical site or the like.
  • the input device 11204 is an input interface for the endoscopic surgery system 11000.
  • the user can input various information and input instructions to the endoscopic surgery system 11000 via the input device 11204.
  • for example, the user inputs an instruction to change the imaging conditions of the endoscope 11100 (type of irradiation light, magnification, focal length, etc.).
  • the treatment tool control device 11205 controls the drive of the energy treatment tool 11112 for cauterizing or incising tissue, sealing blood vessels, and the like.
  • the pneumoperitoneum device 11206 sends gas into the body cavity of the patient 11132 through the pneumoperitoneum tube 11111 to inflate the body cavity, for the purpose of securing the field of view of the endoscope 11100 and securing the working space of the operator.
  • the recorder 11207 is a device capable of recording various information related to surgery.
  • the printer 11208 is a device capable of printing various information related to surgery in various formats such as text, images, and graphs.
  • the light source device 11203 that supplies the irradiation light to the endoscope 11100 when photographing the surgical site can be composed of, for example, an LED, a laser light source, or a white light source composed of a combination thereof.
  • when a white light source is configured by combining RGB laser light sources, the output intensity and output timing of each color (each wavelength) can be controlled with high accuracy, so the white balance of the captured image can be adjusted in the light source device 11203.
  • the laser light from each of the RGB laser light sources may be irradiated onto the observation target in a time-division manner, and the drive of the image sensor of the camera head 11102 may be controlled in synchronization with the irradiation timing to capture images corresponding to each of R, G, and B in a time-division manner. According to this method, a color image can be obtained without providing a color filter on the image sensor.
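The time-division color capture described here can be sketched in a few lines. The frame layout (nested lists of pixel values) and the function name are illustrative assumptions, not part of the disclosed system:

```python
def compose_color_frame(frame_r, frame_g, frame_b):
    """Combine three monochrome frames, each captured while only one of the
    R/G/B laser sources illuminates the scene, into one color image
    (per-pixel [R, G, B] triples). Color separation happens in time rather
    than in space, so no on-sensor color filter is needed."""
    return [[[r, g, b] for r, g, b in zip(row_r, row_g, row_b)]
            for row_r, row_g, row_b in zip(frame_r, frame_g, frame_b)]

# Hypothetical 2x2 monochrome captures, one per illumination color.
r = [[10, 20], [30, 40]]
g = [[50, 60], [70, 80]]
b = [[90, 100], [110, 120]]
color = compose_color_frame(r, g, b)
print(color[0][0])  # [10, 50, 90]
```

In a real system, frame alignment and motion compensation between the three captures would also be required.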
  • the drive of the light source device 11203 may be controlled so as to change the intensity of the output light at predetermined time intervals.
  • by controlling the drive of the image sensor of the camera head 11102 in synchronization with the timing of the change in light intensity to acquire images in a time-division manner and synthesizing them, a so-called high-dynamic-range image free of blocked-up shadows and blown-out highlights can be generated.
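A minimal sketch of the synthesis step, assuming 8-bit pixel values and known per-frame exposure factors (both assumptions for illustration): pixels that are blacked out or saturated in one frame are recovered from the other.

```python
def merge_hdr(frames, exposures):
    """Merge frames taken at different light intensities into one
    high-dynamic-range value per pixel: each pixel is normalized by its
    exposure factor and averaged over the frames where it is neither
    blacked out (0) nor saturated (255). A minimal sketch; real pipelines
    weight samples by a camera response curve."""
    merged = []
    for px in zip(*frames):
        valid = [v / e for v, e in zip(px, exposures) if 0 < v < 255]
        merged.append(sum(valid) / len(valid) if valid else 0.0)
    return merged

# Two hypothetical 4-pixel rows: a long exposure (saturates highlights)
# and a short one (loses shadows).
long_exp = [12, 200, 255, 255]   # exposure factor 4
short_exp = [0, 50, 64, 130]     # exposure factor 1
row = merge_hdr([long_exp, short_exp], [4, 1])
print(row)  # [3.0, 50.0, 64.0, 130.0]
```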
  • the light source device 11203 may be configured to be able to supply light in a predetermined wavelength band corresponding to special light observation.
  • in special light observation, for example, so-called narrow band imaging is performed, in which the wavelength dependence of light absorption in body tissue is utilized to irradiate light in a band narrower than that of the irradiation light used during normal observation (that is, white light), thereby imaging predetermined tissue such as blood vessels in the mucosal surface layer with high contrast.
  • fluorescence observation may be performed in which an image is obtained by fluorescence generated by irradiating with excitation light.
  • in fluorescence observation, the body tissue may be irradiated with excitation light and the fluorescence from the body tissue observed (autofluorescence observation), or a reagent such as indocyanine green (ICG) may be locally injected into the body tissue and the body tissue irradiated with excitation light corresponding to the fluorescence wavelength of the reagent to obtain a fluorescence image.
  • the light source device 11203 may be configured to be capable of supplying narrow band light and / or excitation light corresponding to such special light observation.
  • FIG. 39 is a block diagram showing an example of the functional configuration of the camera head 11102 and CCU11201 shown in FIG. 38.
  • the camera head 11102 includes a lens unit 11401, an imaging unit 11402, a driving unit 11403, a communication unit 11404, and a camera head control unit 11405.
  • the CCU 11201 includes a communication unit 11411, an image processing unit 11412, and a control unit 11413.
  • the camera head 11102 and the CCU 11201 are communicably connected to each other by a transmission cable 11400.
  • the lens unit 11401 is an optical system provided at a connection portion with the lens barrel 11101.
  • the observation light taken in from the tip of the lens barrel 11101 is guided to the camera head 11102 and incident on the lens unit 11401.
  • the lens unit 11401 is configured by combining a plurality of lenses including a zoom lens and a focus lens.
  • the image sensor constituting the image pickup unit 11402 may be one (so-called single plate type) or a plurality (so-called multi-plate type).
  • each image pickup element may generate an image signal corresponding to each of RGB, and a color image may be obtained by synthesizing them.
  • the image pickup unit 11402 may be configured to have a pair of image pickup elements for acquiring image signals for the right eye and the left eye corresponding to 3D (three-dimensional) display.
  • the 3D display enables the operator 11131 to more accurately grasp the depth of the biological tissue in the surgical site.
  • a plurality of lens units 11401 may be provided corresponding to each image pickup element.
  • the imaging unit 11402 does not necessarily have to be provided on the camera head 11102.
  • the imaging unit 11402 may be provided inside the lens barrel 11101 immediately after the objective lens.
  • the drive unit 11403 is composed of an actuator, and moves the zoom lens and focus lens of the lens unit 11401 by a predetermined distance along the optical axis under the control of the camera head control unit 11405. As a result, the magnification and focus of the image captured by the imaging unit 11402 can be adjusted as appropriate.
  • the communication unit 11404 is composed of a communication device for transmitting and receiving various information to and from the CCU 11201.
  • the communication unit 11404 transmits the image signal obtained from the image pickup unit 11402 as RAW data to the CCU 11201 via the transmission cable 11400.
  • the communication unit 11404 receives a control signal for controlling the drive of the camera head 11102 from the CCU 11201 and supplies the control signal to the camera head control unit 11405.
  • the control signal includes information about imaging conditions, such as information specifying the frame rate of the captured image, information specifying the exposure value at the time of imaging, and/or information specifying the magnification and focus of the captured image.
  • the above-mentioned imaging conditions such as the frame rate, exposure value, magnification, and focus may be appropriately specified by the user, or may be automatically set by the control unit 11413 of the CCU 11201 based on the acquired image signal.
  • in the latter case, the so-called AE (Auto Exposure) function, AF (Auto Focus) function, and AWB (Auto White Balance) function are mounted on the endoscope 11100.
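As an illustration of how such an AE function might derive a setting from the acquired image signal, here is a toy proportional update; the target luminance and gain constant are assumed values for illustration, not taken from this document:

```python
def auto_exposure_step(mean_luma, exposure, target=118.0, gain=0.5):
    """One proportional AE update: nudge the exposure value so the frame's
    mean luminance moves toward the target. A simplified stand-in for the
    AE setting the CCU could derive from the acquired image signal."""
    if mean_luma <= 0:
        return exposure * 2.0  # blacked-out frame: open up aggressively
    return exposure * (1.0 + gain * (target / mean_luma - 1.0))

e = 1.0
print(round(auto_exposure_step(59.0, e), 3))   # dark frame -> brighten (1.5)
print(round(auto_exposure_step(236.0, e), 3))  # bright frame -> darken (0.75)
```

A production AE loop would also clamp the step size and use a metered region rather than the full-frame mean.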
  • the camera head control unit 11405 controls the drive of the camera head 11102 based on the control signal from the CCU 11201 received via the communication unit 11404.
  • the communication unit 11411 is composed of a communication device for transmitting and receiving various information to and from the camera head 11102.
  • the communication unit 11411 receives an image signal transmitted from the camera head 11102 via the transmission cable 11400.
  • the communication unit 11411 transmits a control signal for controlling the drive of the camera head 11102 to the camera head 11102.
  • Image signals and control signals can be transmitted by electric communication, optical communication, or the like.
  • the image processing unit 11412 performs various image processing on the image signal which is the RAW data transmitted from the camera head 11102.
  • the control unit 11413 performs various controls related to the imaging of the surgical site and the like by the endoscope 11100 and the display of the captured image obtained by the imaging of the surgical site and the like. For example, the control unit 11413 generates a control signal for controlling the drive of the camera head 11102.
  • the control unit 11413 causes the display device 11202 to display a captured image of the surgical site or the like based on the image signal processed by the image processing unit 11412.
  • the control unit 11413 may recognize various objects in the captured image by using various image recognition techniques. For example, by detecting the shape, color, and the like of the edges of objects included in the captured image, the control unit 11413 can recognize surgical tools such as forceps, a specific biological site, bleeding, mist during use of the energy treatment tool 11112, and the like.
  • the control unit 11413 may superimpose various kinds of surgical support information on the image of the surgical site by using the recognition result. By superimposing the surgical support information and presenting it to the operator 11131, it is possible to reduce the burden on the operator 11131 and to allow the operator 11131 to proceed with the surgery reliably.
  • the transmission cable 11400 that connects the camera head 11102 and the CCU 11201 is an electric signal cable that supports electric signal communication, an optical fiber that supports optical communication, or a composite cable thereof.
  • in the illustrated example, communication is performed by wire using the transmission cable 11400, but the communication between the camera head 11102 and the CCU 11201 may be performed wirelessly.
  • the technology according to the present disclosure can be applied to various products.
  • the technology according to the present disclosure may be realized as a device mounted on any kind of moving body such as an automobile, an electric vehicle, a hybrid electric vehicle, a motorcycle, a bicycle, a personal mobility device, an airplane, a drone, a ship, or a robot.
  • FIG. 40 is a block diagram showing a schematic configuration example of a vehicle control system, which is an example of a mobile control system to which the technology according to the present disclosure can be applied.
  • the vehicle control system 12000 includes a plurality of electronic control units connected via the communication network 12001.
  • the vehicle control system 12000 includes a drive system control unit 12010, a body system control unit 12020, an outside information detection unit 12030, an in-vehicle information detection unit 12040, and an integrated control unit 12050.
  • a microcomputer 12051, an audio image output unit 12052, and an in-vehicle network I / F (Interface) 12053 are shown as a functional configuration of the integrated control unit 12050.
  • the drive system control unit 12010 controls the operation of the device related to the drive system of the vehicle according to various programs.
  • the drive system control unit 12010 functions as a control device for a driving force generating device for generating the driving force of the vehicle, such as an internal combustion engine or a drive motor, a driving force transmission mechanism for transmitting the driving force to the wheels, a steering mechanism for adjusting the steering angle of the vehicle, a braking device for generating the braking force of the vehicle, and the like.
  • the body system control unit 12020 controls the operation of various devices mounted on the vehicle body according to various programs.
  • the body system control unit 12020 functions as a keyless entry system, a smart key system, a power window device, or a control device for various lamps such as headlamps, back lamps, brake lamps, blinkers or fog lamps.
  • radio waves transmitted from a portable device that substitutes for the key, or signals from various switches, can be input to the body system control unit 12020.
  • the body system control unit 12020 receives inputs of these radio waves or signals and controls a vehicle door lock device, a power window device, a lamp, and the like.
  • the vehicle outside information detection unit 12030 detects information outside the vehicle equipped with the vehicle control system 12000.
  • the image pickup unit 12031 is connected to the vehicle exterior information detection unit 12030.
  • the vehicle outside information detection unit 12030 causes the image pickup unit 12031 to capture an image of the outside of the vehicle and receives the captured image.
  • based on the received image, the vehicle exterior information detection unit 12030 may perform detection processing for objects such as a person, a vehicle, an obstacle, a sign, or characters on the road surface, or distance detection processing.
  • the imaging unit 12031 is an optical sensor that receives light and outputs an electric signal according to the amount of the light received.
  • the image pickup unit 12031 can output an electric signal as an image or can output it as distance measurement information. Further, the light received by the imaging unit 12031 may be visible light or invisible light such as infrared light.
  • the in-vehicle information detection unit 12040 detects the in-vehicle information.
  • a driver state detection unit 12041 that detects the driver's state is connected to the in-vehicle information detection unit 12040.
  • the driver state detection unit 12041 includes, for example, a camera that images the driver, and based on the detection information input from the driver state detection unit 12041, the in-vehicle information detection unit 12040 may calculate the degree of fatigue or concentration of the driver, or may determine whether the driver is dozing.
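A dozing determination of this kind is often based on the fraction of recent frames in which the driver's eyes are closed (a PERCLOS-style heuristic). The sketch below assumes per-frame eye-closure flags as input; it is illustrative, not the method disclosed here:

```python
def is_dozing(eye_closed_flags, window=10, threshold=0.7):
    """PERCLOS-style heuristic (an assumption, not this document's method):
    flag the driver as dozing when the fraction of the most recent frames
    with closed eyes meets or exceeds a threshold."""
    recent = eye_closed_flags[-window:]
    return sum(recent) / len(recent) >= threshold

alert = [0, 0, 1, 0, 0, 0, 1, 0, 0, 0]    # occasional blinks
drowsy = [1, 1, 1, 0, 1, 1, 1, 1, 0, 1]   # eyes closed most of the time
print(is_dozing(alert))   # False
print(is_dozing(drowsy))  # True
```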
  • the microcomputer 12051 can calculate the control target value of the driving force generating device, the steering mechanism, or the braking device based on the information inside and outside the vehicle acquired by the vehicle exterior information detection unit 12030 or the in-vehicle information detection unit 12040, and output a control command to the drive system control unit 12010.
  • the microcomputer 12051 can perform cooperative control for the purpose of realizing the functions of an ADAS (Advanced Driver Assistance System), including collision avoidance or impact mitigation of the vehicle, follow-up driving based on the inter-vehicle distance, vehicle-speed-maintaining driving, vehicle collision warning, vehicle lane departure warning, and the like.
  • the microcomputer 12051 can perform cooperative control for the purpose of automatic driving or the like, in which the vehicle travels autonomously without depending on the driver's operation, by controlling the driving force generating device, the steering mechanism, the braking device, and the like based on the information around the vehicle acquired by the vehicle exterior information detection unit 12030 or the in-vehicle information detection unit 12040.
  • the microcomputer 12051 can output a control command to the body system control unit 12020 based on the information outside the vehicle acquired by the vehicle exterior information detection unit 12030.
  • for example, the microcomputer 12051 can control the headlamps according to the position of the preceding vehicle or oncoming vehicle detected by the vehicle exterior information detection unit 12030, and perform cooperative control for the purpose of anti-glare, such as switching from high beam to low beam.
  • the audio image output unit 12052 transmits an output signal of at least one of audio and an image to an output device capable of visually or audibly notifying information to the passenger or the outside of the vehicle.
  • an audio speaker 12061, a display unit 12062, and an instrument panel 12063 are exemplified as output devices.
  • the display unit 12062 may include, for example, at least one of an onboard display and a heads-up display.
  • FIG. 41 is a diagram showing an example of the installation position of the imaging unit 12031.
  • the imaging unit 12031 includes imaging units 12101, 12102, 12103, 12104, and 12105.
  • the imaging units 12101, 12102, 12103, 12104, and 12105 are provided at positions such as the front nose, the side mirrors, the rear bumper, the back door, and the upper part of the windshield in the vehicle interior of the vehicle 12100.
  • the imaging unit 12101 provided on the front nose and the imaging unit 12105 provided on the upper part of the windshield in the vehicle interior mainly acquire an image in front of the vehicle 12100.
  • the imaging units 12102 and 12103 provided in the side mirrors mainly acquire images of the side of the vehicle 12100.
  • the imaging unit 12104 provided on the rear bumper or the back door mainly acquires an image of the rear of the vehicle 12100.
  • the imaging unit 12105 provided on the upper part of the windshield in the vehicle interior is mainly used for detecting a preceding vehicle, a pedestrian, an obstacle, a traffic light, a traffic sign, a lane, or the like.
  • FIG. 41 shows an example of the photographing range of the imaging units 12101 to 12104.
  • the imaging range 12111 indicates the imaging range of the imaging unit 12101 provided on the front nose,
  • the imaging ranges 12112 and 12113 indicate the imaging ranges of the imaging units 12102 and 12103 provided on the side mirrors, respectively, and
  • the imaging range 12114 indicates the imaging range of the imaging unit 12104 provided on the rear bumper or the back door. For example, by superimposing the image data captured by the imaging units 12101 to 12104, a bird's-eye view image of the vehicle 12100 as viewed from above can be obtained.
  • At least one of the imaging units 12101 to 12104 may have a function of acquiring distance information.
  • at least one of the image pickup units 12101 to 12104 may be a stereo camera composed of a plurality of image pickup elements, or may be an image pickup element having pixels for phase difference detection.
  • the microcomputer 12051 obtains the distance to each three-dimensional object within the imaging ranges 12111 to 12114 and the temporal change of this distance (relative speed with respect to the vehicle 12100) based on the distance information obtained from the imaging units 12101 to 12104, and can thereby extract, as a preceding vehicle, the nearest three-dimensional object that travels in substantially the same direction as the vehicle 12100 at a predetermined speed (for example, 0 km/h or more).
  • the microcomputer 12051 can set in advance the inter-vehicle distance to be secured from the preceding vehicle, and can perform automatic brake control (including follow-up stop control), automatic acceleration control (including follow-up start control), and the like. In this way, it is possible to perform cooperative control for the purpose of automatic driving or the like, in which the vehicle travels autonomously without depending on the driver's operation.
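The preceding-vehicle selection described above can be sketched as follows; the `(distance, relative_speed)` tuples and the selection rule are simplifying assumptions for illustration, not the disclosed implementation:

```python
def pick_preceding_vehicle(objects, min_speed=0.0):
    """From (distance_m, relative_speed_mps) pairs for detected objects on
    the own travel path, pick the nearest one moving in roughly the same
    direction at a predetermined speed or more (e.g. 0 km/h or more),
    mirroring the selection described in the text; a schematic sketch."""
    candidates = [o for o in objects if o[1] >= min_speed]
    return min(candidates, key=lambda o: o[0]) if candidates else None

# Hypothetical detections: (distance in m, relative speed in m/s).
objects = [(45.0, 12.0), (22.0, 10.0), (30.0, -5.0)]  # third is oncoming
print(pick_preceding_vehicle(objects))  # (22.0, 10.0)
```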
  • the microcomputer 12051 classifies three-dimensional object data regarding three-dimensional objects into two-wheeled vehicles, ordinary vehicles, large vehicles, pedestrians, utility poles, and other three-dimensional objects based on the distance information obtained from the imaging units 12101 to 12104, extracts the data, and uses it for automatic avoidance of obstacles. For example, the microcomputer 12051 distinguishes obstacles around the vehicle 12100 into obstacles that the driver of the vehicle 12100 can see and obstacles that are difficult to see. The microcomputer 12051 then determines the collision risk, which indicates the degree of danger of collision with each obstacle, and when the collision risk is equal to or higher than a set value and there is a possibility of collision, it can output a warning to the driver via the audio speaker 12061 or the display unit 12062, or perform forced deceleration or avoidance steering via the drive system control unit 12010, thereby providing driving support for collision avoidance.
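One common way to quantify such a collision risk is time-to-collision (TTC); the sketch below uses that heuristic as an illustrative assumption, not as the metric specified in this document:

```python
def collision_risk(distance_m, closing_speed_mps, risk_threshold_s=2.0):
    """Time-to-collision heuristic (an illustrative assumption): risk is
    flagged when the time until the gap closes at the current closing
    speed falls below a threshold."""
    if closing_speed_mps <= 0:  # gap is opening: no collision course
        return False
    return distance_m / closing_speed_mps < risk_threshold_s

print(collision_risk(30.0, 20.0))  # True  (TTC = 1.5 s)
print(collision_risk(30.0, 5.0))   # False (TTC = 6.0 s)
```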
  • At least one of the imaging units 12101 to 12104 may be an infrared camera that detects infrared rays.
  • the microcomputer 12051 can recognize a pedestrian by determining whether or not a pedestrian is present in the captured image of the imaging units 12101 to 12104.
  • such pedestrian recognition is performed by, for example, a procedure for extracting feature points in the images captured by the imaging units 12101 to 12104 as infrared cameras, and a procedure for performing pattern matching processing on a series of feature points indicating the outline of an object to determine whether or not the object is a pedestrian.
  • when the microcomputer 12051 determines that a pedestrian is present in the images captured by the imaging units 12101 to 12104 and recognizes the pedestrian, the audio image output unit 12052 controls the display unit 12062 so that a rectangular contour line for emphasis is superimposed on the recognized pedestrian. The audio image output unit 12052 may also control the display unit 12062 so as to display an icon or the like indicating a pedestrian at a desired position.
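The feature-point pattern matching step can be illustrated with a toy outline-similarity score; the point lists, the template, and the scoring function are all hypothetical stand-ins for a real matcher:

```python
def outline_similarity(candidate, template):
    """Score how well a candidate series of feature points matches a
    pedestrian outline template: mean point-to-point Manhattan distance,
    inverted into (0, 1]. A toy stand-in for the pattern matching step;
    real systems use far richer descriptors."""
    d = sum(abs(cx - tx) + abs(cy - ty)
            for (cx, cy), (tx, ty) in zip(candidate, template)) / len(template)
    return 1.0 / (1.0 + d)

template = [(0, 0), (1, 2), (2, 4), (3, 2), (4, 0)]  # assumed outline shape
good = [(0, 0), (1, 2), (2, 4), (3, 2), (4, 0)]      # matches the template
bad = [(0, 5), (1, 7), (2, 9), (3, 7), (4, 5)]       # offset outline
print(outline_similarity(good, template) > 0.9)  # True
print(outline_similarity(bad, template) > 0.9)   # False
```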
  • the system represents the entire device composed of a plurality of devices.
  • the present technology can also have the following configurations.
  • An image pickup element including: an on-chip lens that condenses incident light; a photoelectric conversion unit that photoelectrically converts the incident light; a waveguide that guides the incident light to the photoelectric conversion unit through an opening having substantially the same size as the condensed size of the incident light; a reflective film that reflects the incident light that has passed through the photoelectric conversion unit; and a concavo-convex region having a plurality of concavities and convexities on the side of the photoelectric conversion unit on which the incident light is incident.
  • the image pickup device according to any one of (1) to (9) above, further comprising a reflection unit that reflects light in the vicinity of the charge storage unit that stores the charge converted by the photoelectric conversion unit.
  • the reflective film is provided between a charge storage unit that stores charges converted by the photoelectric conversion unit and the photoelectric conversion unit.
  • the waveguide is a diffraction grating.
  • An image pickup apparatus including: an image pickup element including an on-chip lens that condenses incident light, a photoelectric conversion unit that photoelectrically converts the incident light, a waveguide that guides the incident light to the photoelectric conversion unit through an opening having substantially the same size as the condensed size of the incident light, a reflective film that reflects the incident light that has passed through the photoelectric conversion unit, and a concavo-convex region having a plurality of concavities and convexities on the side of the photoelectric conversion unit on which the incident light is incident; and a processing unit that processes a signal from the image pickup element.
  • An image pickup element including: an on-chip lens that condenses incident light; a photoelectric conversion unit that photoelectrically converts the incident light; an opening having substantially the same size as the condensed size of the incident light; a reflective film that reflects the incident light that has passed through the photoelectric conversion unit and has an inclined surface; and a concavo-convex region having a plurality of concavities and convexities on the side of the photoelectric conversion unit on which the incident light is incident.
  • An image pickup apparatus including: an image pickup element including an on-chip lens that condenses incident light, a photoelectric conversion unit that photoelectrically converts the incident light, an opening having substantially the same size as the condensed size of the incident light, a reflective film that reflects the incident light that has passed through the photoelectric conversion unit and has an inclined surface, and a concavo-convex region having a plurality of concavities and convexities on the side of the photoelectric conversion unit on which the incident light is incident; and a processing unit that processes a signal from the image pickup element.
  • 1 image pickup element, 10 pixel array unit, 11 vertical drive unit, 12 column signal processing unit, 13 control unit, 14, 15, 16 signal lines, 30 pixel, 31 semiconductor substrate, 32 semiconductor region, 33 wiring region, 34 front-side reflective film, 35 back-side reflective film, 36 protective film, 37 on-chip lens, 38 separation region, 41 insulating layer, 42 wiring layer, 51 substrate back-surface scattering portion, 52 opening, 71, 72 incident light, 91 absorbing film, 101 front-side reflective film, 109 substrate back-surface scattering portion, 111, 112, 113, 114, 115 reflection portion, 131 substrate front-surface scattering portion, 151 substrate back-surface scattering portion, 152 light refraction structure portion, 201 waveguide, 221 color filter layer, 301 waveguide, 311, 312, 313 reflection portion, 321 substrate back-surface scattering portion, 322 substrate front-surface scattering portion, 351, 361 reflection portion

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Power Engineering (AREA)
  • Signal Processing (AREA)
  • Electromagnetism (AREA)
  • Condensed Matter Physics & Semiconductors (AREA)
  • Multimedia (AREA)
  • Computer Hardware Design (AREA)
  • Microelectronics & Electronic Packaging (AREA)
  • Optics & Photonics (AREA)
  • Solid State Image Pick-Up Elements (AREA)
  • Diffracting Gratings Or Hologram Optical Elements (AREA)
  • Transforming Light Signals Into Electric Signals (AREA)

Abstract

The present technology relates to an imaging element and an imaging device, which make it possible to retain light incident on a photoelectric conversion part of the imaging element within the photoelectric conversion part and thereby improve sensitivity. This imaging element is provided with: an on-chip lens that condenses incident light; a photoelectric conversion part that performs photoelectric conversion on the incident light; a waveguide that guides the incident light to the photoelectric conversion part through an opening having approximately the same size as the condensation size of the incident light; a reflective film that reflects the incident light transmitted through the photoelectric conversion part; and a relief region having a plurality of recesses and protrusions on the side of the photoelectric conversion part on which the incident light is incident. The reflective film has an inclined surface. The present technology is applicable to an imaging element.

Description

Imaging element and imaging device
The present technology relates to an imaging element and an imaging device, and relates, for example, to an imaging element in which incident light enters from the back surface of a semiconductor substrate, and to an imaging device using the imaging element.
An image pickup element is used in which incident light is irradiated onto the back surface side of a semiconductor substrate on which a photoelectric conversion unit, such as a photodiode that photoelectrically converts the incident light, is formed. Since the incident light reaches the photoelectric conversion unit without passing through the wiring region formed on the front surface of the semiconductor substrate, the sensitivity can be improved.
As such an image pickup element, for example, one in which a photodiode or the like is formed in the silicon layer of an SOI (Silicon on Insulator) substrate, formed by sequentially laminating an intermediate layer and a silicon layer on a silicon substrate, is used (see, for example, Patent Document 1). In such an image pickup element, a wiring portion (wiring region) is arranged on the surface of the silicon layer on which a light receiving sensor portion such as a photodiode is formed. After a support substrate is bonded to this wiring region, the silicon substrate and the intermediate layer are removed. Thin-film silicon having a thickness of 10 μm or less can be used for the silicon layer. Since a step of thinning the semiconductor substrate by grinding or the like is unnecessary, a silicon layer having a stable thickness can be manufactured with a high yield.
Patent Document 1: JP-A-2007-335905
Since a thin-film silicon layer is used for the semiconductor substrate on which the photodiode or the like is formed, incident light that is not absorbed by the semiconductor substrate may reach the wiring region, be reflected there, and enter the image pickup element again.
The present technology has been made in view of such a situation, and makes it possible to improve sensitivity by also utilizing the light reflected within the image pickup element.
A first image pickup element according to one aspect of the present technology includes: an on-chip lens that condenses incident light; a photoelectric conversion unit that photoelectrically converts the incident light; a waveguide that guides the incident light to the photoelectric conversion unit through an opening having substantially the same size as the condensed size of the incident light; a reflective film that reflects the incident light that has passed through the photoelectric conversion unit; and a concavo-convex region having a plurality of concavities and convexities on the side of the photoelectric conversion unit on which the incident light is incident.

A first image pickup device according to one aspect of the present technology includes: an image pickup element including an on-chip lens that condenses incident light, a photoelectric conversion unit that photoelectrically converts the incident light, a waveguide that guides the incident light to the photoelectric conversion unit through an opening having substantially the same size as the condensed size of the incident light, a reflective film that reflects the incident light that has passed through the photoelectric conversion unit, and a concavo-convex region having a plurality of concavities and convexities on the side of the photoelectric conversion unit on which the incident light is incident; and a processing unit that processes a signal from the image pickup element.

A second image pickup element according to one aspect of the present technology includes: an on-chip lens that condenses incident light; a photoelectric conversion unit that photoelectrically converts the incident light; an opening having substantially the same size as the condensed size of the incident light; a reflective film that reflects the incident light that has passed through the photoelectric conversion unit and has an inclined surface; and a concavo-convex region having a plurality of concavities and convexities on the side of the photoelectric conversion unit on which the incident light is incident.

A second image pickup device according to one aspect of the present technology includes: an image pickup element including an on-chip lens that condenses incident light, a photoelectric conversion unit that photoelectrically converts the incident light, an opening having substantially the same size as the condensed size of the incident light, a reflective film that reflects the incident light that has passed through the photoelectric conversion unit and has an inclined surface, and a concavo-convex region having a plurality of concavities and convexities on the side of the photoelectric conversion unit on which the incident light is incident; and a processing unit that processes a signal from the image pickup element.
 本技術の一側面の第1の撮像素子においては、入射光を集光するオンチップレンズと、入射光の光電変換を行う光電変換部と、入射光の集光サイズと略同じ大きさの開口部に、光電変換部に入射光を導く導波路と、光電変換部を透過した入射光を反射する反射膜と、光電変換部の入射光が入射する側に複数の凹凸を有する凹凸領域とが備えられる。 In the first imaging element according to one aspect of the present technology, there are provided an on-chip lens that condenses incident light, a photoelectric conversion unit that photoelectrically converts the incident light, a waveguide that guides the incident light to the photoelectric conversion unit through an opening having substantially the same size as the condensed size of the incident light, a reflective film that reflects the incident light that has passed through the photoelectric conversion unit, and an uneven region having a plurality of irregularities on the side of the photoelectric conversion unit on which the incident light is incident.
 本技術の一側面の第1の撮像装置においては、前記第1の撮像素子が備えられている。 The first imaging device according to one aspect of the present technology is provided with the first imaging element described above.
 本技術の一側面の第2の撮像素子においては、入射光を集光するオンチップレンズと、入射光の光電変換を行う光電変換部と、入射光の集光サイズと略同じ大きさの開口部と、光電変換部を透過した入射光を反射する反射膜であり、傾斜面を有する反射膜と、光電変換部の入射光が入射する側に複数の凹凸を有する凹凸領域とが備えられている。 In the second imaging element according to one aspect of the present technology, there are provided an on-chip lens that condenses incident light, a photoelectric conversion unit that photoelectrically converts the incident light, an opening having substantially the same size as the condensed size of the incident light, a reflective film that reflects the incident light transmitted through the photoelectric conversion unit and that has an inclined surface, and an uneven region having a plurality of irregularities on the side of the photoelectric conversion unit on which the incident light is incident.
 本技術の一側面の第2の撮像装置においては、前記第2の撮像素子が備えられている。 The second imaging device according to one aspect of the present technology is provided with the second imaging element described above.
 なお、撮像装置は、独立した装置であっても良いし、1つの装置を構成している内部ブロックであっても良い。 The imaging device may be an independent device or an internal block constituting one device.
本技術を適用した撮像素子の一実施の形態の構成を示す図である。A diagram showing the configuration of an embodiment of an image sensor to which the present technology is applied.
第1の実施の形態に係る画素の構成例を示す図である。A diagram showing a configuration example of a pixel according to the first embodiment.
撮像素子への入射光と、反射光の一例を示す図である。A diagram showing an example of light incident on the image sensor and of the reflected light.
第1の実施の形態に係る画素の他の構成例を示す図である。A diagram showing another configuration example of a pixel according to the first embodiment.
第2の実施の形態に係る画素の構成例を示す図である。A diagram showing a configuration example of a pixel according to the second embodiment.
第3の実施の形態に係る画素の構成例を示す図である。A diagram showing a configuration example of a pixel according to the third embodiment.
第4の実施の形態に係る画素の構成例を示す図である。A diagram showing a configuration example of a pixel according to the fourth embodiment.
第5の実施の形態に係る画素の構成例を示す図である。A diagram showing a configuration example of a pixel according to the fifth embodiment.
第6の実施の形態に係る画素の構成例を示す図である。A diagram showing a configuration example of a pixel according to the sixth embodiment.
第7の実施の形態に係る画素の構成例を示す図である。A diagram showing a configuration example of a pixel according to the seventh embodiment.
第8の実施の形態に係る画素の構成例を示す図である。A diagram showing a configuration example of a pixel according to the eighth embodiment.
第9の実施の形態に係る画素の構成例を示す図である。A diagram showing a configuration example of a pixel according to the ninth embodiment.
第10の実施の形態に係る画素の構成例を示す図である。A diagram showing a configuration example of a pixel according to the tenth embodiment.
導波路の大きさについて説明するための図である。A diagram for explaining the size of the waveguide.
導波路の配置位置について説明するための図である。A diagram for explaining the placement position of the waveguide.
第10の実施の形態に係る画素の他の構成例を示す図である。A diagram showing another configuration example of a pixel according to the tenth embodiment.
第11の実施の形態に係る画素の構成例を示す図である。A diagram showing a configuration example of a pixel according to the eleventh embodiment.
第12の実施の形態に係る画素の構成例を示す図である。A diagram showing a configuration example of a pixel according to the twelfth embodiment.
第13の実施の形態に係る画素の構成例を示す図である。A diagram showing a configuration example of a pixel according to the thirteenth embodiment.
第14の実施の形態に係る画素の構成例を示す図である。A diagram showing a configuration example of a pixel according to the fourteenth embodiment.
第15の実施の形態に係る画素の構成例を示す図である。A diagram showing a configuration example of a pixel according to the fifteenth embodiment.
第16の実施の形態に係る画素の構成例を示す図である。A diagram showing a configuration example of a pixel according to the sixteenth embodiment.
第17の実施の形態に係る画素の構成例を示す図である。A diagram showing a configuration example of a pixel according to the seventeenth embodiment.
第18の実施の形態に係る画素の構成例を示す図である。A diagram showing a configuration example of a pixel according to the eighteenth embodiment.
第19の実施の形態に係る画素の構成例を示す図である。A diagram showing a configuration example of a pixel according to the nineteenth embodiment.
第20の実施の形態に係る画素の構成例を示す図である。A diagram showing a configuration example of a pixel according to the twentieth embodiment.
第21の実施の形態に係る画素の構成例を示す図である。A diagram showing a configuration example of a pixel according to the twenty-first embodiment.
適用例について説明するための図である。A diagram for explaining an application example.
適用例について説明するための図である。A diagram for explaining an application example.
適用例について説明するための図である。A diagram for explaining an application example.
適用例について説明するための図である。A diagram for explaining an application example.
適用例について説明するための図である。A diagram for explaining an application example.
適用例について説明するための図である。A diagram for explaining an application example.
適用例について説明するための図である。A diagram for explaining an application example.
適用例について説明するための図である。A diagram for explaining an application example.
適用例について説明するための図である。A diagram for explaining an application example.
適用例について説明するための図である。A diagram for explaining an application example.
内視鏡手術システムの概略的な構成の一例を示す図である。A diagram showing an example of the schematic configuration of an endoscopic surgery system.
カメラヘッド及びCCUの機能構成の一例を示すブロック図である。A block diagram showing an example of the functional configuration of a camera head and a CCU.
車両制御システムの概略的な構成の一例を示すブロック図である。A block diagram showing an example of the schematic configuration of a vehicle control system.
車外情報検出部及び撮像部の設置位置の一例を示す説明図である。An explanatory diagram showing an example of the installation positions of a vehicle exterior information detection unit and an imaging unit.
 以下に、本技術を実施するための形態(以下、実施の形態という)について説明する。 The embodiment for implementing the present technology (hereinafter referred to as the embodiment) will be described below.
 <撮像素子の構成>
 図1は、本技術の実施の形態に係る撮像素子の構成例を示す図である。同図の撮像素子1は、画素アレイ部10と、垂直駆動部11と、カラム信号処理部12と、制御部13とを備える。
<Structure of image sensor>
FIG. 1 is a diagram showing a configuration example of an image pickup device according to an embodiment of the present technology. The image pickup device 1 in the figure includes a pixel array unit 10, a vertical drive unit 11, a column signal processing unit 12, and a control unit 13.
 画素アレイ部10は、画素30が2次元格子状に配置されて構成されたものである。ここで、画素30は、照射された光に応じた画像信号を生成するものである。この画素30は、照射された光に応じた電荷を生成する光電変換部を有する。また画素30は、画素回路をさらに有する。この画素回路は、光電変換部により生成された電荷に基づく画像信号を生成する。画像信号の生成は、垂直駆動部11により生成された制御信号により制御される。 The pixel array unit 10 is configured by arranging pixels 30 in a two-dimensional lattice. Here, each pixel 30 generates an image signal corresponding to the light with which it is irradiated. The pixel 30 has a photoelectric conversion unit that generates charge corresponding to the irradiated light. The pixel 30 also has a pixel circuit, which generates an image signal based on the charge generated by the photoelectric conversion unit. Generation of the image signal is controlled by a control signal generated by the vertical drive unit 11.
 画素アレイ部10には、信号線14と信号線15がXYマトリクス状に配置される。信号線14は、画素30における画素回路の制御信号を伝達する信号線であり、画素アレイ部10の行毎に配置され、各行に配置される画素30に対して共通に配線される。信号線15は、画素30の画素回路により生成された画像信号を伝達する信号線であり、画素アレイ部10の列毎に配置され、各列に配置される画素30に対して共通に配線される。これら光電変換部および画素回路は、半導体基板に形成される。 A signal line 14 and a signal line 15 are arranged in an XY matrix in the pixel array unit 10. The signal line 14 transmits control signals for the pixel circuits in the pixels 30; it is arranged for each row of the pixel array unit 10 and is wired in common to the pixels 30 arranged in that row. The signal line 15 transmits the image signals generated by the pixel circuits of the pixels 30; it is arranged for each column of the pixel array unit 10 and is wired in common to the pixels 30 arranged in that column. These photoelectric conversion units and pixel circuits are formed on a semiconductor substrate.
 垂直駆動部11は、画素30の画素回路の制御信号を生成するものである。この垂直駆動部11は、生成した制御信号を同図の信号線14を介して画素30に伝達する。カラム信号処理部12は、画素30により生成された画像信号を処理するものである。このカラム信号処理部12は、同図の信号線15を介して画素30から伝達された画像信号の処理を行う。カラム信号処理部12における処理には、例えば、画素30において生成されたアナログの画像信号をデジタルの画像信号に変換するアナログデジタル変換が該当する。 The vertical drive unit 11 generates a control signal for the pixel circuit of the pixel 30. The vertical drive unit 11 transmits the generated control signal to the pixel 30 via the signal line 14 in the figure. The column signal processing unit 12 processes the image signal generated by the pixel 30. The column signal processing unit 12 processes the image signal transmitted from the pixel 30 via the signal line 15 in the figure. The processing in the column signal processing unit 12 corresponds to, for example, analog-to-digital conversion that converts an analog image signal generated in the pixel 30 into a digital image signal.
 カラム信号処理部12により処理された画像信号は、撮像素子1の画像信号として出力される。制御部13は、撮像素子1の全体を制御するものである。この制御部13は、垂直駆動部11およびカラム信号処理部12を制御する制御信号を生成して出力することにより、撮像素子1の制御を行う。制御部13により生成された制御信号は、信号線16および17により垂直駆動部11およびカラム信号処理部12に対してそれぞれ伝達される。 The image signal processed by the column signal processing unit 12 is output as an image signal of the image sensor 1. The control unit 13 controls the entire image sensor 1. The control unit 13 controls the image sensor 1 by generating and outputting a control signal for controlling the vertical drive unit 11 and the column signal processing unit 12. The control signal generated by the control unit 13 is transmitted to the vertical drive unit 11 and the column signal processing unit 12 by the signal lines 16 and 17, respectively.
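The readout flow described above (the vertical drive unit 11 selecting rows over the signal line 14, the column signal processing unit 12 digitizing the analog image signals carried on the signal line 15) can be sketched as a toy software model. This is only an illustrative sketch: the class names, full-scale value, and bit depth below are assumptions for the example, not values taken from this patent.

```python
class Pixel:
    """Toy model of a pixel 30: charge accumulates in the photoelectric
    conversion unit in proportion to the light it receives."""
    def __init__(self):
        self.charge = 0.0

    def expose(self, light):
        self.charge += light


class ImageSensor:
    """Toy model of the image sensor 1: pixels 30 in a 2-D lattice, read
    out row by row (vertical drive unit 11) and digitized per column
    (column signal processing unit 12)."""
    def __init__(self, rows, cols):
        self.array = [[Pixel() for _ in range(cols)] for _ in range(rows)]

    def expose(self, scene):
        for r, row in enumerate(self.array):
            for c, pixel in enumerate(row):
                pixel.expose(scene[r][c])

    def read_out(self, full_scale=255.0, bits=8):
        frame = []
        for row in self.array:                  # row selected via signal line 14
            digital_row = []
            for pixel in row:                   # analog value carried on signal line 15
                code = round(pixel.charge / full_scale * (2 ** bits - 1))
                digital_row.append(max(0, min(code, 2 ** bits - 1)))  # clip to ADC range
                pixel.charge = 0.0              # charge reset after readout
            frame.append(digital_row)
        return frame


sensor = ImageSensor(2, 3)
sensor.expose([[0.0, 127.5, 255.0], [255.0, 63.75, 0.0]])
print(sensor.read_out())  # → [[0, 128, 255], [255, 64, 0]]
```

A second call to `read_out()` returns an all-zero frame, since each pixel's charge is reset when it is read, mirroring the destructive readout of accumulated charge.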
 <第1の実施の形態に係る画素の構成>
 図2は、本技術の第1の実施の形態に係る画素の構成例を示す断面図である。
<Pixel configuration according to the first embodiment>
FIG. 2 is a cross-sectional view showing a configuration example of a pixel according to the first embodiment of the present technology.
 画素30は、半導体基板31、半導体領域32、配線領域33、表面側反射膜34、裏面側反射膜35、保護膜36、およびオンチップレンズ37を備える。 The pixel 30 includes a semiconductor substrate 31, a semiconductor region 32, a wiring region 33, a front surface side reflective film 34, a back surface side reflective film 35, a protective film 36, and an on-chip lens 37.
 半導体基板31は、前述の光電変換部や画素回路を構成する素子の半導体領域(拡散領域)が形成される半導体の基板である。この半導体基板31は、シリコン(Si)により構成することができる。光電変換部等の半導体素子は、半導体基板31に形成されたウェル領域に配置される。便宜上、同図の半導体基板31は、p型のウェル領域を構成するものと想定する。 The semiconductor substrate 31 is a semiconductor substrate on which the semiconductor region (diffusion region) of the elements constituting the photoelectric conversion unit and the pixel circuit described above is formed. The semiconductor substrate 31 can be made of silicon (Si). A semiconductor element such as a photoelectric conversion unit is arranged in a well region formed on the semiconductor substrate 31. For convenience, it is assumed that the semiconductor substrate 31 in the figure constitutes a p-type well region.
 このp型のウェル領域にn型の半導体領域を形成することにより、素子の拡散領域を形成することができる。同図の半導体基板31には、素子の例として光電変換部を構成するn型の半導体領域32を記載した。このn型の半導体領域32と周囲のp型のウェル領域との界面のpn接合によるフォトダイオードが光電変換部に該当する。このn型の半導体領域32に入射光が照射されると光電変換を生じる。この光電変換により生成された電荷がn型の半導体領域32に蓄積される。この蓄積された電荷に基づいて不図示の画素回路により画像信号が生成される。 By forming an n-type semiconductor region in this p-type well region, a diffusion region of the device can be formed. On the semiconductor substrate 31 in the figure, an n-type semiconductor region 32 constituting a photoelectric conversion unit is described as an example of an element. The photodiode formed by the pn junction at the interface between the n-type semiconductor region 32 and the surrounding p-type well region corresponds to the photoelectric conversion unit. When the n-type semiconductor region 32 is irradiated with incident light, photoelectric conversion occurs. The electric charge generated by this photoelectric conversion is accumulated in the n-type semiconductor region 32. An image signal is generated by a pixel circuit (not shown) based on the accumulated charge.
 なお、同図の半導体基板31における画素30の境界には、分離領域38を配置することができる。この分離領域38は、画素30を光学的に分離するものである。具体的には、分離領域38として入射光を反射する膜を画素30の間に配置することにより、隣接する画素30への入射光の漏洩を防止する。これにより、画素30の間のクロストークを防ぐことができる。分離領域38は、例えば、タングステン(W)等の金属により構成することができる。 Note that the separation region 38 can be arranged at the boundary of the pixels 30 in the semiconductor substrate 31 in the figure. The separation region 38 optically separates the pixels 30. Specifically, by arranging a film that reflects the incident light between the pixels 30 as the separation region 38, leakage of the incident light to the adjacent pixels 30 is prevented. This makes it possible to prevent crosstalk between the pixels 30. The separation region 38 can be made of, for example, a metal such as tungsten (W).
 なお、分離領域38と半導体基板31との間には、固定電荷膜および絶縁膜を配置することができる。固定電荷膜は、半導体基板31の界面に配置されて半導体基板31の表面準位をピニングする膜である。また、絶縁膜は、固定電荷膜および分離領域38の間に配置されて分離領域38を絶縁する膜である。このような分離領域38は、半導体基板31に形成された溝の表面に固定電荷膜および絶縁膜を形成し、タングステン(W)等の金属を埋め込むことにより形成することができる。このような絶縁膜を備える分離領域38を配置することにより、画素30を電気的に分離することもできる。 A fixed charge film and an insulating film can be arranged between the separation region 38 and the semiconductor substrate 31. The fixed charge film is a film that is arranged at the interface of the semiconductor substrate 31 and pins the surface level of the semiconductor substrate 31. The insulating film is a film that is arranged between the fixed charge film and the separation region 38 to insulate the separation region 38. Such a separation region 38 can be formed by forming a fixed charge film and an insulating film on the surface of the groove formed in the semiconductor substrate 31 and embedding a metal such as tungsten (W). By arranging the separation region 38 provided with such an insulating film, the pixels 30 can be electrically separated.
 配線領域33は、半導体基板31の表面に隣接して配置され、信号を伝達する配線が形成される領域である。同図の配線領域33は、配線層42および絶縁層41を備える。配線層42は、半導体基板31の素子に信号を伝達する導体である。この配線層42は、銅(Cu)やタングステン(W)等の金属により構成することができる。 The wiring area 33 is an area that is arranged adjacent to the surface of the semiconductor substrate 31 and on which wiring for transmitting signals is formed. The wiring region 33 in the figure includes a wiring layer 42 and an insulating layer 41. The wiring layer 42 is a conductor that transmits a signal to the elements of the semiconductor substrate 31. The wiring layer 42 can be made of a metal such as copper (Cu) or tungsten (W).
 絶縁層41は、配線層42を絶縁するものである。この絶縁層41は、例えば、酸化シリコン(SiO2)により構成することができる。なお、配線層42および絶縁層41は、多層に構成することができる。同図は、2層に構成された配線の例を表したものである。異なる層に配置された配線層42同士は不図示のビアプラグにより接続することができる。 The insulating layer 41 insulates the wiring layer 42. The insulating layer 41 can be made of, for example, silicon oxide (SiO2). The wiring layer 42 and the insulating layer 41 can be configured in multiple layers. The figure shows an example of wiring configured in two layers. Wiring layers 42 arranged in different layers can be connected to each other by a via plug (not shown).
 なお、同図の撮像素子1は、半導体基板31の裏面側から光電変換部に入射光が照射される裏面照射型の撮像素子に該当する。後述するオンチップレンズ37および裏面側反射膜35を介して半導体基板31に入射する被写体からの入射光は、半導体基板31に吸収されて光電変換される。しかしながら、半導体基板31に吸収されなかった入射光は、半導体基板31を透過して透過光となり、配線領域33に入射する。 The image sensor 1 in the figure corresponds to a back-illuminated image sensor in which incident light is irradiated from the back surface side of the semiconductor substrate 31 to the photoelectric conversion unit. The incident light from the subject incident on the semiconductor substrate 31 via the on-chip lens 37 and the back surface side reflective film 35, which will be described later, is absorbed by the semiconductor substrate 31 and photoelectrically converted. However, the incident light that is not absorbed by the semiconductor substrate 31 passes through the semiconductor substrate 31 and becomes transmitted light, and is incident on the wiring region 33.
 表面側反射膜34は、半導体基板31の表面側に配置されて透過光を反射するものである。同図の表面側反射膜34は、配線領域33に配置され、絶縁層41を介して半導体基板31に隣接して配置される。この表面側反射膜34は、画素30の半導体基板31の表面側を覆う形状に構成される。 The surface-side reflective film 34 is arranged on the surface side of the semiconductor substrate 31 and reflects transmitted light. The surface-side reflective film 34 in the figure is arranged in the wiring region 33, and is arranged adjacent to the semiconductor substrate 31 via the insulating layer 41. The surface-side reflective film 34 is configured to cover the surface side of the semiconductor substrate 31 of the pixel 30.
 表面側反射膜34を配置することにより、半導体基板31を透過した透過光を半導体基板31側に反射することができる。これにより、光電変換に寄与する入射光を増加させることができる。よって画素30の変換効率の向上が可能となる。この表面側反射膜34は、WやCu等の金属により構成することができる。また、配線層42により表面側反射膜34を構成することもできる。この場合には、表面側反射膜34を配線層42と同時に形成することができる。 By arranging the surface-side reflective film 34, the transmitted light transmitted through the semiconductor substrate 31 can be reflected to the semiconductor substrate 31 side. This makes it possible to increase the incident light that contributes to photoelectric conversion. Therefore, the conversion efficiency of the pixel 30 can be improved. The surface-side reflective film 34 can be made of a metal such as W or Cu. Further, the surface side reflective film 34 can be formed by the wiring layer 42. In this case, the surface-side reflective film 34 can be formed at the same time as the wiring layer 42.
 裏面側反射膜35は、半導体基板31の裏面側に配置されて被写体からの入射光を透過させるとともに反射光をさらに反射するものである。同図の裏面側反射膜35は、保護膜36を介して半導体基板31に隣接して配置される。この裏面側反射膜35は、中央部に開口部52を備え、この開口部52を介して後述するオンチップレンズ37により集光された入射光を透過させる。また、裏面側反射膜35は、上述の反射光を再度反射して半導体基板31に入射させ、画素30の外部への反射光の漏洩を軽減する。この裏面側反射膜35は、表面側反射膜34や分離領域38と同様にWやCu等の金属により構成することができる。 The back surface side reflective film 35 is arranged on the back surface side of the semiconductor substrate 31 to transmit the incident light from the subject and further reflect the reflected light. The back surface side reflective film 35 in the figure is arranged adjacent to the semiconductor substrate 31 via the protective film 36. The back surface side reflective film 35 is provided with an opening 52 in the central portion, and the incident light collected by the on-chip lens 37 described later is transmitted through the opening 52. Further, the back surface side reflective film 35 reflects the above-mentioned reflected light again and causes it to enter the semiconductor substrate 31 to reduce leakage of the reflected light to the outside of the pixel 30. The back surface side reflective film 35 can be made of a metal such as W or Cu, like the front surface side reflective film 34 and the separation region 38.
 また、裏面側反射膜35は、分離領域38と同時に形成することができる。具体的には、半導体基板31に形成された溝に分離領域38の材料となる金属を埋め込む際に、半導体基板31の裏面にも材料膜を形成する。この形成された材料膜に開口部52を形成することにより、裏面側反射膜35を製造することができる。開口部52は、オンチップレンズ37による入射光の集光サイズと略同じ大きさに構成することができる。 Further, the back surface side reflective film 35 can be formed at the same time as the separation region 38. Specifically, when the metal used as the material of the separation region 38 is embedded in the groove formed in the semiconductor substrate 31, a material film is also formed on the back surface of the semiconductor substrate 31. By forming the opening 52 in the formed material film, the back surface side reflective film 35 can be manufactured. The opening 52 can be configured to have substantially the same size as the condensed size of the incident light by the on-chip lens 37.
 保護膜36は、半導体基板31の裏面側を絶縁するとともに保護する膜である。同図の保護膜36は、裏面側反射膜35を覆う形状に構成され、裏面側反射膜35が配置された半導体基板31の裏面側の平坦化をさらに行う。この保護膜36は、例えば、SiO2により構成することができる。なお、保護膜36のうち、半導体基板31の表面に隣接する部分には、前述の固定電荷膜を配置することができる。固定電荷膜には、例えば、ハフニウム、アルミニウムおよびタンタル等の金属の酸化物を使用することができる。 The protective film 36 is a film that insulates and protects the back surface side of the semiconductor substrate 31. The protective film 36 in the figure is configured to cover the back surface side reflective film 35, and further flattens the back surface side of the semiconductor substrate 31 on which the back surface side reflective film 35 is arranged. The protective film 36 can be made of, for example, SiO2. The above-mentioned fixed charge film can be arranged on the portion of the protective film 36 adjacent to the surface of the semiconductor substrate 31. For the fixed charge film, for example, oxides of metals such as hafnium, aluminum and tantalum can be used.
 オンチップレンズ37は、画素30毎に配置されて半導体基板31の光電変換部に被写体からの入射光を集光するレンズである。このオンチップレンズ37は、凸レンズ形状に構成され、入射光を集光する。同図のオンチップレンズ37は、上述の裏面側反射膜35の開口部52を介して入射光を光電変換部に集光する。オンチップレンズ37は、例えば、樹脂等の有機材料や窒化シリコン(SiN)等の無機材料により構成することができる。 The on-chip lens 37 is a lens that is arranged for each pixel 30 and collects incident light from a subject on a photoelectric conversion unit of a semiconductor substrate 31. The on-chip lens 37 is configured in a convex lens shape and collects incident light. The on-chip lens 37 in the figure collects incident light on the photoelectric conversion unit through the opening 52 of the back surface side reflective film 35 described above. The on-chip lens 37 can be made of, for example, an organic material such as resin or an inorganic material such as silicon nitride (SiN).
 同図に表したように、入射光はオンチップレンズ37により集光され、半導体基板31の領域に焦点が形成される。オンチップレンズ37に入射した光は、オンチップレンズ37から半導体基板31に至る間に徐々に絞られ、水平方向の入射光の照射範囲である集光サイズが狭くなる。裏面側反射膜35の開口部52を入射光の集光サイズに略等しい大きさに構成することにより、オンチップレンズ37により集光された入射光の裏面側反射膜35による遮蔽(ケラレ)を防ぐことができる。開口部52が縮小されるため、開口部52からの反射光の漏洩を低減することができる。 As shown in the figure, the incident light is condensed by the on-chip lens 37, and a focal point is formed in the region of the semiconductor substrate 31. Light incident on the on-chip lens 37 is gradually narrowed on its way from the on-chip lens 37 to the semiconductor substrate 31, so the condensed size, which is the horizontal irradiation range of the incident light, becomes smaller. By configuring the opening 52 of the back surface side reflective film 35 to have a size substantially equal to the condensed size of the incident light, shielding (vignetting) of the condensed incident light by the back surface side reflective film 35 can be prevented. Moreover, since the opening 52 is reduced, leakage of reflected light through the opening 52 can be reduced.
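As a rough sanity check on what "substantially the same size as the condensed size" implies, the smallest spot a lens can form is bounded by diffraction; the first-minimum diameter of the Airy pattern is d = 1.22·λ/NA. The wavelength and numerical aperture below are purely illustrative assumptions, not values from this patent.

```python
def airy_spot_diameter_um(wavelength_um, numerical_aperture):
    """Diffraction-limited spot diameter (first Airy minimum),
    d = 1.22 * wavelength / NA. This bounds from below the condensed
    size that an opening such as the opening 52 must accommodate."""
    return 1.22 * wavelength_um / numerical_aperture

# Illustrative only: green light (0.55 um) and an on-chip lens of NA = 0.5.
print(round(airy_spot_diameter_um(0.55, 0.5), 2))  # → 1.34
```

Because the spot diameter scales linearly with wavelength, longer-wavelength (e.g. near-infrared) light focuses to a proportionally larger spot, so the minimum useful opening size also grows with wavelength.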
 さらに、図2に示した撮像素子1を構成する画素30は、基板裏面散乱部51を備える。画素30は、反射光の一部が裏面側反射膜35の開口部52から漏洩する可能性があるが、基板裏面散乱部51を設けることで、開口部52から漏洩する反射光を散乱させ、半導体領域32に戻すことができる構成となる。 Further, the pixel 30 constituting the image sensor 1 shown in FIG. 2 includes a substrate back surface scattering portion 51. Although part of the reflected light in the pixel 30 may leak from the opening 52 of the back surface side reflective film 35, providing the substrate back surface scattering portion 51 yields a configuration in which the reflected light leaking from the opening 52 can be scattered and returned to the semiconductor region 32.
 基板裏面散乱部51は、半導体基板31の裏面に形成されて入射光や反射光を散乱するものである。この基板裏面散乱部51は、半導体基板31の裏面に形成された凹凸により構成することができる。基板裏面散乱部51は、複数の凹部および凸部を有する領域とされる。基板裏面散乱部51は、凹凸を有する領域であるため、入射された光を散乱させる構造とすることができる。同図の基板裏面散乱部51は、裏面側反射膜35の開口部52の近傍に配置される。また、同図に示すように、基板裏面散乱部51は、分離領域38間の裏面側に設けられている。 The substrate back surface scattering portion 51 is formed on the back surface of the semiconductor substrate 31 and scatters incident light and reflected light. The substrate back surface scattering portion 51 can be formed by irregularities formed on the back surface of the semiconductor substrate 31. The substrate back surface scattering portion 51 is a region having a plurality of concave portions and convex portions. Since the substrate back surface scattering portion 51 is a region having irregularities, it can be configured to scatter incident light. The substrate back surface scattering portion 51 in the figure is arranged in the vicinity of the opening 52 of the back surface side reflective film 35. Further, as shown in the figure, the substrate back surface scattering portion 51 is provided on the back surface side between the separation regions 38.
 開口部52を通って画素30の外部に漏洩する反射光は、基板裏面散乱部51により散乱されるため、広い範囲に分散して照射される。このため、フレア等を目立たなくすることができる。基板裏面散乱部51は、例えば、半導体基板31の裏面を部分的にエッチングすることにより形成することができる。例えば、半導体基板31の裏面に対して異方性のエッチングを行って同図に表したV字の凹部を複数形成することにより、基板裏面散乱部51を形成することができる。 The reflected light leaking to the outside of the pixel 30 through the opening 52 is scattered by the back surface scattering portion 51 of the substrate, so that it is dispersed and irradiated over a wide range. Therefore, flare and the like can be made inconspicuous. The substrate back surface scattering portion 51 can be formed, for example, by partially etching the back surface of the semiconductor substrate 31. For example, the back surface scattering portion 51 of the substrate can be formed by performing anisotropic etching on the back surface of the semiconductor substrate 31 to form a plurality of V-shaped recesses shown in the figure.
 以上説明したように、第1の実施の形態の撮像素子1の画素30は、半導体基板31の裏面側に基板裏面散乱部51を配置することにより、画素30の外部に漏洩する反射光を散乱させる。これにより、画質を向上させることができる。 As described above, in the pixel 30 of the image sensor 1 of the first embodiment, the substrate back surface scattering portion 51 arranged on the back surface side of the semiconductor substrate 31 scatters reflected light that would otherwise leak to the outside of the pixel 30. The image quality can thereby be improved.
 <入射光の反射>
 図3を参照して、開口部52を縮小することで、反射光の漏洩を低減することができ、基板裏面散乱部51を設けることで、半導体領域32に反射光を戻すことができる構成となることについて説明を加える。
<Reflection of incident light>
With reference to FIG. 3, an explanation is added of how reducing the size of the opening 52 reduces leakage of reflected light, and of how providing the substrate back surface scattering portion 51 yields a configuration in which reflected light can be returned to the semiconductor region 32.
 図3中の実線の矢印は入射光を表し、点線の矢印は反射光を表す。また、同図の入射光71は、半導体基板31を透過した後に表面側反射膜34および裏面側反射膜35により繰り返し反射される例を表したものである。反射光は、反射を繰り返すうちに光電変換を生じて徐々に減衰し、画素30の外部に漏洩することなく吸収される。 The solid arrow in FIG. 3 represents the incident light, and the dotted arrow represents the reflected light. Further, the incident light 71 in the figure shows an example in which the incident light 71 is repeatedly reflected by the front surface side reflective film 34 and the back surface side reflective film 35 after passing through the semiconductor substrate 31. The reflected light undergoes photoelectric conversion as the reflection is repeated, is gradually attenuated, and is absorbed without leaking to the outside of the pixel 30.
 入射光を半導体基板31の内部に閉じ込めることが可能となり、感度を向上させることができる。また、裏面側反射膜35の開口部52を狭くすることにより、表面側反射膜34からの反射光の開口部52の通過を低減することができる。画素30の外部への反射光の漏洩を低減することができる。反射光が画素30の外部に漏洩した後、近傍の画素30に再度入射するとフレア等のノイズの原因となる。開口部52を狭くすることにより、ノイズを低減して画質の低下を防止することができる。 The incident light can thus be confined inside the semiconductor substrate 31, and the sensitivity can be improved. Further, by narrowing the opening 52 of the back surface side reflective film 35, the passage of reflected light from the front surface side reflective film 34 through the opening 52 can be reduced, so leakage of reflected light to the outside of the pixel 30 can be reduced. If reflected light leaks to the outside of the pixel 30 and then re-enters a nearby pixel 30, it causes noise such as flare. By narrowing the opening 52, such noise can be reduced and deterioration of image quality can be prevented.
 また、基板裏面散乱部51を備えることで、基板裏面散乱部51においても反射光を散乱し、半導体基板31の内部で戻すことができる。よって、基板裏面散乱部51によっても半導体基板31内に反射光を閉じ込めることが可能となる。よって、感度を向上させることができる。 Further, by providing the substrate back surface scattering portion 51, the reflected light can be scattered even in the substrate back surface scattering portion 51 and returned inside the semiconductor substrate 31. Therefore, the reflected light can be confined in the semiconductor substrate 31 by the substrate back surface scattering portion 51 as well. Therefore, the sensitivity can be improved.
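The benefit of confining light between the two reflective films can be illustrated with the Beer-Lambert law, I = I₀·exp(−α·d): each reflection adds another traversal of the substrate, lengthening the effective absorption path. The absorption coefficient and substrate thickness below are illustrative assumptions (weakly absorbed light in a thin substrate), not values from this patent, and ideal lossless mirrors are assumed.

```python
import math

def absorbed_fraction(alpha_per_um, thickness_um, passes=1):
    """Fraction of light absorbed after `passes` traversals of the
    substrate, assuming lossless reflection at both faces and
    Beer-Lambert attenuation I = I0 * exp(-alpha * d)."""
    return 1.0 - math.exp(-alpha_per_um * thickness_um * passes)

# Illustrative numbers: alpha = 0.1 /um, 3 um substrate.
one_pass = absorbed_fraction(0.1, 3.0, passes=1)     # light absorbed in one traversal
four_passes = absorbed_fraction(0.1, 3.0, passes=4)  # light confined by the reflective films
print(f"{one_pass:.2f} -> {four_passes:.2f}")  # → 0.26 -> 0.70
```

The jump from roughly a quarter to roughly 70 % absorbed is why, in the text above, the repeated reflection between the front surface side reflective film 34, the back surface side reflective film 35, and the substrate back surface scattering portion 51 translates directly into higher sensitivity.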
 なお、図2,3では、基板裏面散乱部51を開口部52の領域だけでなく、裏面側反射膜35の下側(半導体基板31側)にも備える構成を示している。裏面側反射膜35により、反射光を半導体基板31に戻すことができるため、裏面側反射膜35の下側に基板裏面散乱部51を設けない構成としても良い。換言すれば、基板裏面散乱部51は、開口部52のところだけに設けられているような構成とすることも可能である。 Note that FIGS. 2 and 3 show a configuration in which the back surface scattering portion 51 of the substrate is provided not only in the region of the opening 52 but also on the lower side (semiconductor substrate 31 side) of the back surface side reflective film 35. Since the reflected light can be returned to the semiconductor substrate 31 by the back surface side reflective film 35, the substrate back surface scattering portion 51 may not be provided under the back surface side reflective film 35. In other words, the substrate back surface scattering portion 51 may be configured to be provided only at the opening 52.
 また、裏面側反射膜35は、図4に示すように、2層構造としても良い。図4に示す画素30は、裏面側反射膜35上に、吸収膜91を備える。 Further, the back surface side reflective film 35 may have a two-layer structure as shown in FIG. The pixel 30 shown in FIG. 4 includes an absorbing film 91 on the back surface side reflective film 35.
 吸収膜91は、半導体基板31の裏面に配置されて被写体からの入射光を吸収するものである。この吸収膜91は、中央部に開口部52を備え、この開口部52を介してオンチップレンズ37により集光された入射光を透過させる。一方で、開口部52以外の部分、すなわち、吸収膜91のところに入射された光は、吸収膜91により吸収される構成とされている。 The absorption film 91 is arranged on the back surface of the semiconductor substrate 31 to absorb the incident light from the subject. The absorption film 91 is provided with an opening 52 in the central portion, and the incident light collected by the on-chip lens 37 is transmitted through the opening 52. On the other hand, the light incident on the portion other than the opening 52, that is, the absorption film 91 is absorbed by the absorption film 91.
 また吸収膜91は、反射光を吸収して画素30の外部への反射光の漏洩を軽減するためにも設けられている。同図の吸収膜91は、オンチップレンズ37および裏面側反射膜35の間に配置され、裏面側反射膜35が設けられているところに設けられている。 The absorbing film 91 is also provided to absorb the reflected light and reduce the leakage of the reflected light to the outside of the pixel 30. The absorption film 91 in the figure is arranged between the on-chip lens 37 and the back surface side reflection film 35, and is provided where the back surface side reflection film 35 is provided.
 吸収膜91は、例えば、入射光を吸収する吸収部材が分散された膜により構成することができる。例えば、カーボンブラックや酸化チタン等の光を吸収する顔料を吸収部材として使用し、この顔料が樹脂等に分散された膜により吸収膜91を構成することができる。このような吸収膜91は、顔料が分散された樹脂膜を裏面側反射膜35に隣接して成膜し、開口部52を形成することにより製造することができる。 The absorption film 91 can be composed of, for example, a film in which an absorption member that absorbs incident light is dispersed. For example, a pigment that absorbs light such as carbon black or titanium oxide can be used as an absorbing member, and the absorbing film 91 can be formed by a film in which this pigment is dispersed in a resin or the like. Such an absorption film 91 can be manufactured by forming a resin film in which a pigment is dispersed adjacent to a back surface side reflection film 35 to form an opening 52.
The opening 52 can be formed by dry etching or by wet etching using a chemical solution. An absorption film 91 containing a dye-based absorbing material, such as an infrared absorber, can also be used.
By arranging the absorption film 91, reflected light or incident light that passes obliquely through the opening 52 of the back surface side reflective film 35 can be made to strike, and be absorbed by, the wall surface of the absorption film 91 at the opening 52.
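As an aside for intuition only (not part of the patent disclosure), the attenuation of light inside an absorbing film of this kind follows the Beer-Lambert law. The absorption coefficient and film thickness below are hypothetical values chosen purely for illustration:

```python
import math

def transmitted_fraction(alpha_per_um: float, thickness_um: float) -> float:
    """Beer-Lambert law: fraction of light transmitted through an
    absorbing film of the given thickness (the rest is absorbed)."""
    return math.exp(-alpha_per_um * thickness_um)

# Hypothetical values for illustration: a strongly absorbing
# pigment-loaded film (alpha = 5 per um) that is 1 um thick.
absorbed = 1.0 - transmitted_fraction(5.0, 1.0)
print(f"fraction absorbed: {absorbed:.3f}")
```

With these assumed values over 99% of the light striking the film is absorbed, which is the behavior the absorption film 91 relies on to suppress leakage.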
In the following description, the pixel 30 provided with the absorption film 91 is taken as an example. The pixel 30 shown in FIG. 4 is the pixel 30 according to the first embodiment and, to distinguish it from the pixels 30 according to the other embodiments, is hereinafter denoted as pixel 30a.
<Pixel configuration according to the second embodiment>
FIG. 5 is a cross-sectional view showing a configuration example of the pixel 30b according to the second embodiment of the present technology. In the pixel 30b shown in FIG. 5, the same parts as those of the pixel 30a according to the first embodiment shown in FIG. 4 are denoted by the same reference numerals, and their description will be omitted as appropriate.
The pixel 30b according to the second embodiment shown in FIG. 5 differs from the pixel 30a according to the first embodiment in the shape of the surface side reflective film, which is formed as a surface side reflective film 101.
The surface side reflective film 101 in the pixel 30b is formed in a shape whose central portion protrudes (a mountain-like shape). In other words, the surface side reflective film 101 has a shape in which a quadrangle and a triangle are combined: taking the quadrangle as a base, a triangle is placed on the central portion of that base.
Although FIG. 5 shows a case where the surface side reflective film 101 has a shape combining a quadrangle and a triangle, it may instead be a simple triangle. The apex of the triangle may also be rounded. The surface side reflective film 101 can have any shape that presents an inclined surface toward the semiconductor region 32 side.
The surface side reflective film 101 is configured so that the facing sides of the semiconductor region 32 and the surface side reflective film 101 are not parallel to each other. In other words, at least one side of the surface side reflective film 101 is formed as a slope; in the example shown in FIG. 5, two sides are formed as slopes.
Here, FIG. 3 is referred to again. FIG. 3 is a diagram showing incident light and reflected light in the pixel 30a according to the first embodiment. As described with reference to FIG. 3, the incident light 71 is repeatedly reflected by the surface side reflective film 34 and the back surface side reflective film 35 and is confined inside the semiconductor substrate 31. As shown by the incident light 72 in FIG. 3, the incident light includes light that enters vertically (so-called zero-order component light), and such light may be reflected by the surface side reflective film 34 and escape through the opening 52.
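As a rough illustration of why confining light between the two reflective films raises sensitivity (a sketch under assumed values, not part of the disclosure): silicon absorbs near-infrared light weakly, so each additional pass through the substrate absorbs a further fraction of the remaining light. The absorption coefficient and substrate thickness below are hypothetical:

```python
import math

# Hypothetical values for illustration: an absorption coefficient on the
# order of that of silicon in the near infrared (0.05 per um) and a
# photodiode region 5 um thick.
ALPHA_PER_UM = 0.05
THICKNESS_UM = 5.0

def absorbed_fraction(passes: int) -> float:
    """Fraction of incident light absorbed after the light crosses the
    substrate `passes` times (ideal lossless reflections assumed)."""
    return 1.0 - math.exp(-ALPHA_PER_UM * THICKNESS_UM * passes)

for n in (1, 2, 4, 8):
    print(n, round(absorbed_fraction(n), 3))
```

Under these assumptions a single pass absorbs only about a fifth of the light, while eight passes absorb most of it, which is why trapping the light between the two reflective films improves sensitivity.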
Since the pixel 30a shown in FIG. 3 includes the substrate back surface scattering portion 51, it can scatter both the incident light and the reflected light. This configuration therefore reduces the amount of light that is reflected by the surface side reflective film 34 and escapes through the opening 52. To further reduce such light, the surface side reflective film 34 is shaped as shown by the surface side reflective film 101 in FIG. 5.
As described above, the surface side reflective film 101 included in the pixel 30b shown in FIG. 5 is formed in a triangular shape. In the pixel 30b illustrated on the left side of FIG. 5, an example of the incident light and the reflected light is indicated by arrows. As shown there, the incident light strikes one side of the triangular surface side reflective film 101, is reflected, and travels toward the side surface of the semiconductor region 32. This structure can therefore prevent light from being reflected by the surface side reflective film 101 and escaping through the opening 52.
In this way, by providing at least one side of the surface side reflective film 101 as a slope and configuring the slope so that incident light strikes it and is reflected, the amount of light that can be returned into the semiconductor region 32 is increased, and the sensitivity of the pixel 30b can be improved.
Note that FIG. 5 shows an example in which the surface side reflective film 101 is arranged near the center of the pixel 30b, but this position is only an example. For example, the surface side reflective film 101 may be arranged at a position that avoids the region where the gate of a transfer transistor (not shown) for reading out the charge accumulated in the semiconductor region 32 is formed, and may thus be shifted from the vicinity of the center of the pixel 30b. The same applies to the other embodiments.
<Pixel configuration according to the third embodiment>
FIG. 6 is a cross-sectional view showing a configuration example of the pixel 30c according to the third embodiment of the present technology. In the pixel 30c shown in FIG. 6, the same parts as those of the pixel 30b according to the second embodiment shown in FIG. 5 are denoted by the same reference numerals, and their description will be omitted as appropriate.
The pixel 30c according to the third embodiment shown in FIG. 6 differs from the pixel 30b according to the second embodiment in that a reflecting portion 111 is further provided in the semiconductor region 32; the other points are the same.
In the example shown in FIG. 6, the reflecting portion 111 is provided near the center of the pixel 30c. The reflecting portion 111 extends from the substrate back surface scattering portion 51 toward the surface side reflective film 34 and appears rod-shaped in cross section. Within the semiconductor region 32, the reflecting portion 111 may be formed as a plane having a predetermined thickness, or may have a shape such as a cylinder or a polygonal prism.
By providing the reflecting portion 111 in the semiconductor region 32, as shown in the pixel 30c illustrated on the left side of FIG. 6, the incident light is reflected by the surface side reflective film 101, is reflected by the side surface of the semiconductor region 32, and travels toward the reflecting portion 111, where it is further reflected.
The light reflected by the reflecting portion 111 is further scattered by the substrate back surface scattering portion 51 or reflected by the back surface side reflective film 35, and is returned to the semiconductor region 32. This structure can therefore prevent light from being reflected by the surface side reflective film 101 and escaping through the opening 52.
In this way, by providing at least one side of the surface side reflective film 101 as a slope on which incident light strikes and is reflected, and by further providing the reflecting portion 111, the amount of light that can be returned into the semiconductor region 32 is increased, and the sensitivity of the pixel 30c can be improved.
<Pixel configuration according to the fourth embodiment>
FIG. 7 is a cross-sectional view showing a configuration example of the pixel 30d according to the fourth embodiment of the present technology. In the pixel 30d shown in FIG. 7, the same parts as those of the pixel 30b according to the second embodiment shown in FIG. 5 are denoted by the same reference numerals, and their description will be omitted as appropriate.
The pixel 30d according to the fourth embodiment shown in FIG. 7 differs from the pixel 30b according to the second embodiment in that a reflecting portion 112 and a reflecting portion 113 are further provided in the semiconductor region 32; the other points are the same.
Whereas the pixel 30c according to the third embodiment shown in FIG. 6 includes one reflecting portion 111, the pixel 30d according to the fourth embodiment shown in FIG. 7 includes two reflecting portions 112 and 113. A plurality of reflecting portions can thus be provided in the semiconductor region 32.
The pixel 30d shown in FIG. 7 includes the reflecting portion 112 on the left side of the opening 52 in the drawing and the reflecting portion 113 on the right side of the opening 52 in the drawing. The reflecting portions 112 and 113 are arranged at positions that do not reduce the open area of the opening 52.
The reflecting portions 112 and 113 can each have the same configuration as the reflecting portion 111 (FIG. 6). That is, each extends from the substrate back surface scattering portion 51 toward the surface side reflective film 34, appears rod-shaped in cross section, and may be formed in the semiconductor region 32 as a plane having a predetermined thickness or in a shape such as a cylinder or a polygonal prism.
By providing the reflecting portions 112 and 113 in the semiconductor region 32, as shown in the pixel 30d illustrated on the left side of FIG. 7, the incident light is reflected by the slope of the surface side reflective film 101, is reflected by the side surface of the semiconductor region 32, and travels toward the reflecting portion 112, where it is further reflected.
The light reflected by the reflecting portion 112 is further scattered by the substrate back surface scattering portion 51 or reflected by the back surface side reflective film 35, and is returned to the semiconductor region 32. This structure can therefore prevent light from being reflected by the surface side reflective film 101 and escaping through the opening 52.
In this case, light can be confined in the region between the reflecting portion 112 and the side surface of the semiconductor region 32. Similarly, on the reflecting portion 113 side, light can be confined in the region between the reflecting portion 113 and the side surface of the semiconductor region 32.
In this way, by providing at least one side of the surface side reflective film 101 as a slope on which incident light strikes and is reflected, and by further providing the reflecting portions 112 and 113, the amount of light that can be returned into the semiconductor region 32 is increased, and the sensitivity of the pixel 30d can be improved.
<Pixel configuration according to the fifth embodiment>
FIG. 8 is a cross-sectional view showing a configuration example of the pixel 30e according to the fifth embodiment of the present technology. In the pixel 30e shown in FIG. 8, the same parts as those of the pixel 30a according to the first embodiment shown in FIG. 2 are denoted by the same reference numerals, and their description will be omitted as appropriate.
The pixel 30e according to the fifth embodiment shown in FIG. 8 differs from the pixel 30a according to the first embodiment in that a scattering portion is also provided on the front surface side of the semiconductor region 32; the other points are the same.
The pixel 30e according to the fifth embodiment shown in FIG. 8 includes the substrate back surface scattering portion 51 on the incident surface side and a substrate surface scattering portion 131 on the wiring region 33 side. That is, the pixel 30e is provided with a scattering portion both above and below the semiconductor region 32. Like the substrate back surface scattering portion 51, the substrate surface scattering portion 131 is formed as a region having irregularities. By providing the substrate back surface scattering portion 51 and the substrate surface scattering portion 131, as shown in the pixel 30e illustrated on the left side of FIG. 8, the incident light is scattered by the substrate back surface scattering portion 51 and enters the semiconductor region 32. Of the light that has entered the semiconductor region 32, the light that reaches the substrate surface scattering portion 131 is scattered by it and returned into the semiconductor region 32.
By repeating scattering and reflection between the substrate back surface scattering portion 51 and the substrate surface scattering portion 131 in this way, light can be confined in the semiconductor region 32. This structure can therefore prevent (reduce) the escape of reflected light through the opening 52.
By thus providing the substrate back surface scattering portion 51 and the substrate surface scattering portion 131, the amount of light that can be retained in the semiconductor region 32 is increased, and the sensitivity of the pixel 30e can be improved.
Although FIG. 8 shows an example in which the substrate surface scattering portion 131 is formed from one side surface of the semiconductor region 32 to the other (that is, between the separation regions 38), modifications suited to the pixel configuration are possible; for example, it may be omitted from the region where the gate of a transfer transistor (not shown) for reading out the charge accumulated in the semiconductor region 32 is formed. The same applies to the other embodiments.
<Pixel configuration according to the sixth embodiment>
FIG. 9 is a cross-sectional view showing a configuration example of the pixel 30f according to the sixth embodiment of the present technology.
The pixel 30f according to the sixth embodiment shown in FIG. 9 has a configuration combining the pixel 30b according to the second embodiment shown in FIG. 5 and the pixel 30e according to the fifth embodiment shown in FIG. 8. That is, the pixel 30f according to the sixth embodiment includes the surface side reflective film 101 and the substrate surface scattering portion 131 on the front surface side.
Like the pixel 30b according to the second embodiment, the pixel 30f according to the sixth embodiment includes the surface side reflective film 101 having a slope, so that escape of the zero-order component of the reflected light through the opening 52 can be reduced. Furthermore, like the pixel 30e according to the fifth embodiment, the pixel 30f includes the substrate surface scattering portion 131, so that light reaching the front surface side of the semiconductor region 32 can be scattered back into the semiconductor region 32.
By thus providing the surface side reflective film 101 and the substrate surface scattering portion 131, the escape of reflected light through the opening 52 can be reduced, and the amount of light that can be retained in the semiconductor region 32 is increased. The sensitivity of the pixel 30f can therefore be improved.
<Pixel configuration according to the seventh embodiment>
FIG. 10 is a cross-sectional view showing a configuration example of the pixel 30g according to the seventh embodiment of the present technology.
The pixel 30g according to the seventh embodiment has a configuration combining the pixel 30d according to the fourth embodiment and the pixel 30f according to the sixth embodiment. That is, like the pixel 30d according to the fourth embodiment, the pixel 30g includes the reflecting portions 112 and 113; and like the pixel 30f according to the sixth embodiment, the pixel 30g includes the substrate surface scattering portion 131.
Furthermore, in the pixel 30g according to the seventh embodiment, the surface side reflective film 101 has a shape including a triangle, and a reflecting portion 114 is provided at the apex of that triangle. The reflecting portion 114 is formed in the same shape as the reflecting portion 112, extending from the surface side reflective film 101 toward the semiconductor region 32 (upward from the bottom in the drawing).
By thus providing the reflecting portions 112, 113, and 114, the amount of light that can be returned to (retained in) the semiconductor region 32 is increased. Moreover, by providing the substrate surface scattering portion 131, the amount of light that can be retained in the semiconductor region 32 is increased. The sensitivity of the pixel 30g can therefore be improved.
<Pixel configuration according to the eighth embodiment>
FIG. 11 is a cross-sectional view showing a configuration example of the pixel 30h according to the eighth embodiment of the present technology.
Like the pixel 30g according to the seventh embodiment, the pixel 30h according to the eighth embodiment includes the substrate surface scattering portion 131 and the reflecting portion 114 on the front surface side (the wiring region 33 side). Although FIG. 11 shows a configuration in which the reflecting portion 114 is provided on the quadrangular surface side reflective film 34, the reflecting portion 114 may instead be provided on the surface side reflective film 101 having a shape including a triangle.
The pixel 30h according to the eighth embodiment differs from the pixels 30 according to the other embodiments particularly in the shape of the substrate back surface scattering portion 151 (the substrate back surface scattering portion 51 in the other embodiments). The substrate back surface scattering portion 151 shown in FIG. 11 is not formed at the opening 52. In addition, the semiconductor region 32 at the opening 52 is shaped as if a tapered recess had been carved into it.
The opening 52 is filled with, for example, the same material as the protective film 36. In the pixel 30h shown in FIG. 11, a part of the semiconductor region 32 is filled with the same material as the protective film 36. This part of the semiconductor region 32 is the region located on the opening 52 side, which has, for example, a triangular cross section; it is referred to here as a light refraction structure portion 152. The light refraction structure portion 152 can also be described as a large recess, the inside of which is filled with the same material as the protective film 36, while the outside of the recess is the semiconductor region 32.
At the boundary between the light refraction structure portion 152 and the semiconductor region 32, refraction occurs owing to the difference in materials, so that the optical path of the incident light can be bent. For example, as indicated by the arrows in the pixel 30h shown on the left side of FIG. 11, the incident light is refracted at the boundary between the light refraction structure portion 152 and the semiconductor region 32 and, in the example shown in FIG. 11, travels toward the side surface of the semiconductor region 32.
By providing the light refraction structure portion 152 in this way, the incident light can be refracted. Because the incident light is refracted, less incident light reaches the surface side reflective film 34 directly, and the zero-order component of the reflected light can be reduced. The light that escapes through the opening 52 can therefore be reduced.
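The bending at this material boundary follows Snell's law. The sketch below is illustrative only and not part of the disclosure; it assumes commonly cited refractive indices, roughly 1.45 for a silicon dioxide-like filler and roughly 3.5 for silicon in the near infrared, to estimate how strongly a ray is bent on entering the semiconductor region:

```python
import math

def refraction_angle_deg(n1: float, n2: float, incidence_deg: float) -> float:
    """Snell's law: angle (from the boundary normal) of the refracted ray
    when light passes from a medium of index n1 into a medium of index n2."""
    s = (n1 / n2) * math.sin(math.radians(incidence_deg))
    if abs(s) > 1.0:
        raise ValueError("total internal reflection; no refracted ray")
    return math.degrees(math.asin(s))

# Illustrative indices: SiO2-like filler (n ~ 1.45) into silicon (n ~ 3.5).
# A ray striking the slanted boundary at 40 degrees is bent sharply
# toward the normal of the boundary.
print(round(refraction_angle_deg(1.45, 3.5, 40.0), 1))
```

Because silicon's index is much higher than that of the filler, rays are bent strongly at the slanted boundary, which is how the structure redirects light away from a straight path toward the surface side reflective film 34.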
Furthermore, by providing the reflecting portion 114, the light reflected (scattered) by the surface side reflective film 34 and the substrate back surface scattering portion 51 strikes the reflecting portion 114 and can be prevented from escaping through the opening 52.
By providing the light refraction structure portion 152 in this way, the escape of reflected light through the opening 52 can be reduced. As in the other embodiments, this structure also increases the amount of light that can be retained in the semiconductor region 32. The sensitivity of the pixel 30h can therefore be improved.
<Pixel configuration according to the ninth embodiment>
FIG. 12 is a cross-sectional view showing a configuration example of the pixel 30i according to the ninth embodiment of the present technology.
The pixel 30i according to the ninth embodiment differs from the pixel 30h according to the eighth embodiment shown in FIG. 11 in the shape of the reflecting portion 114, and is otherwise the same. The reflecting portion 115 of the pixel 30i according to the ninth embodiment is formed in a T shape. By providing the reflecting portion 115 with a reflecting structure also in the direction parallel to the surface side reflective film 34 (the horizontal direction in the drawing), the light reflected (scattered) by the surface side reflective film 34 and the substrate back surface scattering portion 51 strikes the reflecting portion 115 and can be prevented from escaping through the opening 52.
The horizontal reflecting portion of the reflecting portion 115 is provided directly below the opening 52 and is formed with roughly the same width as the opening 52. The side of the reflecting portion 115 closer to the opening 52 is formed at a position spaced farther away than the triangular apex of the light refraction structure portion 152.
By thus providing the light refraction structure portion 152 and the reflecting portion 115, the incident light can be refracted so that less of it reaches the surface side reflective film 34 directly; the zero-order component of the reflected light can be reduced, and the light that escapes through the opening 52 can also be reduced. The sensitivity of the pixel 30i can therefore be improved.
<Pixel configuration according to the tenth embodiment>
FIG. 13 is a cross-sectional view showing a configuration example of the pixel 30j according to the tenth embodiment of the present technology.
The pixel 30j according to the tenth embodiment has a configuration in which a waveguide 201 is added to the pixel 30b according to the second embodiment shown in FIG. 5. The waveguide 201 is configured to guide incident light entering through the opening 52 to the semiconductor region 32.
Although the pixel 30j according to the tenth embodiment shown in FIG. 13 is illustrated with the inclined surface side reflective film 101, it may instead be configured with the non-inclined surface side reflective film 34.
The waveguide 201 can be, for example, a core/clad type waveguide. In the case of a core/clad type waveguide 201, the portion corresponding to the core is the waveguide 201 and the portion corresponding to the clad is the on-chip lens 37, and materials are used such that the difference between the refractive index of the material of the waveguide 201 and that of the material of the on-chip lens 37 causes total reflection. Specifically, a material is used whose refractive index, when forming the waveguide 201, is lower than the refractive index of the material forming the on-chip lens 37.
As the material of the waveguide 201, for example, SiN (silicon nitride), Ta2O5 (tantalum oxide), TiO2 (titanium oxide), or the like can be used.
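For intuition (illustrative only, not part of the disclosure): total internal reflection at a core/clad boundary confines rays whose angle from the boundary normal exceeds the critical angle arcsin(n_low / n_high), where n_low and n_high are the lower and higher of the two refractive indices. The indices below, about 2.0 for SiN and about 1.5 for a typical lens resin, are assumed values:

```python
import math

def critical_angle_deg(n_high: float, n_low: float) -> float:
    """Critical angle (from the boundary normal) beyond which light
    traveling in the higher-index medium is totally internally reflected."""
    if n_high <= n_low:
        raise ValueError("total internal reflection requires n_high > n_low")
    return math.degrees(math.asin(n_low / n_high))

# Assumed typical indices: SiN (~2.0) against an on-chip-lens resin (~1.5).
print(round(critical_angle_deg(2.0, 1.5), 1))
```

Rays striking the boundary at shallower grazing angles than this remain trapped, which is the confinement mechanism a core/clad waveguide relies on.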
The waveguide 201 is formed with a trapezoidal cross section. Taking the upper side as the on-chip lens 37 side and the lower side as the opening 52 side, the upper side can be formed with roughly the same length as the diameter of the on-chip lens 37, and the lower side can be formed with roughly the same length as the width of the opening 52.
 As an example, the size of the opening 52 is set so that the following equation (1) is satisfied, and the size of the waveguide 201 is set to match the size of the opening 52.
 θ ≥ (1/2) arctan(D/2T)   ... (1)
 In equation (1), θ is the angle (tilt angle) formed by the two sides of the triangularly formed portion of the surface-side reflective film 101, as shown in FIG. 14. This angle θ is also half the angle (= 2θ) formed between light incident in the vertical direction and the light reflected when that incident light strikes the surface-side reflective film 101.
 By setting the tilt angle of the surface-side reflective film 101 to an angle θ satisfying equation (1), light that strikes the surface-side reflective film 101 is reflected toward regions other than the opening 52, in this case toward the region where the substrate back-surface scattering portion 51 and the back-surface-side reflective film 35 are located.
 In equation (1), D is the length of the portion opened as the opening 52. The length D can be taken as the length of the lower side of the waveguide 201. T is the length from the opening 52 to the surface-side reflective film 101. In the example shown in FIG. 14, T is the length between the position of the opening 52 on the boundary line between the back-surface-side reflective film 35 and the absorption film 91 and the position of the apex of the triangle included in the surface-side reflective film 101. The length T can also be restated as the length of the semiconductor region 32 in the vertical direction (depth direction), that is, the depth of the photodiode.
 By setting a length D that satisfies equation (1), the width of the opening 52 and the length of the lower side of the waveguide 201 are determined. Note that equation (1) is an example, and the scope of application of the present technology is not limited to a length D based on equation (1).
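The condition of equation (1) is easy to evaluate numerically. The sketch below computes the smallest tilt angle θ that satisfies it for a given opening width D and depth T; the dimensions used in the example are illustrative assumptions, not values from this document.

```python
import math

def min_tilt_angle_deg(D: float, T: float) -> float:
    """Smallest tilt angle theta (degrees) satisfying equation (1):
        theta >= (1/2) * arctan(D / (2*T))
    D: width of the opening 52; T: depth from the opening 52 to the
    surface-side reflective film 101 (same length units for both)."""
    return math.degrees(0.5 * math.atan(D / (2.0 * T)))

# Illustrative numbers: a 1.0 um-wide opening over a 3.0 um-deep photodiode.
print(f"theta >= {min_tilt_angle_deg(1.0, 3.0):.2f} deg")
```

As the formula suggests, a narrower opening (smaller D) or a deeper photodiode (larger T) relaxes the required tilt of the surface-side reflective film.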
 When the pixel 30j is viewed from above, the waveguide 201 is arranged as shown in FIG. 15. In FIG. 15, the outer square represents one pixel 30j, and the hatched inner square represents the lower surface of the waveguide 201 (the surface on the opening 52 side). Since each pixel 30j is separated by the separation region 38, the outer square in the figure represents the separation region 38, and the description continues on the assumption that the region surrounded by the separation region 38 is one pixel region.
 A of FIG. 15 shows a case where the square of the pixel 30j and the square of the waveguide 201 face the same direction. As shown in A of FIG. 15, the waveguide 201 has the same shape as the pixel 30j and is formed at a position where their sides are parallel.
 B of FIG. 15 shows a case where the square of the waveguide 201 is rotated 45 degrees with respect to the square of the pixel 30j. As shown in B of FIG. 15, the waveguide 201 has the same shape (square) as the pixel 30j but is arranged so that their sides intersect at 45 degrees.
 For example, when the pixel 30j and the waveguide 201 are arranged in the same direction as shown in A of FIG. 15, incident light entering one side of the waveguide 201 strikes the side of the pixel 30j (the side surface of the semiconductor region 32) arranged parallel to that side, as indicated by the arrow in the figure. The light striking the side of the pixel 30j is reflected by the side surface of the semiconductor region 32 and returned into the semiconductor region 32.
 Similarly, when the waveguide 201 is rotated 45 degrees with respect to the pixel 30j as shown in B of FIG. 15, incident light entering one side of the waveguide 201 strikes the side of the pixel 30j (the side surface of the semiconductor region 32) that intersects that side at 45 degrees, as indicated by the arrow in the figure. The light striking that side of the pixel 30j then strikes another side of the pixel 30j (another side surface of the semiconductor region 32). Light that has thus struck the sides of the pixel 30j twice is returned into the semiconductor region 32.
 The position of the waveguide 201 with respect to the pixel 30j is not limited to the examples shown in FIG. 15. Moreover, the shape of the waveguide 201 viewed from above need not be square; it may be a polygon other than a square, a circle, or the like.
 Here, the case of using the waveguide 201 is described as an example, but a member having a function equivalent to that of a waveguide may be used instead of the waveguide 201. The same applies throughout the following description: a member having an equivalent function may be substituted for the waveguide 201.
 For example, a diffraction grating may be used instead of the waveguide 201. The diffraction grating is a member having a periodic structure, formed, for example, of a material having a lower refractive index than the on-chip lens 37.
 For example, a filter called a plasmon filter may be used as the diffraction grating. A plasmon filter is a filter using plasmon resonance and is used as a filter that transmits light of a predetermined wavelength.
 A plasmon filter is sometimes used as a color filter. Although no color filter is shown in the above-described embodiments, a pixel 30 provided with a color filter is of course also possible. A diffraction-grating-shaped filter that selectively transmits light of a predetermined frequency, such as a plasmon filter, can also be used as the color filter. Further, the pixel 30 may be configured to include, as the waveguide 201, a diffraction grating used as such a color filter.
 Further, as shown in FIG. 16, a color filter may be provided on the waveguide 201. In the pixel 30j' shown in FIG. 16, a color filter layer 221 is provided between the on-chip lens 37 and the waveguide 201. A configuration provided with the color filter layer 221 in this way is also possible.
 Alternatively, the material constituting the color filter layer 221 may be formed into the shape of the waveguide 201 so that the color filter layer 221 has the function of the waveguide 201, or in other words, so that the waveguide 201 has the function of the color filter layer 221. That is, in the pixel 30j shown in FIG. 13, the waveguide 201 may also be given the function of a color filter. The same applies to the embodiments described below.
 The configuration provided with the color filter layer 221 can be applied to the first to ninth embodiments described above, and also to each of the embodiments described below.
 By providing the waveguide 201 in this way, incident light can be guided through the opening 52 to the semiconductor region 32, and more incident light can enter the semiconductor region 32. Further, as in the above-described embodiments, the amount of reflected light escaping from the opening 52 can be reduced, and the amount of light confined in the semiconductor region 32 can be increased. The sensitivity of the pixel 30j can therefore be improved.
 <Pixel configuration according to the eleventh embodiment>
 FIG. 17 is a cross-sectional view showing a configuration example of the pixel 30k according to the eleventh embodiment of the present technology.
 The pixel 30k according to the eleventh embodiment is configured by adding the waveguide 201 to the pixel 30c according to the third embodiment shown in FIG. 6. Like the waveguide 201 of the pixel 30j according to the tenth embodiment shown in FIG. 13, the waveguide 201 is configured to guide incident light entering through the opening 52 to the semiconductor region 32.
 Incident light guided to the opening 52 by the waveguide 201 is scattered by the substrate back-surface scattering portion 51, enters the semiconductor region 32, and is reflected by the reflecting portion 111 so as to remain in the semiconductor region 32. Some light is also reflected by the surface-side reflective film 101 and returned to the semiconductor region 32, so the incident light can be retained in the semiconductor region 32.
 By providing the waveguide 201 in this way, more incident light can be guided through the opening 52 into the semiconductor region 32. Further, as in the above-described embodiments, the amount of reflected light escaping from the opening 52 can be reduced, and the amount of light confined in the semiconductor region 32 can be increased. The sensitivity of the pixel 30k can therefore be improved.
 <Pixel configuration according to the twelfth embodiment>
 FIG. 18 is a cross-sectional view showing a configuration example of the pixel 30m according to the twelfth embodiment of the present technology.
 The pixel 30m according to the twelfth embodiment is configured by adding the waveguide 201 to the pixel 30d according to the fourth embodiment shown in FIG. 7. Like the waveguide 201 of the pixel 30j according to the tenth embodiment shown in FIG. 13, the waveguide 201 is configured to guide incident light entering through the opening 52 to the semiconductor region 32.
 Incident light guided to the opening 52 by the waveguide 201 is scattered by the substrate back-surface scattering portion 51, enters the semiconductor region 32, and is reflected by the reflecting portions 112 and 113 so as to remain in the semiconductor region 32. Some light is also reflected by the surface-side reflective film 101 and returned to the semiconductor region 32, so the incident light can be retained in the semiconductor region 32.
 By providing the waveguide 201 in this way, more incident light can be guided through the opening 52 into the semiconductor region 32. Further, as in the above-described embodiments, the amount of reflected light escaping from the opening 52 can be reduced, and the amount of light confined in the semiconductor region 32 can be increased. The sensitivity of the pixel 30m can therefore be improved.
 <Pixel configuration according to the thirteenth embodiment>
 FIG. 19 is a cross-sectional view showing a configuration example of the pixel 30n according to the thirteenth embodiment of the present technology.
 The pixel 30n according to the thirteenth embodiment is configured by adding the waveguide 201 to the pixel 30e according to the fifth embodiment shown in FIG. 8. Like the waveguide 201 of the pixel 30j according to the tenth embodiment shown in FIG. 13, the waveguide 201 is configured to guide incident light entering through the opening 52 to the semiconductor region 32.
 Incident light guided to the opening 52 by the waveguide 201 is scattered by the substrate back-surface scattering portion 51 and enters the semiconductor region 32; light reaching the front surface side is scattered by the substrate front-surface scattering portion 131 or reflected by the surface-side reflective film 34 and thereby returned to the semiconductor region 32. The incident light can therefore be retained in the semiconductor region 32.
 By providing the waveguide 201 in this way, more incident light can be guided through the opening 52 into the semiconductor region 32. Further, as in the above-described embodiments, the amount of reflected light escaping from the opening 52 can be reduced, and the amount of light confined in the semiconductor region 32 can be increased. The sensitivity of the pixel 30n can therefore be improved.
 <Pixel configuration according to the fourteenth embodiment>
 FIG. 20 is a cross-sectional view showing a configuration example of the pixel 30p according to the fourteenth embodiment of the present technology.
 The pixel 30p according to the fourteenth embodiment is configured by adding the waveguide 201 to the pixel 30f according to the sixth embodiment shown in FIG. 9. Like the waveguide 201 of the pixel 30j according to the tenth embodiment shown in FIG. 13, the waveguide 201 is configured to guide incident light entering through the opening 52 to the semiconductor region 32.
 Incident light guided to the opening 52 by the waveguide 201 is scattered by the substrate back-surface scattering portion 51 and enters the semiconductor region 32; light reaching the front surface side is scattered by the substrate front-surface scattering portion 131 or reflected by the surface-side reflective film 101 and thereby returned to the semiconductor region 32. The incident light can therefore be retained in the semiconductor region 32.
 By providing the waveguide 201 in this way, more incident light can be guided through the opening 52 into the semiconductor region 32. Further, as in the above-described embodiments, the amount of reflected light escaping from the opening 52 can be reduced, and the amount of light confined in the semiconductor region 32 can be increased. The sensitivity of the pixel 30p can therefore be improved.
 <Pixel configuration according to the fifteenth embodiment>
 FIG. 21 is a cross-sectional view showing a configuration example of the pixel 30q according to the fifteenth embodiment of the present technology.
 The pixel 30q according to the fifteenth embodiment is configured by adding the waveguide 201 to the pixel 30g according to the seventh embodiment shown in FIG. 10. Like the waveguide 201 of the pixel 30j according to the tenth embodiment shown in FIG. 13, the waveguide 201 is configured to guide incident light entering through the opening 52 to the semiconductor region 32.
 Incident light guided to the opening 52 by the waveguide 201 is scattered by the substrate back-surface scattering portion 51 and enters the semiconductor region 32; light reaching the front surface side is scattered by the substrate front-surface scattering portion 131 or reflected by the surface-side reflective film 101 and thereby returned to the semiconductor region 32.
 Further, light entering or reflected within the semiconductor region 32 is reflected by the reflecting portions 112, 113, and 114 and remains in the semiconductor region 32.
 Since the surface-side reflective film 101 has a shape including sloped sides, the zero-order component of the reflected light can be reduced, and the light escaping from the opening 52 can be reduced. For these reasons, the incident light can be retained longer in the semiconductor region 32.
 By providing the waveguide 201 in this way, more incident light can be guided through the opening 52 into the semiconductor region 32. Further, as in the above-described embodiments, the amount of reflected light escaping from the opening 52 can be reduced, and the amount of light confined in the semiconductor region 32 can be increased. The sensitivity of the pixel 30q can therefore be improved.
 <Pixel configuration according to the sixteenth embodiment>
 FIG. 22 is a cross-sectional view showing a configuration example of the pixel 30r according to the sixteenth embodiment of the present technology.
 The pixel 30r according to the sixteenth embodiment is configured by adding the waveguide 201 to the pixel 30h according to the eighth embodiment shown in FIG. 11. Like the waveguide 201 of the pixel 30j according to the tenth embodiment shown in FIG. 13, the waveguide 201 is configured to guide incident light entering through the opening 52 to the semiconductor region 32.
 Since the light-refracting structure portion 152 is formed, incident light guided to the opening 52 by the waveguide 201 is refracted at the boundary between the light-refracting structure portion 152 and the semiconductor region 32 and enters the semiconductor region 32.
 Light that has entered the semiconductor region 32 and reached the front surface side is scattered by the substrate front-surface scattering portion 131 or reflected by the surface-side reflective film 34 and thereby returned to the semiconductor region 32. Some light is also reflected by the reflecting portion 114, giving a structure in which light tends to remain in the semiconductor region 32. The pixel 30r according to the sixteenth embodiment thus also has a structure capable of retaining incident light in the semiconductor region 32.
 By providing the waveguide 201 in this way, more incident light can be guided through the opening 52 into the semiconductor region 32. Further, as in the above-described embodiments, the amount of reflected light escaping from the opening 52 can be reduced, and the amount of light confined in the semiconductor region 32 can be increased. The sensitivity of the pixel 30r can therefore be improved.
 <Pixel configuration according to the seventeenth embodiment>
 FIG. 23 is a cross-sectional view showing a configuration example of the pixel 30s according to the seventeenth embodiment of the present technology.
 The pixel 30s according to the seventeenth embodiment is configured by adding the waveguide 201 to the pixel 30i according to the ninth embodiment shown in FIG. 12. Like the waveguide 201 of the pixel 30j according to the tenth embodiment shown in FIG. 13, the waveguide 201 is configured to guide incident light entering through the opening 52 to the semiconductor region 32.
 Since the light-refracting structure portion 152 is formed, incident light guided to the opening 52 by the waveguide 201 is refracted at the boundary between the light-refracting structure portion 152 and the semiconductor region 32 and enters the semiconductor region 32.
 Light that has entered the semiconductor region 32 and reached the front surface side is scattered by the substrate front-surface scattering portion 131 or reflected by the surface-side reflective film 34 and thereby returned to the semiconductor region 32. Some light is also reflected by the reflecting portion 115, giving a structure in which light tends to remain in the semiconductor region 32. The pixel 30s according to the seventeenth embodiment thus also has a structure capable of retaining incident light in the semiconductor region 32.
 By providing the waveguide 201 in this way, more incident light can be guided through the opening 52 into the semiconductor region 32. Further, as in the above-described embodiments, the amount of reflected light escaping from the opening 52 can be reduced, and the amount of light confined in the semiconductor region 32 can be increased. The sensitivity of the pixel 30s can therefore be improved.
 <Pixel configuration according to the eighteenth embodiment>
 FIG. 24 is a cross-sectional view showing a configuration example of the pixel 30t according to the eighteenth embodiment of the present technology.
 The pixel 30t according to the eighteenth embodiment includes a waveguide 301 corresponding to the waveguide 201 of the pixel 30j according to the tenth embodiment shown in FIG. 13, but differs in the shape of the waveguide and the position at which it is formed.
 The waveguide 301 is the same as the waveguide 201 of the pixel 30j according to the tenth embodiment shown in FIG. 13 in that it is configured to guide incident light entering through the opening 52 to the semiconductor region 32.
 The waveguide 301 is formed with a parallelogram cross section. Taking the upper side as the on-chip lens 37 side and the lower side as the opening 52 side, both the upper side and the lower side can be made approximately as long as the width of the opening 52.
 The center position P2 of the upper side of the waveguide 301 and the center position P1 of the on-chip lens 37 are offset from each other. In the example shown in FIG. 24, the position P2 is shifted to the left of the position P1.
 The center position P3 of the lower side of the waveguide 301 and the center position P1 of the on-chip lens 37 are also offset from each other. In the example shown in FIG. 24, the position P3 is shifted to the right of the position P1.
 Since the opening 52 is opened to match the position of the lower side of the waveguide 301, the center position of the opening 52 is also shifted to the right of the center position P1 of the on-chip lens 37.
 The waveguide 301 is formed in a shape tilted in an oblique direction. In the example shown in FIG. 24, with the lower side of the waveguide 301 as a reference, it is formed tilted toward the upper left.
 When the waveguide 301 is formed in such a tilted state, incident light travels as indicated by the arrows in the pixel 30t shown on the left side of FIG. 24. Incident light entering through the waveguide 301 strikes the right side surface of the semiconductor region 32 in the figure and is reflected. As shown in FIG. 24, when the waveguide 301 is tilted toward the upper left, more light travels toward the right side surface of the semiconductor region 32 in the figure.
 By configuring the waveguide 301 so that light travels toward the right or left side surface of the semiconductor region 32, the light striking the surface-side reflective film 34 perpendicularly, the so-called zero-order component, can be reduced. Accordingly, the light that is reflected by the surface-side reflective film 34 and escapes from the opening 52, that is, the zero-order component, can be reduced.
 Therefore, by providing the waveguide 301, more incident light can be guided through the opening 52 into the semiconductor region 32, and the amount of reflected light escaping from the opening 52 can be reduced. The amount of light confined in the semiconductor region 32 can thus be increased, and the sensitivity of the pixel 30t can be improved.
 The tilt angle of the waveguide 301 may differ depending on the position of the pixel 30t within the pixel array unit 10. For example, when the pixel array unit 10 is divided into a central region and a peripheral region, the tilt angle of the pixels 30t arranged in the central region may be made larger than the tilt angle of the pixels 30t arranged in the peripheral region.
 Further, the peripheral region may be divided into right, left, upper, and lower peripheral regions of the pixel array unit 10, and the pixels 30t may be tilted in different directions depending on their positions.
 In addition, since the pixels 30t arranged in the peripheral region of the pixel array unit 10 receive much incident light from oblique directions, they may be configured without the waveguide 301. In this case, the waveguide 301 can be formed in the pixels 30t located in the central region of the pixel array unit 10 and omitted in the pixels 30t located in the peripheral region. In such a case, the tilt angle of the waveguide 301 can be made smaller from the central region toward the peripheral region, and the waveguide 301 can be omitted in regions where the tilt angle becomes (close to) 90 degrees.
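The position-dependent arrangement described above can be sketched as a simple assignment rule. Everything in this sketch is hypothetical: the function name, the maximum tilt value, and the cutoff radius are illustrative assumptions; the document states only that the tilt may decrease from the central region toward the periphery and that peripheral pixels may omit the waveguide 301.

```python
from typing import Optional

def waveguide_tilt_deg(r: float,
                       max_tilt_deg: float = 20.0,
                       no_waveguide_beyond: float = 0.8) -> Optional[float]:
    """Hypothetical tilt assignment for the waveguide 301.
    r: normalized distance of the pixel from the array center
       (0.0 = center of the pixel array unit 10, 1.0 = corner).
    Returns None for peripheral pixels where no waveguide is formed."""
    if not 0.0 <= r <= 1.0:
        raise ValueError("r must be in [0, 1]")
    if r > no_waveguide_beyond:
        return None  # peripheral pixels already receive oblique light
    # Tilt shrinks linearly from the center toward the periphery.
    return max_tilt_deg * (1.0 - r / no_waveguide_beyond)
```

A per-side variant (right/left/upper/lower peripheral regions tilting in different directions) could extend this by also returning a tilt direction derived from the pixel's x/y offset from the array center.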
 Although the pixel 30t shown in FIG. 24 is illustrated without the substrate back surface scattering portion 109, it may be configured to include the substrate back surface scattering portion 51, as in, for example, the pixel 30a according to the first embodiment shown in FIG. 4.
 Further, as in the pixel 30b according to the second embodiment shown in FIG. 5, the pixel may include a surface-side reflective film 101 having a sloped shape. Further, as in the pixel 30f according to the sixth embodiment shown in FIG. 9, the pixel may include a substrate surface scattering portion 131. In this way, the pixel 30t according to the eighteenth embodiment can be combined with the pixels 30 according to the other embodiments.
 That is, the pixel 30t according to the eighteenth embodiment can be combined with the pixels 30a to 30i according to the first to ninth embodiments.
 Further, any of the pixels 30a to 30t according to the first to eighteenth embodiments may be applied depending on the position within the pixel array unit 10.
 <Pixel configuration according to the 19th embodiment>
 FIG. 25 is a cross-sectional view showing a configuration example of the pixel 30u according to the 19th embodiment of the present technology.
 The pixel 30u according to the nineteenth embodiment differs from the pixel 30t according to the eighteenth embodiment shown in FIG. 24 in that a reflecting portion 311 is added; it is otherwise the same.
 In the example shown in FIG. 25, the reflecting portion 311 is formed on the left side of the opening 52 (the lower edge of the waveguide 301) in the drawing. The reflecting portion 311 is formed at a position that does not obstruct the path of the incident light entering through the waveguide 301. The reflecting portion 311 is, for example, formed in a rod shape in cross section, like the reflecting portion 111 provided in the pixel 30c according to the third embodiment shown in FIG. 6. Within the semiconductor region 32, the reflecting portion 311 may be formed as a plane having a predetermined thickness, or may have a shape such as a cylinder or a polygonal prism.
 By providing the reflecting portion 311 in the semiconductor region 32, as illustrated in the pixel 30u on the left side of FIG. 25, the incident light strikes the right side surface of the semiconductor region 32 and travels toward the surface-side reflective film 34. It is then reflected by the surface-side reflective film 34, reflected by the side surface of the semiconductor region 32, strikes the reflecting portion 311, and is further reflected by the reflecting portion 311.
 The light reflected by the reflecting portion 311 travels again toward the side surface of the semiconductor region 32 and is reflected. This yields a structure in which light is reflected multiple times between the side surface of the semiconductor region 32 and the reflecting portion 311.
 In this way, by providing the waveguide 301, more incident light can be guided to the semiconductor region 32 through the opening 52. Further, by providing the reflecting portion 311, the amount of light that can be retained within the semiconductor region 32 can be increased, and the sensitivity of the pixel 30u can be improved.
 <Pixel configuration according to the 20th embodiment>
 FIG. 26 is a cross-sectional view showing a configuration example of the pixel 30v according to the 20th embodiment of the present technology.
 The pixel 30v according to the twentieth embodiment differs from the pixel 30t according to the eighteenth embodiment shown in FIG. 24 in that a reflecting portion 312 and a reflecting portion 313 are added; it is otherwise the same.
 The reflecting portion 312 is formed within the semiconductor region 32 extending from the back surface side toward the front surface side (from top to bottom in the drawing), and the reflecting portion 313 is formed within the semiconductor region 32 extending from the front surface side toward the back surface side (from bottom to top in the drawing). The reflecting portion 313 is formed at a position between the reflecting portion 312 and a side surface of the semiconductor region 32. That side surface is the one facing the side surface on which more light is collected via the waveguide 301 (the right side surface of the semiconductor region 32 in FIG. 26).
 By providing the waveguide 301, the reflecting portion 312, and the reflecting portion 313 in the semiconductor region 32, as illustrated in the pixel 30v on the left side of FIG. 26, the incident light travels toward the right side surface of the semiconductor region 32, is reflected, and proceeds toward the surface-side reflective film 34. It is then reflected by the surface-side reflective film 34, reflected by the reflecting portion 313, and reflected by the reflecting portion 312.
 The light reflected by the reflecting portion 312 is reflected by the back-surface-side reflective film 35, reflected by the side surface of the semiconductor region 32, and strikes the reflecting portion 313. In this way, a structure can be obtained in which light is reflected multiple times between the side surface of the semiconductor region 32 and the reflecting portion 312, between the reflecting portion 312 and the reflecting portion 313, and between the reflecting portion 313 and the side surface of the semiconductor region 32.
 In this way, by providing the waveguide 301, more incident light can be guided to the semiconductor region 32 through the opening 52. Further, by providing the reflecting portions 312 and 313, the amount of light that can be retained within the semiconductor region 32 can be increased, and the sensitivity of the pixel 30v can be improved.
 <Pixel configuration according to the 21st embodiment>
 FIG. 27 is a cross-sectional view showing a configuration example of the pixel 30w according to the 21st embodiment of the present technology.
 The pixel 30w according to the twenty-first embodiment differs from the pixel 30u according to the nineteenth embodiment shown in FIG. 25 in that a substrate back surface scattering portion 321 and a substrate surface scattering portion 322 are added; it is otherwise the same.
 The substrate back surface scattering portion 321 is located between the reflecting portion 311 and a side surface of the semiconductor region 32, and is formed in the region other than the portion where the waveguide 301 is located. The substrate surface scattering portion 322 is on the front surface side (the wiring region 33 side) and, like the substrate back surface scattering portion 321, is formed between (the assumed downward extension of) the reflecting portion 311 and the side surface of the semiconductor region 32.
 By providing the substrate back surface scattering portion 321 and the substrate surface scattering portion 322, light reaching either of them can be scattered, so that light can be confined in the semiconductor region 32, as in, for example, the pixel 30e according to the fifth embodiment shown in FIG. 8.
 In this way, by providing the waveguide 301, more incident light can be guided to the semiconductor region 32 through the opening 52. Further, by providing the reflecting portion 311, the substrate back surface scattering portion 321, and the substrate surface scattering portion 322, the incident light can be returned into the semiconductor region 32 and the amount of light that can be retained can be increased. Therefore, the sensitivity of the pixel 30w can be improved.
 The substrate back surface scattering portion 321 and the substrate surface scattering portion 322 may be provided so as to extend from one side surface of the semiconductor region 32 to the other, as in, for example, the pixel 30a according to the first embodiment shown in FIG. 4.
 As described above, the first to twenty-first embodiments can also be applied in combination. Further, the pixel array unit 10 may be configured such that pixels 30 according to different embodiments, among the pixels 30a to 30w according to the first to twenty-first embodiments, coexist.
 <Application examples>
 The pixels 30a to 30w according to the first to twenty-first embodiments described above have been described taking as an example the case where they are applied to an image pickup element that images a subject (hereinafter referred to as a normal pixel as appropriate). Besides normal pixels, the present technology can also be applied to, for example, the pixels described below.
 The pixels 30a to 30w according to the first to twenty-first embodiments described above can also be applied to pixels that perform distance measurement (hereinafter referred to as distance measuring pixels as appropriate). FIG. 28 shows a configuration in which the pixel 30j according to the tenth embodiment is applied to a distance measuring pixel. The distance measuring pixel shown in FIG. 28 has a so-called two-tap pixel structure that has, for one photodiode PD, two transfer transistors TRG1 and TRG2 as transfer gates and two floating diffusion regions FD1 and FD2 as charge storage units, and distributes the charge generated by the photodiode PD to the two floating diffusion regions FD1 and FD2.
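 The disclosure does not give the distance arithmetic, but a two-tap structure of this kind is commonly read out as an indirect time-of-flight pixel: each exposure fills both taps, and two exposures with 90-degree-shifted demodulation gates yield four phase samples. The following sketch shows that standard computation; the modulation frequency, function names, and the synthesized samples are illustrative assumptions, not part of this disclosure.

```python
import math

C = 299_792_458.0          # speed of light, m/s
F_MOD = 20e6               # assumed modulation frequency, Hz

def two_tap_depth(a0, a180, a90, a270):
    """Distance from four phase samples of a two-tap demodulation pixel.

    (a0, a180) come from the two taps (FD1, FD2) of the first exposure and
    (a90, a270) from a second exposure with 90-degree-shifted gates.
    """
    phase = math.atan2(a90 - a270, a0 - a180)  # phase delay in [-pi, pi]
    if phase < 0:
        phase += 2 * math.pi
    return C * phase / (4 * math.pi * F_MOD)   # round trip: factor 4*pi

# Example: samples synthesized for a target at 1.5 m
d_true = 1.5
ph = 4 * math.pi * F_MOD * d_true / C
a0, a180 = math.cos(ph), -math.cos(ph)
a90, a270 = math.sin(ph), -math.sin(ph)
print(round(two_tap_depth(a0, a180, a90, a270), 3))  # -> 1.5
```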
 The surface-side reflective film 101 is formed between the two transfer transistors TRG1 and TRG2. In plan view, as shown in FIG. 29, the surface-side reflective film 101 is formed near the center of the photodiode PD.
 FIG. 29 is a plan view showing an example arrangement of the transistors and other elements of the pixel 30j shown in FIG. 28. As shown in FIG. 29, the photodiode PD is formed by the N-type semiconductor region 32 in the central region of the rectangular pixel 30j. The surface-side reflective film 101 is formed at the center of the photodiode PD.
 Outside the photodiode PD, the transfer transistor TRG1, switching transistor FDG1, reset transistor RST1, amplification transistor AMP1, and selection transistor SEL1 are arranged in a straight line along a predetermined one of the four sides of the rectangular pixel 30j, and the transfer transistor TRG2, switching transistor FDG2, reset transistor RST2, amplification transistor AMP2, and selection transistor SEL2 are arranged in a straight line along another of the four sides of the rectangular pixel 30.
 Further, a charge discharge transistor OFG is arranged on a side different from the two sides of the pixel 30 on which the transfer transistors TRG, switching transistors FDG, reset transistors RST, amplification transistors AMP, and selection transistors SEL are formed.
 Note that the arrangement of the pixel circuit shown in FIG. 29 is not limited to this example, and other arrangements may be used. Although the configuration shown here applies the pixel 30j according to the tenth embodiment to a distance measuring pixel, such an arrangement can also be applied when pixels of other embodiments are used as distance measuring pixels.
 FIG. 30 shows a configuration in which the pixel 30w according to the nineteenth embodiment is applied to a distance measuring pixel. Since the distance measuring pixel shown in FIG. 30 also illustrates a pixel that performs distance measurement by the two-tap method, it has, for one photodiode PD, two transfer transistors TRG1 and TRG2 as transfer gates and two floating diffusion regions FD1 and FD2 as charge storage units.
 As shown in FIG. 30, when the waveguide 301 is formed in an inclined state, the reflecting portion 311 is provided at a position avoiding the opening 52, as shown in FIG. 31.
 Although the configuration shown here applies the pixel 30w according to the nineteenth embodiment to a distance measuring pixel, such an arrangement can also be applied when pixels of other embodiments are used as distance measuring pixels.
 When a pixel includes a floating diffusion region FD, as in the pixels shown in FIGS. 29 to 31, it is necessary to prevent stray light components from entering the floating diffusion region FD. FIG. 32 shows a configuration in which the pixel 30j according to the tenth embodiment is applied to a normal pixel having a floating diffusion region FD. The pixel 30j shown in FIG. 32 includes the floating diffusion region FD on the lower right side in the drawing, and includes a transfer gate TRG that transfers charge from the photodiode PD to the floating diffusion region FD.
 When the floating diffusion region FD is formed on the right side in the drawing, as shown in FIG. 32, the surface-side reflective film 101 is formed with an inclination that rises toward the floating diffusion region FD side. The surface-side reflective film 101 having such an inclination is formed, for example, near the center of the photodiode PD in plan view, as in the case shown in FIG. 29.
 The inclined surface of the surface-side reflective film 101 is formed so as not to face the floating diffusion region FD. Therefore, as shown in FIG. 32, the incident light strikes the inclined surface of the surface-side reflective film 101 and travels toward the side surface of the semiconductor region 32 opposite the side on which the floating diffusion region FD is formed. By forming the surface-side reflective film 101 so that its inclined surface does not face the floating diffusion region FD, light traveling toward the floating diffusion region FD can be reduced.
 FIG. 33 shows a configuration in which the pixel 30t according to the eighteenth embodiment of the present technology is applied to a normal pixel having a floating diffusion region FD. The pixel 30t shown in FIG. 33 includes the floating diffusion region FD on the lower left side in the drawing, and includes a transfer gate TRG that transfers charge from the photodiode PD to the floating diffusion region FD.
 When the floating diffusion region FD is formed on the left side in the drawing, as shown in FIG. 33, a reflecting portion 351 is added on the floating diffusion region FD side (in its vicinity). The reflecting portion 351 is formed so as to extend vertically from the surface-side reflective film 34. The reflecting portion 351 and the surface-side reflective film 34 may be formed integrally, or may be formed separately so that they are not in contact with each other.
 The surface-side reflective film 34 having such a reflecting portion 351 is formed near the center of the photodiode PD, for example, as in the case shown in FIG. 29 (the portion denoted as the surface-side reflective film 101 in FIG. 30). The reflecting portion 351 may be configured to surround the floating diffusion region FD.
 By providing the reflecting portion 351, as shown in FIG. 33, the incident light is guided by the waveguide 301 toward the right side surface of the semiconductor region 32 in the drawing, in other words, the side surface facing the side on which the floating diffusion region FD is formed. The incident light is then reflected by the side surface of the semiconductor region 32, strikes the surface-side reflective film 34, is reflected, and strikes the reflecting portion 351. Without the reflecting portion 351, the reflected light would travel toward the floating diffusion region FD; providing the reflecting portion 351 prevents the reflected light from reaching the floating diffusion region FD.
 The pixels 30a to 30w according to the first to twenty-first embodiments can also be applied to pixels provided with a memory. FIG. 34 is a diagram showing the configuration of a pixel 30x in which the present technology is applied to a distance measuring pixel provided with a memory. The pixel 30x also illustrates the case of a two-tap distance measuring pixel.
 The pixel 30x shown in FIG. 34 includes the waveguide 201 and the light refraction structure portion 152, like the pixel 30r shown in FIG. 21. The semiconductor substrate 31 on the wiring region 33 side of the pixel 30x is provided with a floating diffusion region FD1, a memory region Mem1, a transfer gate TRG1, and a transfer gate TRG1', as well as a floating diffusion region FD2, a memory region Mem2, a transfer gate TRG2, and a transfer gate TRG2'.
 The charge from the photodiode PD is transferred to the memory region Mem1 by the transfer gate TRG1. The charge transferred to the memory region Mem1 is transferred to the floating diffusion region FD1 by the transfer gate TRG1'. Similarly, the charge from the photodiode PD is transferred to the memory region Mem2 by the transfer gate TRG2, and the charge transferred to the memory region Mem2 is transferred to the floating diffusion region FD2 by the transfer gate TRG2'.
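 The two-stage transfer sequence just described (PD to Mem by TRG, then Mem to FD by TRG') can be sketched as moving charge packets between named wells. This is an assumed toy model for illustrating the gate sequence only; the photon counts and the lossless complete-transfer behavior are simplifying assumptions, not statements from this disclosure.

```python
# Charge wells of the pixel, all initially empty (counts in electrons)
wells = {"PD": 0, "Mem1": 0, "FD1": 0, "Mem2": 0, "FD2": 0}

def integrate(electrons):
    wells["PD"] += electrons             # photoelectrons accumulate in the PD

def transfer(src, dst):
    wells[dst] += wells[src]             # a gate pulse moves the whole packet
    wells[src] = 0

integrate(100)
transfer("PD", "Mem1")                   # TRG1:  PD   -> Mem1
integrate(40)
transfer("PD", "Mem2")                   # TRG2:  PD   -> Mem2
transfer("Mem1", "FD1")                  # TRG1': Mem1 -> FD1 for readout
transfer("Mem2", "FD2")                  # TRG2': Mem2 -> FD2 for readout
print(wells["FD1"], wells["FD2"])        # -> 100 40
```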
 The transfer gate TRG1 and the transfer gate TRG2 are each formed by (the gate of) a vertical transistor. The gate of a vertical transistor is, as shown in FIG. 34, a gate having wiring provided vertically (in the up-down direction in the drawing) that extends close to the semiconductor region 32 constituting the photodiode PD. Although FIG. 34 shows an example in which the wiring extends close to the semiconductor region 32, the configuration may be such that the wiring extends into the semiconductor region 32.
 When the memory regions Mem and floating diffusion regions FD are provided in this way, a reflecting portion 361 is formed above the memory regions Mem and floating diffusion regions FD so that stray light components do not enter them. The reflecting portion 361 is formed at the boundary between the semiconductor region 32 and the semiconductor substrate 31. In plan view, as shown in FIG. 35, the reflecting portion 361 occupies a region avoiding the gates TRG1 and TRG2 of the vertical transistors and is formed within the semiconductor region 32.
 In the configuration shown in FIG. 34, if light were to pass through the region where the memory regions Mem and the like are provided and reach the wiring region 33 side, and that light were then reflected back into the region where the memory regions Mem and the like are provided, a stray light component would be generated. Therefore, in the configuration shown in FIG. 34, no film corresponding to the surface-side reflective film 34 or the surface-side reflective film 101 is provided in the wiring region 33.
 By providing the reflecting portion 361, as shown in FIG. 34, the incident light is refracted by the light refraction structure portion 152, travels toward the side surface of the semiconductor region 32, and is reflected. The light reflected by the side surface of the semiconductor region 32 strikes the reflecting portion 361 and is reflected. Since light is reflected by the reflecting portion 361, it can be prevented from entering the memory regions Mem and floating diffusion regions FD arranged below the reflecting portion 361.
 FIG. 36 is a diagram showing a configuration example of a pixel 30y in which the present technology is applied to a distance measuring pixel provided with a memory, like the pixel 30x shown in FIG. 34. The pixel 30y shown in FIG. 36, like the pixel 30x, illustrates the case of a two-tap distance measuring pixel.
 The pixel 30y shown in FIG. 36 includes the waveguide 301 and the reflecting portion 311, like the pixel 30w shown in FIG. 26. Like the pixel 30x, the pixel 30y includes, in the semiconductor substrate 31 on the wiring region 33 side, a floating diffusion region FD1, a memory region Mem1, a transfer gate TRG1, and a transfer gate TRG1', as well as a floating diffusion region FD2, a memory region Mem2, a transfer gate TRG2, and a transfer gate TRG2'.
 When the waveguide 301 is provided, as in the case where the waveguide 201 is provided (that is, as in the pixel 30x), the reflecting portion 361 is formed above the memory regions Mem and floating diffusion regions FD so that stray light components do not enter them. In plan view, as shown in FIG. 35, the reflecting portion 361 occupies a region avoiding the gates TRG1 and TRG2 of the vertical transistors and is formed within the semiconductor region 32.
 By providing the reflecting portion 361, as shown in FIG. 36, the incident light is guided by the waveguide 301, travels toward the side surface of the semiconductor region 32, and is reflected. The light reflected by the side surface of the semiconductor region 32 strikes the reflecting portion 361 and is reflected. Since light is reflected by the reflecting portion 361, it can be prevented from entering the memory regions Mem and floating diffusion regions FD arranged below the reflecting portion 361.
 Note that a SPAD (single-photon avalanche diode) can also be used as a distance measuring pixel, and the present technology can also be applied to a SPAD.
 According to the present technology, more incident light can be collected in the photodiode. Further, the light incident on the photodiode can be retained within the photodiode, and the light escaping out of the photodiode can be reduced. Therefore, QE loss (loss of quantum efficiency) can be reduced. Further, since providing a waveguide makes it possible to pass light through the opening even with a low profile, the pixel height can be reduced and the pixel can be miniaturized.
 <Example of application to a camera>
 The technology according to the present disclosure (the present technology) can be applied to various products. For example, the present technology may be realized as an image pickup element mounted on an imaging device such as a camera.
 FIG. 37 is a block diagram showing a schematic configuration example of a camera, which is an example of an imaging device to which the present technology can be applied. The camera 1000 in the figure includes a lens 1001, an image pickup element 1002, an imaging control unit 1003, a lens drive unit 1004, an image processing unit 1005, an operation input unit 1006, a frame memory 1007, a display unit 1008, and a recording unit 1009.
 The lens 1001 is the photographing lens of the camera 1000. The lens 1001 collects light from a subject and makes it incident on the image pickup element 1002 described later to form an image of the subject.
 The image pickup element 1002 is a semiconductor element that captures the light from the subject collected by the lens 1001. The image pickup element 1002 generates an analog image signal corresponding to the received light, converts it into a digital image signal, and outputs it.
 The imaging control unit 1003 controls imaging in the image pickup element 1002. The imaging control unit 1003 controls the image pickup element 1002 by generating a control signal and outputting it to the image pickup element 1002. Further, the imaging control unit 1003 can perform autofocus in the camera 1000 based on the image signal output from the image pickup element 1002.
 Here, autofocus is a system that detects the focal position of the lens 1001 and automatically adjusts it. As this autofocus, a method of detecting the focal position by detecting the image plane phase difference with phase difference pixels arranged in the image pickup element 1002 (image plane phase difference autofocus) can be used. A method of detecting, as the focal position, the position at which the contrast of the image is highest (contrast autofocus) can also be applied. The imaging control unit 1003 adjusts the position of the lens 1001 via the lens drive unit 1004 based on the detected focal position and performs autofocus. Note that the imaging control unit 1003 can be configured by, for example, a DSP (Digital Signal Processor) equipped with firmware.
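 Contrast autofocus as described above can be sketched as a search over lens positions, scoring each position by an image sharpness metric. The metric, the search strategy, and the simulated lens below are illustrative assumptions; they stand in for the image pickup element 1002 and lens drive unit 1004 and are not taken from this disclosure.

```python
def contrast_metric(image_rows):
    """Sum of squared horizontal pixel differences: higher = sharper image."""
    score = 0
    for row in image_rows:
        score += sum((b - a) ** 2 for a, b in zip(row, row[1:]))
    return score

def contrast_autofocus(capture_at, positions):
    """Return the lens position whose captured image scores highest."""
    return max(positions, key=lambda p: contrast_metric(capture_at(p)))

# Simulated lens: the image has the largest pixel steps (is sharpest) at
# position 7; moving away from 7 flattens the gradient like defocus blur.
def capture_at(pos):
    blur = abs(pos - 7) + 1
    step = 64 // blur
    return [[x * step for x in range(16)]]

print(contrast_autofocus(capture_at, range(15)))  # -> 7
```

 A real implementation would step the motor and stop when the metric begins to fall rather than scan every position.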
 レンズ駆動部1004は、撮像制御部1003の制御に基づいて、レンズ1001を駆動するものである。このレンズ駆動部1004は、内蔵するモータを使用してレンズ1001の位置を変更することによりレンズ1001を駆動することができる。 The lens driving unit 1004 drives the lens 1001 based on the control of the imaging control unit 1003. The lens driving unit 1004 can drive the lens 1001 by changing the position of the lens 1001 using a built-in motor.
 画像処理部1005は、撮像素子1002により生成された画像信号を処理するものである。この処理には、例えば、画素毎の赤色、緑色および青色に対応する画像信号のうち不足する色の画像信号を生成するデモザイク、画像信号のノイズを除去するノイズリダクションおよび画像信号の符号化等が該当する。画像処理部1005は、例えば、ファームウェアを搭載したマイコンにより構成することができる。 The image processing unit 1005 processes the image signal generated by the image sensor 1002. This processing includes, for example, demosaicing, which generates the image signal of the missing colors among the image signals corresponding to red, green, and blue for each pixel, noise reduction, which removes noise from the image signal, and encoding of the image signal. The image processing unit 1005 can be configured by, for example, a microcomputer equipped with firmware.
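The demosaicing mentioned above, which fills in the missing color channels at each pixel, can be sketched for an RGGB Bayer mosaic as follows. This is a simplified bilinear-style interpolation for illustration only; an actual image processing unit would use a more elaborate, edge-aware algorithm.

```python
# Demosaic sketch for an RGGB Bayer mosaic: each output pixel's missing
# color channels are filled in by averaging the nearest samples of that color.

def demosaic_rggb(raw):
    h, w = len(raw), len(raw[0])

    def clamp(y, x):
        # replicate border samples so neighbor lookups never run off the edge
        return raw[min(max(y, 0), h - 1)][min(max(x, 0), w - 1)]

    def avg(coords):
        vals = [clamp(y, x) for y, x in coords]
        return sum(vals) // len(vals)

    rgb = [[None] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            even_y, even_x = y % 2 == 0, x % 2 == 0
            if even_y and even_x:        # red sample site
                r = raw[y][x]
                g = avg([(y, x - 1), (y, x + 1), (y - 1, x), (y + 1, x)])
                b = avg([(y - 1, x - 1), (y - 1, x + 1), (y + 1, x - 1), (y + 1, x + 1)])
            elif even_y:                 # green sample site on a red row
                g = raw[y][x]
                r = avg([(y, x - 1), (y, x + 1)])
                b = avg([(y - 1, x), (y + 1, x)])
            elif even_x:                 # green sample site on a blue row
                g = raw[y][x]
                r = avg([(y - 1, x), (y + 1, x)])
                b = avg([(y, x - 1), (y, x + 1)])
            else:                        # blue sample site
                b = raw[y][x]
                g = avg([(y, x - 1), (y, x + 1), (y - 1, x), (y + 1, x)])
                r = avg([(y - 1, x - 1), (y - 1, x + 1), (y + 1, x - 1), (y + 1, x + 1)])
            rgb[y][x] = (r, g, b)
    return rgb

out = demosaic_rggb([[100] * 4 for _ in range(4)])
print(out[0][0])  # → (100, 100, 100): a flat gray mosaic stays gray
```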
 操作入力部1006は、カメラ1000の使用者からの操作入力を受け付けるものである。この操作入力部1006には、例えば、押しボタンやタッチパネルを使用することができる。操作入力部1006により受け付けられた操作入力は、撮像制御部1003や画像処理部1005に伝達される。その後、操作入力に応じた処理、例えば、被写体の撮像等の処理が起動される。 The operation input unit 1006 receives the operation input from the user of the camera 1000. For example, a push button or a touch panel can be used for the operation input unit 1006. The operation input received by the operation input unit 1006 is transmitted to the image pickup control unit 1003 and the image processing unit 1005. After that, processing according to the operation input, for example, processing such as imaging of the subject is activated.
 フレームメモリ1007は、1画面分の画像信号であるフレームを記憶するメモリである。このフレームメモリ1007は、画像処理部1005により制御され、画像処理の過程におけるフレームの保持を行う。 The frame memory 1007 is a memory that stores a frame that is an image signal for one screen. The frame memory 1007 is controlled by the image processing unit 1005 and holds frames in the process of image processing.
 表示部1008は、画像処理部1005により処理された画像を表示するものである。この表示部1008には、例えば、液晶パネルを使用することができる。 The display unit 1008 displays the image processed by the image processing unit 1005. For this display unit 1008, for example, a liquid crystal panel can be used.
 記録部1009は、画像処理部1005により処理された画像を記録するものである。この記録部1009には、例えば、メモリカードやハードディスクを使用することができる。 The recording unit 1009 records the image processed by the image processing unit 1005. For example, a memory card or a hard disk can be used for the recording unit 1009.
 以上、本開示が適用され得るカメラについて説明した。本技術は以上において説明した構成のうち、撮像素子1002に適用され得る。具体的には、図1において説明した撮像素子1は、撮像素子1002に適用することができる。撮像素子1002に撮像素子1を適用することにより反射光が低減され、カメラ1000により生成される画像の画質の低下を防止することができる。 The cameras to which this disclosure can be applied have been described above. The present technology can be applied to the image pickup device 1002 among the configurations described above. Specifically, the image pickup device 1 described with reference to FIG. 1 can be applied to the image pickup device 1002. By applying the image sensor 1 to the image sensor 1002, the reflected light is reduced, and it is possible to prevent deterioration of the image quality of the image generated by the camera 1000.
 なお、ここでは、一例としてカメラについて説明したが、本開示に係る技術は、その他、例えば、距離センサ等に適用されてもよい。また、本開示は、カメラ等の電子機器の他に、半導体モジュールの形式の半導体装置に適用することもできる。具体的には、図37の撮像素子1002および撮像制御部1003を1つのパッケージに封入した半導体モジュールである撮像モジュールに本開示に係る技術を適用することもできる。 Although the camera has been described here as an example, the technology according to the present disclosure may be applied to other devices such as a distance sensor. Further, the present disclosure can be applied to a semiconductor device in the form of a semiconductor module in addition to an electronic device such as a camera. Specifically, the technique according to the present disclosure can be applied to an image pickup module which is a semiconductor module in which the image pickup device 1002 and the image pickup control unit 1003 of FIG. 37 are enclosed in one package.
 ＜内視鏡手術システムへの応用例＞ <Example of application to an endoscopic surgery system>
 本開示に係る技術(本技術)は、様々な製品へ応用することができる。例えば、本開示に係る技術は、内視鏡手術システムに適用されてもよい。 The technology according to the present disclosure (the present technology) can be applied to various products. For example, the technology according to the present disclosure may be applied to an endoscopic surgery system.
 図38は、本開示に係る技術(本技術)が適用され得る内視鏡手術システムの概略的な構成の一例を示す図である。 FIG. 38 is a diagram showing an example of a schematic configuration of an endoscopic surgery system to which the technique according to the present disclosure (the present technique) can be applied.
 図38では、術者(医師)11131が、内視鏡手術システム11000を用いて、患者ベッド11133上の患者11132に手術を行っている様子が図示されている。図示するように、内視鏡手術システム11000は、内視鏡11100と、気腹チューブ11111やエネルギー処置具11112等の、その他の術具11110と、内視鏡11100を支持する支持アーム装置11120と、内視鏡下手術のための各種の装置が搭載されたカート11200と、から構成される。 FIG. 38 shows a surgeon (doctor) 11131 performing surgery on a patient 11132 on a patient bed 11133 using the endoscopic surgery system 11000. As illustrated, the endoscopic surgery system 11000 is composed of an endoscope 11100, other surgical tools 11110 such as a pneumoperitoneum tube 11111 and an energy treatment tool 11112, a support arm device 11120 that supports the endoscope 11100, and a cart 11200 on which various devices for endoscopic surgery are mounted.
 内視鏡11100は、先端から所定の長さの領域が患者11132の体腔内に挿入される鏡筒11101と、鏡筒11101の基端に接続されるカメラヘッド11102と、から構成される。図示する例では、硬性の鏡筒11101を有するいわゆる硬性鏡として構成される内視鏡11100を図示しているが、内視鏡11100は、軟性の鏡筒を有するいわゆる軟性鏡として構成されてもよい。 The endoscope 11100 is composed of a lens barrel 11101, a region of which of a predetermined length from the tip is inserted into the body cavity of the patient 11132, and a camera head 11102 connected to the base end of the lens barrel 11101. In the illustrated example, the endoscope 11100 is configured as a so-called rigid endoscope having a rigid lens barrel 11101, but the endoscope 11100 may be configured as a so-called flexible endoscope having a flexible lens barrel.
 鏡筒11101の先端には、対物レンズが嵌め込まれた開口部が設けられている。内視鏡11100には光源装置11203が接続されており、当該光源装置11203によって生成された光が、鏡筒11101の内部に延設されるライトガイドによって当該鏡筒の先端まで導光され、対物レンズを介して患者11132の体腔内の観察対象に向かって照射される。なお、内視鏡11100は、直視鏡であってもよいし、斜視鏡又は側視鏡であってもよい。 An opening into which an objective lens is fitted is provided at the tip of the lens barrel 11101. A light source device 11203 is connected to the endoscope 11100; light generated by the light source device 11203 is guided to the tip of the lens barrel by a light guide extending inside the lens barrel 11101, and is emitted through the objective lens toward the observation target in the body cavity of the patient 11132. The endoscope 11100 may be a forward-viewing endoscope, an oblique-viewing endoscope, or a side-viewing endoscope.
 カメラヘッド11102の内部には光学系及び撮像素子が設けられており、観察対象からの反射光(観察光)は当該光学系によって当該撮像素子に集光される。当該撮像素子によって観察光が光電変換され、観察光に対応する電気信号、すなわち観察像に対応する画像信号が生成される。当該画像信号は、RAWデータとしてカメラコントロールユニット(CCU: Camera Control Unit)11201に送信される。 An optical system and an image pickup element are provided inside the camera head 11102, and the reflected light (observation light) from the observation target is focused on the image pickup element by the optical system. The observation light is photoelectrically converted by the image pickup device, and an electric signal corresponding to the observation light, that is, an image signal corresponding to the observation image is generated. The image signal is transmitted as RAW data to the camera control unit (CCU: Camera Control Unit) 11201.
 CCU11201は、CPU(Central Processing Unit)やGPU(Graphics Processing Unit)等によって構成され、内視鏡11100及び表示装置11202の動作を統括的に制御する。さらに、CCU11201は、カメラヘッド11102から画像信号を受け取り、その画像信号に対して、例えば現像処理(デモザイク処理)等の、当該画像信号に基づく画像を表示するための各種の画像処理を施す。 The CCU11201 is composed of a CPU (Central Processing Unit), a GPU (Graphics Processing Unit), etc., and comprehensively controls the operations of the endoscope 11100 and the display device 11202. Further, the CCU 11201 receives an image signal from the camera head 11102, and performs various image processing on the image signal for displaying an image based on the image signal, such as development processing (demosaic processing).
 表示装置11202は、CCU11201からの制御により、当該CCU11201によって画像処理が施された画像信号に基づく画像を表示する。 The display device 11202 displays an image based on the image signal processed by the CCU 11201 under the control of the CCU 11201.
 光源装置11203は、例えばLED(light emitting diode)等の光源から構成され、術部等を撮影する際の照射光を内視鏡11100に供給する。 The light source device 11203 is composed of a light source such as an LED (light emitting diode), for example, and supplies irradiation light to the endoscope 11100 when imaging the surgical site or the like.
 入力装置11204は、内視鏡手術システム11000に対する入力インタフェースである。ユーザは、入力装置11204を介して、内視鏡手術システム11000に対して各種の情報の入力や指示入力を行うことができる。例えば、ユーザは、内視鏡11100による撮像条件(照射光の種類、倍率及び焦点距離等)を変更する旨の指示等を入力する。 The input device 11204 is an input interface for the endoscopic surgery system 11000. The user can input various information and input instructions to the endoscopic surgery system 11000 via the input device 11204. For example, the user inputs an instruction to change the imaging conditions (type of irradiation light, magnification, focal length, etc.) by the endoscope 11100.
 処置具制御装置11205は、組織の焼灼、切開又は血管の封止等のためのエネルギー処置具11112の駆動を制御する。気腹装置11206は、内視鏡11100による視野の確保及び術者の作業空間の確保の目的で、患者11132の体腔を膨らめるために、気腹チューブ11111を介して当該体腔内にガスを送り込む。レコーダ11207は、手術に関する各種の情報を記録可能な装置である。プリンタ11208は、手術に関する各種の情報を、テキスト、画像又はグラフ等各種の形式で印刷可能な装置である。 The treatment tool control device 11205 controls the driving of the energy treatment tool 11112 for cauterizing or incising tissue, sealing blood vessels, or the like. The pneumoperitoneum device 11206 sends gas into the body cavity of the patient 11132 through the pneumoperitoneum tube 11111 in order to inflate the body cavity, for the purpose of securing the field of view of the endoscope 11100 and securing the working space of the operator. The recorder 11207 is a device capable of recording various kinds of information related to the surgery. The printer 11208 is a device capable of printing various kinds of information related to the surgery in various formats such as text, images, and graphs.
 なお、内視鏡11100に術部を撮影する際の照射光を供給する光源装置11203は、例えばLED、レーザ光源又はこれらの組み合わせによって構成される白色光源から構成することができる。RGBレーザ光源の組み合わせにより白色光源が構成される場合には、各色(各波長)の出力強度及び出力タイミングを高精度に制御することができるため、光源装置11203において撮像画像のホワイトバランスの調整を行うことができる。また、この場合には、RGBレーザ光源それぞれからのレーザ光を時分割で観察対象に照射し、その照射タイミングに同期してカメラヘッド11102の撮像素子の駆動を制御することにより、RGBそれぞれに対応した画像を時分割で撮像することも可能である。当該方法によれば、当該撮像素子にカラーフィルタを設けなくても、カラー画像を得ることができる。 The light source device 11203, which supplies the endoscope 11100 with irradiation light for imaging the surgical site, can be composed of, for example, an LED, a laser light source, or a white light source composed of a combination thereof. When the white light source is composed of a combination of RGB laser light sources, the output intensity and output timing of each color (each wavelength) can be controlled with high accuracy, so the white balance of the captured image can be adjusted in the light source device 11203. In this case, it is also possible to irradiate the observation target with laser light from each of the RGB laser light sources in a time-division manner and to control the driving of the image sensor of the camera head 11102 in synchronization with the irradiation timing, thereby capturing images corresponding to each of R, G, and B in a time-division manner. According to this method, a color image can be obtained without providing a color filter on the image sensor.
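The time-division RGB imaging described above can be sketched as follows: three monochrome frames, each captured while only one of the R, G, and B lasers illuminates the scene, are combined into one color image with no color filter on the image sensor. The function name and data layout are illustrative assumptions.

```python
# Frame-sequential color sketch: zip three monochrome frames (captured under
# R-only, G-only, and B-only illumination) into one per-pixel RGB image.

def combine_rgb_frames(frame_r, frame_g, frame_b):
    return [
        [(r, g, b) for r, g, b in zip(row_r, row_g, row_b)]
        for row_r, row_g, row_b in zip(frame_r, frame_g, frame_b)
    ]

# A red-ish pixel and a green-ish pixel, each seen as intensity in 3 frames:
color = combine_rgb_frames([[200, 0]], [[10, 150]], [[5, 30]])
print(color)  # → [[(200, 10, 5), (0, 150, 30)]]
```

Motion between the three sub-frames would produce color fringing in practice, which is why the frame capture must be synchronized tightly with the laser switching timing, as the passage notes.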
 また、光源装置11203は、出力する光の強度を所定の時間ごとに変更するようにその駆動が制御されてもよい。その光の強度の変更のタイミングに同期してカメラヘッド11102の撮像素子の駆動を制御して時分割で画像を取得し、その画像を合成することにより、いわゆる黒つぶれ及び白とびのない高ダイナミックレンジの画像を生成することができる。 Further, the driving of the light source device 11203 may be controlled so as to change the intensity of the output light at predetermined time intervals. By controlling the driving of the image sensor of the camera head 11102 in synchronization with the timing of the change in light intensity to acquire images in a time-division manner and combining the images, an image with a high dynamic range free of blocked-up shadows and blown-out highlights can be generated.
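The high-dynamic-range synthesis described above can be sketched as follows: per pixel, a well-exposed sample is chosen from frames acquired under different light intensities and normalized by its relative exposure. The selection rule and the exposure model are simplifying assumptions for illustration, not the disclosed implementation.

```python
# HDR merge sketch: from frames taken at different relative exposures, pick
# the brightest non-saturated sample per pixel (best SNR) and divide by its
# exposure to estimate scene radiance in arbitrary units.

def merge_hdr(frames, exposures, saturation=255):
    """frames: list of 2-D pixel arrays; exposures: relative exposure of each."""
    h, w = len(frames[0]), len(frames[0][0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            best = None
            for frame, exp in zip(frames, exposures):
                v = frame[y][x]
                if v < saturation and (best is None or v > best[0]):
                    best = (v, exp)
            if best is None:                 # saturated in every frame
                best = (saturation, max(exposures))
            out[y][x] = best[0] / best[1]
    return out

short = [[10, 200]]   # 1x exposure: dark region underexposed, bright region OK
long_ = [[40, 255]]   # 4x exposure: dark region recovered, bright region clipped
hdr = merge_hdr([short, long_], [1, 4])
print(hdr)  # → [[10.0, 200.0]]: each pixel taken from its best-exposed frame
```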
 また、光源装置11203は、特殊光観察に対応した所定の波長帯域の光を供給可能に構成されてもよい。特殊光観察では、例えば、体組織における光の吸収の波長依存性を利用して、通常の観察時における照射光(すなわち、白色光)に比べて狭帯域の光を照射することにより、粘膜表層の血管等の所定の組織を高コントラストで撮影する、いわゆる狭帯域光観察(Narrow Band Imaging)が行われる。あるいは、特殊光観察では、励起光を照射することにより発生する蛍光により画像を得る蛍光観察が行われてもよい。蛍光観察では、体組織に励起光を照射し当該体組織からの蛍光を観察すること(自家蛍光観察)、又はインドシアニングリーン(ICG)等の試薬を体組織に局注するとともに当該体組織にその試薬の蛍光波長に対応した励起光を照射し蛍光像を得ること等を行うことができる。光源装置11203は、このような特殊光観察に対応した狭帯域光及び/又は励起光を供給可能に構成され得る。 Further, the light source device 11203 may be configured to be able to supply light in a predetermined wavelength band corresponding to special light observation. In special light observation, for example, by utilizing the wavelength dependence of light absorption in body tissue, light in a band narrower than the irradiation light used in normal observation (that is, white light) is applied, whereby so-called narrow band imaging is performed, in which predetermined tissue such as blood vessels in the mucosal surface layer is imaged with high contrast. Alternatively, in special light observation, fluorescence observation may be performed in which an image is obtained from fluorescence generated by irradiation with excitation light. In fluorescence observation, body tissue can be irradiated with excitation light and the fluorescence from that body tissue observed (autofluorescence observation), or a reagent such as indocyanine green (ICG) can be locally injected into body tissue and the body tissue irradiated with excitation light corresponding to the fluorescence wavelength of that reagent to obtain a fluorescence image. The light source device 11203 may be configured to be able to supply narrow band light and/or excitation light corresponding to such special light observation.
 図39は、図38に示すカメラヘッド11102及びCCU11201の機能構成の一例を示すブロック図である。 FIG. 39 is a block diagram showing an example of the functional configuration of the camera head 11102 and CCU11201 shown in FIG. 38.
 カメラヘッド11102は、レンズユニット11401と、撮像部11402と、駆動部11403と、通信部11404と、カメラヘッド制御部11405と、を有する。CCU11201は、通信部11411と、画像処理部11412と、制御部11413と、を有する。カメラヘッド11102とCCU11201とは、伝送ケーブル11400によって互いに通信可能に接続されている。 The camera head 11102 includes a lens unit 11401, an imaging unit 11402, a driving unit 11403, a communication unit 11404, and a camera head control unit 11405. CCU11201 includes a communication unit 11411, an image processing unit 11412, and a control unit 11413. The camera head 11102 and CCU11201 are communicatively connected to each other by a transmission cable 11400.
 レンズユニット11401は、鏡筒11101との接続部に設けられる光学系である。鏡筒11101の先端から取り込まれた観察光は、カメラヘッド11102まで導光され、当該レンズユニット11401に入射する。レンズユニット11401は、ズームレンズ及びフォーカスレンズを含む複数のレンズが組み合わされて構成される。 The lens unit 11401 is an optical system provided at a connection portion with the lens barrel 11101. The observation light taken in from the tip of the lens barrel 11101 is guided to the camera head 11102 and incident on the lens unit 11401. The lens unit 11401 is configured by combining a plurality of lenses including a zoom lens and a focus lens.
 撮像部11402を構成する撮像素子は、1つ(いわゆる単板式)であってもよいし、複数(いわゆる多板式)であってもよい。撮像部11402が多板式で構成される場合には、例えば各撮像素子によってRGBそれぞれに対応する画像信号が生成され、それらが合成されることによりカラー画像が得られてもよい。あるいは、撮像部11402は、3D(dimensional)表示に対応する右目用及び左目用の画像信号をそれぞれ取得するための1対の撮像素子を有するように構成されてもよい。3D表示が行われることにより、術者11131は術部における生体組織の奥行きをより正確に把握することが可能になる。
なお、撮像部11402が多板式で構成される場合には、各撮像素子に対応して、レンズユニット11401も複数系統設けられ得る。
The image sensor constituting the image pickup unit 11402 may be one (so-called single plate type) or a plurality (so-called multi-plate type). When the image pickup unit 11402 is composed of a multi-plate type, for example, each image pickup element may generate an image signal corresponding to each of RGB, and a color image may be obtained by synthesizing them. Alternatively, the image pickup unit 11402 may be configured to have a pair of image pickup elements for acquiring image signals for the right eye and the left eye corresponding to 3D (dimensional) display, respectively. The 3D display enables the operator 11131 to more accurately grasp the depth of the biological tissue in the surgical site.
When the image pickup unit 11402 is composed of a multi-plate type, a plurality of lens units 11401 may be provided corresponding to each image pickup element.
 また、撮像部11402は、必ずしもカメラヘッド11102に設けられなくてもよい。例えば、撮像部11402は、鏡筒11101の内部に、対物レンズの直後に設けられてもよい。 Further, the imaging unit 11402 does not necessarily have to be provided on the camera head 11102. For example, the imaging unit 11402 may be provided inside the lens barrel 11101 immediately after the objective lens.
 駆動部11403は、アクチュエータによって構成され、カメラヘッド制御部11405からの制御により、レンズユニット11401のズームレンズ及びフォーカスレンズを光軸に沿って所定の距離だけ移動させる。これにより、撮像部11402による撮像画像の倍率及び焦点が適宜調整され得る。 The drive unit 11403 is composed of an actuator, and the zoom lens and focus lens of the lens unit 11401 are moved by a predetermined distance along the optical axis under the control of the camera head control unit 11405. As a result, the magnification and focus of the image captured by the imaging unit 11402 can be adjusted as appropriate.
 通信部11404は、CCU11201との間で各種の情報を送受信するための通信装置によって構成される。通信部11404は、撮像部11402から得た画像信号をRAWデータとして伝送ケーブル11400を介してCCU11201に送信する。 The communication unit 11404 is composed of a communication device for transmitting and receiving various information to and from the CCU11201. The communication unit 11404 transmits the image signal obtained from the image pickup unit 11402 as RAW data to the CCU 11201 via the transmission cable 11400.
 また、通信部11404は、CCU11201から、カメラヘッド11102の駆動を制御するための制御信号を受信し、カメラヘッド制御部11405に供給する。当該制御信号には、例えば、撮像画像のフレームレートを指定する旨の情報、撮像時の露出値を指定する旨の情報、並びに/又は撮像画像の倍率及び焦点を指定する旨の情報等、撮像条件に関する情報が含まれる。 Further, the communication unit 11404 receives a control signal for controlling the driving of the camera head 11102 from the CCU 11201 and supplies it to the camera head control unit 11405. The control signal includes information on imaging conditions, such as information specifying the frame rate of the captured image, information specifying the exposure value at the time of imaging, and/or information specifying the magnification and focus of the captured image.
 なお、上記のフレームレートや露出値、倍率、焦点等の撮像条件は、ユーザによって適宜指定されてもよいし、取得された画像信号に基づいてCCU11201の制御部11413によって自動的に設定されてもよい。後者の場合には、いわゆるAE(Auto Exposure)機能、AF(Auto Focus)機能及びAWB(Auto White Balance)機能が内視鏡11100に搭載されていることになる。 The imaging conditions described above, such as the frame rate, exposure value, magnification, and focus, may be appropriately specified by the user, or may be automatically set by the control unit 11413 of the CCU 11201 based on the acquired image signal. In the latter case, so-called AE (Auto Exposure), AF (Auto Focus), and AWB (Auto White Balance) functions are mounted on the endoscope 11100.
 カメラヘッド制御部11405は、通信部11404を介して受信したCCU11201からの制御信号に基づいて、カメラヘッド11102の駆動を制御する。 The camera head control unit 11405 controls the drive of the camera head 11102 based on the control signal from the CCU 11201 received via the communication unit 11404.
 通信部11411は、カメラヘッド11102との間で各種の情報を送受信するための通信装置によって構成される。通信部11411は、カメラヘッド11102から、伝送ケーブル11400を介して送信される画像信号を受信する。 The communication unit 11411 is composed of a communication device for transmitting and receiving various information to and from the camera head 11102. The communication unit 11411 receives an image signal transmitted from the camera head 11102 via the transmission cable 11400.
 また、通信部11411は、カメラヘッド11102に対して、カメラヘッド11102の駆動を制御するための制御信号を送信する。画像信号や制御信号は、電気通信や光通信等によって送信することができる。 Further, the communication unit 11411 transmits a control signal for controlling the drive of the camera head 11102 to the camera head 11102. Image signals and control signals can be transmitted by telecommunications, optical communication, or the like.
 画像処理部11412は、カメラヘッド11102から送信されたRAWデータである画像信号に対して各種の画像処理を施す。 The image processing unit 11412 performs various image processing on the image signal which is the RAW data transmitted from the camera head 11102.
 制御部11413は、内視鏡11100による術部等の撮像、及び、術部等の撮像により得られる撮像画像の表示に関する各種の制御を行う。例えば、制御部11413は、カメラヘッド11102の駆動を制御するための制御信号を生成する。 The control unit 11413 performs various controls related to the imaging of the surgical site and the like by the endoscope 11100 and the display of the captured image obtained by the imaging of the surgical site and the like. For example, the control unit 11413 generates a control signal for controlling the drive of the camera head 11102.
 また、制御部11413は、画像処理部11412によって画像処理が施された画像信号に基づいて、術部等が映った撮像画像を表示装置11202に表示させる。この際、制御部11413は、各種の画像認識技術を用いて撮像画像内における各種の物体を認識してもよい。例えば、制御部11413は、撮像画像に含まれる物体のエッジの形状や色等を検出することにより、鉗子等の術具、特定の生体部位、出血、エネルギー処置具11112の使用時のミスト等を認識することができる。制御部11413は、表示装置11202に撮像画像を表示させる際に、その認識結果を用いて、各種の手術支援情報を当該術部の画像に重畳表示させてもよい。手術支援情報が重畳表示され、術者11131に提示されることにより、術者11131の負担を軽減することや、術者11131が確実に手術を進めることが可能になる。 Further, the control unit 11413 causes the display device 11202 to display a captured image showing the surgical site or the like, based on the image signal that has undergone image processing by the image processing unit 11412. At this time, the control unit 11413 may recognize various objects in the captured image by using various image recognition techniques. For example, by detecting the shapes of edges, colors, and the like of objects included in the captured image, the control unit 11413 can recognize surgical tools such as forceps, a specific body site, bleeding, mist during use of the energy treatment tool 11112, and the like. When displaying the captured image on the display device 11202, the control unit 11413 may use the recognition results to superimpose various kinds of surgical support information on the image of the surgical site. By superimposing the surgical support information and presenting it to the surgeon 11131, the burden on the surgeon 11131 can be reduced and the surgeon 11131 can proceed with the surgery reliably.
 カメラヘッド11102及びCCU11201を接続する伝送ケーブル11400は、電気信号の通信に対応した電気信号ケーブル、光通信に対応した光ファイバ、又はこれらの複合ケーブルである。 The transmission cable 11400 that connects the camera head 11102 and CCU11201 is an electric signal cable that supports electric signal communication, an optical fiber that supports optical communication, or a composite cable thereof.
 ここで、図示する例では、伝送ケーブル11400を用いて有線で通信が行われていたが、カメラヘッド11102とCCU11201との間の通信は無線で行われてもよい。 Here, in the illustrated example, the communication is performed by wire using the transmission cable 11400, but the communication between the camera head 11102 and the CCU11201 may be performed wirelessly.
 ＜移動体への応用例＞ <Example of application to a mobile body>
 本開示に係る技術(本技術)は、様々な製品へ応用することができる。例えば、本開示に係る技術は、自動車、電気自動車、ハイブリッド電気自動車、自動二輪車、自転車、パーソナルモビリティ、飛行機、ドローン、船舶、ロボット等のいずれかの種類の移動体に搭載される装置として実現されてもよい。 The technology according to the present disclosure (the present technology) can be applied to various products. For example, the technology according to the present disclosure may be realized as a device mounted on any type of mobile body such as an automobile, an electric vehicle, a hybrid electric vehicle, a motorcycle, a bicycle, a personal mobility device, an airplane, a drone, a ship, or a robot.
 図40は、本開示に係る技術が適用され得る移動体制御システムの一例である車両制御システムの概略的な構成例を示すブロック図である。 FIG. 40 is a block diagram showing a schematic configuration example of a vehicle control system, which is an example of a mobile control system to which the technology according to the present disclosure can be applied.
 車両制御システム12000は、通信ネットワーク12001を介して接続された複数の電子制御ユニットを備える。図40に示した例では、車両制御システム12000は、駆動系制御ユニット12010、ボディ系制御ユニット12020、車外情報検出ユニット12030、車内情報検出ユニット12040、及び統合制御ユニット12050を備える。また、統合制御ユニット12050の機能構成として、マイクロコンピュータ12051、音声画像出力部12052、及び車載ネットワークI/F(Interface)12053が図示されている。 The vehicle control system 12000 includes a plurality of electronic control units connected via the communication network 12001. In the example shown in FIG. 40, the vehicle control system 12000 includes a drive system control unit 12010, a body system control unit 12020, an outside information detection unit 12030, an in-vehicle information detection unit 12040, and an integrated control unit 12050. Further, as a functional configuration of the integrated control unit 12050, a microcomputer 12051, an audio image output unit 12052, and an in-vehicle network I / F (Interface) 12053 are shown.
 駆動系制御ユニット12010は、各種プログラムにしたがって車両の駆動系に関連する装置の動作を制御する。例えば、駆動系制御ユニット12010は、内燃機関又は駆動用モータ等の車両の駆動力を発生させるための駆動力発生装置、駆動力を車輪に伝達するための駆動力伝達機構、車両の舵角を調節するステアリング機構、及び、車両の制動力を発生させる制動装置等の制御装置として機能する。 The drive system control unit 12010 controls the operation of devices related to the drive system of the vehicle according to various programs. For example, the drive system control unit 12010 functions as a control device for a driving force generating device for generating the driving force of the vehicle, such as an internal combustion engine or a driving motor, a driving force transmission mechanism for transmitting the driving force to the wheels, a steering mechanism for adjusting the steering angle of the vehicle, a braking device for generating the braking force of the vehicle, and the like.
 ボディ系制御ユニット12020は、各種プログラムにしたがって車体に装備された各種装置の動作を制御する。例えば、ボディ系制御ユニット12020は、キーレスエントリシステム、スマートキーシステム、パワーウィンドウ装置、あるいは、ヘッドランプ、バックランプ、ブレーキランプ、ウィンカー又はフォグランプ等の各種ランプの制御装置として機能する。この場合、ボディ系制御ユニット12020には、鍵を代替する携帯機から発信される電波又は各種スイッチの信号が入力され得る。ボディ系制御ユニット12020は、これらの電波又は信号の入力を受け付け、車両のドアロック装置、パワーウィンドウ装置、ランプ等を制御する。 The body system control unit 12020 controls the operation of various devices mounted on the vehicle body according to various programs. For example, the body system control unit 12020 functions as a keyless entry system, a smart key system, a power window device, or a control device for various lamps such as headlamps, back lamps, brake lamps, blinkers or fog lamps. In this case, the body system control unit 12020 may be input with radio waves transmitted from a portable device that substitutes for the key or signals of various switches. The body system control unit 12020 receives inputs of these radio waves or signals and controls a vehicle door lock device, a power window device, a lamp, and the like.
 車外情報検出ユニット12030は、車両制御システム12000を搭載した車両の外部の情報を検出する。例えば、車外情報検出ユニット12030には、撮像部12031が接続される。車外情報検出ユニット12030は、撮像部12031に車外の画像を撮像させるとともに、撮像された画像を受信する。車外情報検出ユニット12030は、受信した画像に基づいて、人、車、障害物、標識又は路面上の文字等の物体検出処理又は距離検出処理を行ってもよい。 The vehicle exterior information detection unit 12030 detects information about the outside of the vehicle equipped with the vehicle control system 12000. For example, an imaging unit 12031 is connected to the vehicle exterior information detection unit 12030. The vehicle exterior information detection unit 12030 causes the imaging unit 12031 to capture an image of the outside of the vehicle and receives the captured image. Based on the received image, the vehicle exterior information detection unit 12030 may perform detection processing for objects such as a person, a vehicle, an obstacle, a sign, or characters on a road surface, or distance detection processing.
 撮像部12031は、光を受光し、その光の受光量に応じた電気信号を出力する光センサである。撮像部12031は、電気信号を画像として出力することもできるし、測距の情報として出力することもできる。また、撮像部12031が受光する光は、可視光であっても良いし、赤外線等の非可視光であっても良い。 The imaging unit 12031 is an optical sensor that receives light and outputs an electric signal according to the amount of the light received. The image pickup unit 12031 can output an electric signal as an image or can output it as distance measurement information. Further, the light received by the imaging unit 12031 may be visible light or invisible light such as infrared light.
 車内情報検出ユニット12040は、車内の情報を検出する。車内情報検出ユニット12040には、例えば、運転者の状態を検出する運転者状態検出部12041が接続される。運転者状態検出部12041は、例えば運転者を撮像するカメラを含み、車内情報検出ユニット12040は、運転者状態検出部12041から入力される検出情報に基づいて、運転者の疲労度合い又は集中度合いを算出してもよいし、運転者が居眠りをしていないかを判別してもよい。 The in-vehicle information detection unit 12040 detects information about the inside of the vehicle. For example, a driver state detection unit 12041 that detects the state of the driver is connected to the in-vehicle information detection unit 12040. The driver state detection unit 12041 includes, for example, a camera that images the driver, and based on the detection information input from the driver state detection unit 12041, the in-vehicle information detection unit 12040 may calculate the degree of fatigue or concentration of the driver, or may determine whether the driver is dozing off.
 マイクロコンピュータ12051は、車外情報検出ユニット12030又は車内情報検出ユニット12040で取得される車内外の情報に基づいて、駆動力発生装置、ステアリング機構又は制動装置の制御目標値を演算し、駆動系制御ユニット12010に対して制御指令を出力することができる。例えば、マイクロコンピュータ12051は、車両の衝突回避あるいは衝撃緩和、車間距離に基づく追従走行、車速維持走行、車両の衝突警告、又は車両のレーン逸脱警告等を含むADAS(Advanced Driver Assistance System)の機能実現を目的とした協調制御を行うことができる。 The microcomputer 12051 can calculate control target values for the driving force generating device, the steering mechanism, or the braking device based on the information about the inside and outside of the vehicle acquired by the vehicle exterior information detection unit 12030 or the in-vehicle information detection unit 12040, and can output control commands to the drive system control unit 12010. For example, the microcomputer 12051 can perform cooperative control for the purpose of realizing ADAS (Advanced Driver Assistance System) functions including collision avoidance or impact mitigation for the vehicle, following driving based on the inter-vehicle distance, vehicle speed maintenance driving, vehicle collision warning, vehicle lane departure warning, and the like.
 また、マイクロコンピュータ12051は、車外情報検出ユニット12030又は車内情報検出ユニット12040で取得される車両の周囲の情報に基づいて駆動力発生装置、ステアリング機構又は制動装置等を制御することにより、運転者の操作に拠らずに自律的に走行する自動運転等を目的とした協調制御を行うことができる。 Further, the microcomputer 12051 can perform cooperative control for the purpose of automated driving or the like, in which the vehicle travels autonomously without depending on the driver's operation, by controlling the driving force generating device, the steering mechanism, the braking device, and the like based on the information about the surroundings of the vehicle acquired by the vehicle exterior information detection unit 12030 or the in-vehicle information detection unit 12040.
 また、マイクロコンピュータ12051は、車外情報検出ユニット12030で取得される車外の情報に基づいて、ボディ系制御ユニット12020に対して制御指令を出力することができる。例えば、マイクロコンピュータ12051は、車外情報検出ユニット12030で検知した先行車又は対向車の位置に応じてヘッドランプを制御し、ハイビームをロービームに切り替える等の防眩を図ることを目的とした協調制御を行うことができる。 Further, the microcomputer 12051 can output a control command to the body system control unit 12020 based on the information about the outside of the vehicle acquired by the vehicle exterior information detection unit 12030. For example, the microcomputer 12051 can perform cooperative control for antiglare purposes, such as controlling the headlamps according to the position of a preceding vehicle or an oncoming vehicle detected by the vehicle exterior information detection unit 12030 and switching from high beam to low beam.
 音声画像出力部12052は、車両の搭乗者又は車外に対して、視覚的又は聴覚的に情報を通知することが可能な出力装置へ音声及び画像のうちの少なくとも一方の出力信号を送信する。図40の例では、出力装置として、オーディオスピーカ12061、表示部12062及びインストルメントパネル12063が例示されている。表示部12062は、例えば、オンボードディスプレイ及びヘッドアップディスプレイの少なくとも一つを含んでいてもよい。 The audio image output unit 12052 transmits an output signal of at least one of audio and an image to an output device capable of visually or audibly notifying information to the passenger or the outside of the vehicle. In the example of FIG. 40, an audio speaker 12061, a display unit 12062, and an instrument panel 12063 are exemplified as output devices. The display unit 12062 may include, for example, at least one of an onboard display and a heads-up display.
 図41は、撮像部12031の設置位置の例を示す図である。 FIG. 41 is a diagram showing an example of the installation position of the imaging unit 12031.
 図41では、撮像部12031として、撮像部12101、12102、12103、12104、12105を有する。 In FIG. 41, the imaging unit 12031 includes imaging units 12101, 12102, 12103, 12104, and 12105.
 撮像部12101、12102、12103、12104、12105は、例えば、車両12100のフロントノーズ、サイドミラー、リアバンパ、バックドア及び車室内のフロントガラスの上部等の位置に設けられる。フロントノーズに備えられる撮像部12101及び車室内のフロントガラスの上部に備えられる撮像部12105は、主として車両12100の前方の画像を取得する。サイドミラーに備えられる撮像部12102、12103は、主として車両12100の側方の画像を取得する。リアバンパ又はバックドアに備えられる撮像部12104は、主として車両12100の後方の画像を取得する。車室内のフロントガラスの上部に備えられる撮像部12105は、主として先行車両又は、歩行者、障害物、信号機、交通標識又は車線等の検出に用いられる。 The imaging units 12101, 12102, 12103, 12104, 12105 are provided at positions such as, for example, the front nose, side mirrors, rear bumpers, back doors, and the upper part of the windshield in the vehicle interior of the vehicle 12100. The imaging unit 12101 provided on the front nose and the imaging unit 12105 provided on the upper part of the windshield in the vehicle interior mainly acquire an image in front of the vehicle 12100. The imaging units 12102 and 12103 provided in the side mirrors mainly acquire images of the side of the vehicle 12100. The imaging unit 12104 provided on the rear bumper or the back door mainly acquires an image of the rear of the vehicle 12100. The imaging unit 12105 provided on the upper part of the windshield in the vehicle interior is mainly used for detecting a preceding vehicle, a pedestrian, an obstacle, a traffic light, a traffic sign, a lane, or the like.
 Note that FIG. 41 shows an example of the imaging ranges of the imaging units 12101 to 12104. The imaging range 12111 indicates the imaging range of the imaging unit 12101 provided at the front nose, the imaging ranges 12112 and 12113 indicate the imaging ranges of the imaging units 12102 and 12103 provided on the side mirrors, respectively, and the imaging range 12114 indicates the imaging range of the imaging unit 12104 provided on the rear bumper or the back door. For example, by superimposing the image data captured by the imaging units 12101 to 12104, a bird's-eye view image of the vehicle 12100 as viewed from above is obtained.
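The overhead composition described above (merging the four camera views into one bird's-eye image) typically rests on a planar homography per camera. The following minimal sketch shows only that projection step; the identity matrix stands in for a real calibrated camera-to-ground homography, which is a hypothetical simplification:

```python
import numpy as np

def warp_to_ground(points_px, H):
    """Apply a 3x3 homography H (camera image -> ground plane) to pixel
    coordinates, the core step when camera views are merged into one
    overhead image. H = identity here is a hypothetical placeholder."""
    pts = np.hstack([points_px, np.ones((len(points_px), 1))])  # homogeneous coords
    mapped = pts @ H.T
    return mapped[:, :2] / mapped[:, 2:3]  # back to Cartesian

H = np.eye(3)  # a calibrated homography would be estimated per camera
print(warp_to_ground(np.array([[100.0, 200.0]]), H))  # [[100. 200.]]
```

In a real system, one such warp per camera is followed by blending the four warped images into a single top-down view.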
 At least one of the imaging units 12101 to 12104 may have a function of acquiring distance information. For example, at least one of the imaging units 12101 to 12104 may be a stereo camera composed of a plurality of imaging elements, or may be an imaging element having pixels for phase-difference detection.
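Where a stereo camera supplies the distance information mentioned above, depth follows from the standard pinhole-stereo relation Z = f·B/d. The focal length, baseline, and disparity below are hypothetical values chosen for illustration only:

```python
def stereo_depth(focal_px, baseline_m, disparity_px):
    """Pinhole stereo model: Z = f * B / d.

    focal_px     -- focal length in pixels (hypothetical value)
    baseline_m   -- distance between the two cameras in meters
    disparity_px -- horizontal pixel shift of the same point between views
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_px * baseline_m / disparity_px

# A point with 20 px disparity seen by an f = 800 px, 30 cm baseline rig:
print(stereo_depth(800, 0.30, 20))  # 12.0 (meters)
```

Phase-difference pixels yield an analogous shift signal per pixel pair instead of between two full sensors.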
 For example, on the basis of the distance information obtained from the imaging units 12101 to 12104, the microcomputer 12051 obtains the distance to each three-dimensional object within the imaging ranges 12111 to 12114 and the temporal change of this distance (the relative speed with respect to the vehicle 12100), and can thereby extract, as a preceding vehicle, the closest three-dimensional object on the traveling path of the vehicle 12100 that travels in substantially the same direction as the vehicle 12100 at a predetermined speed (for example, 0 km/h or more). Further, the microcomputer 12051 can set in advance an inter-vehicle distance to be secured behind the preceding vehicle, and can perform automatic brake control (including follow-up stop control), automatic acceleration control (including follow-up start control), and the like. In this way, cooperative control can be performed for the purpose of, for example, automated driving in which the vehicle travels autonomously without depending on the driver's operation.
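The selection rule just described (nearest object on the path, moving in roughly the same direction at 0 km/h or more) can be sketched as a simple filter-and-minimize step. The object field names and the heading tolerance are hypothetical, not taken from the patent:

```python
def pick_preceding_vehicle(objects, own_heading_deg,
                           heading_tol_deg=20.0, min_speed_kmh=0.0):
    """Select the nearest object that is on the driving path, moving in
    roughly the same direction, at or above a minimum speed. Mirrors the
    selection rule described for microcomputer 12051; field names and the
    tolerance are hypothetical."""
    candidates = [
        o for o in objects
        if o["on_path"]
        and abs(o["heading_deg"] - own_heading_deg) <= heading_tol_deg
        and o["speed_kmh"] >= min_speed_kmh
    ]
    return min(candidates, key=lambda o: o["distance_m"], default=None)

objects = [
    {"on_path": True,  "heading_deg": 2.0,   "speed_kmh": 40.0, "distance_m": 35.0},
    {"on_path": True,  "heading_deg": 180.0, "speed_kmh": 50.0, "distance_m": 20.0},  # oncoming
    {"on_path": False, "heading_deg": 1.0,   "speed_kmh": 45.0, "distance_m": 10.0},  # other lane
]
print(pick_preceding_vehicle(objects, own_heading_deg=0.0)["distance_m"])  # 35.0
```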
 For example, on the basis of the distance information obtained from the imaging units 12101 to 12104, the microcomputer 12051 can classify three-dimensional object data on three-dimensional objects into two-wheeled vehicles, ordinary vehicles, large vehicles, pedestrians, utility poles, and other three-dimensional objects, extract them, and use them for automatic avoidance of obstacles. For example, the microcomputer 12051 distinguishes obstacles around the vehicle 12100 into obstacles that the driver of the vehicle 12100 can see and obstacles that are difficult to see. The microcomputer 12051 then determines a collision risk indicating the degree of danger of collision with each obstacle, and when the collision risk is equal to or higher than a set value and there is a possibility of collision, it can provide driving assistance for collision avoidance by outputting an alarm to the driver via the audio speaker 12061 or the display unit 12062, or by performing forced deceleration or avoidance steering via the drive system control unit 12010.
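One common way to turn distance and relative speed into the graded collision-risk judgment described above is a time-to-collision (TTC) threshold. The patent does not specify the metric or the set values, so the thresholds below are purely illustrative:

```python
def collision_action(distance_m, closing_speed_mps, warn_ttc=3.0, brake_ttc=1.5):
    """Grade collision risk by time-to-collision (TTC = distance / closing
    speed). The warn/brake thresholds are hypothetical; the patent only
    states that assistance is triggered when risk exceeds a set value."""
    if closing_speed_mps <= 0:           # gap is opening: no risk
        return "none"
    ttc = distance_m / closing_speed_mps
    if ttc < brake_ttc:
        return "brake"                   # forced deceleration / avoidance steering
    if ttc < warn_ttc:
        return "warn"                    # alarm via speaker 12061 / display 12062
    return "none"

print(collision_action(30.0, 5.0))   # TTC 6.0 s -> none
print(collision_action(10.0, 5.0))   # TTC 2.0 s -> warn
print(collision_action(5.0, 5.0))    # TTC 1.0 s -> brake
```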
 At least one of the imaging units 12101 to 12104 may be an infrared camera that detects infrared rays. For example, the microcomputer 12051 can recognize a pedestrian by determining whether a pedestrian is present in the images captured by the imaging units 12101 to 12104. Such pedestrian recognition is performed, for example, by a procedure of extracting feature points in the images captured by the imaging units 12101 to 12104 as infrared cameras, and a procedure of performing pattern-matching processing on a series of feature points representing the contour of an object to determine whether the object is a pedestrian. When the microcomputer 12051 determines that a pedestrian is present in the images captured by the imaging units 12101 to 12104 and recognizes the pedestrian, the audio image output unit 12052 controls the display unit 12062 so that a rectangular contour line for emphasis is superimposed on the recognized pedestrian. The audio image output unit 12052 may also control the display unit 12062 so that an icon or the like indicating a pedestrian is displayed at a desired position.
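The two-step pedestrian procedure above (extract contour feature points, then pattern-match against a pedestrian template) can be mimicked in miniature. A real detector would use far richer features; every value and the scoring rule here are hypothetical:

```python
import math

def match_score(contour, template):
    """Naive pattern match: mean point-to-point distance between a detected
    contour and a pedestrian template, both given as equal-length lists of
    (x, y) feature points. Only the two-step structure (extract points,
    then match) mirrors the text; the metric itself is a toy example."""
    assert len(contour) == len(template)
    dists = [math.dist(p, q) for p, q in zip(contour, template)]
    return sum(dists) / len(dists)

def is_pedestrian(contour, template, threshold=5.0):
    """Accept the contour as a pedestrian if it lies close enough to the
    template (hypothetical threshold in pixels)."""
    return match_score(contour, template) <= threshold

template = [(0, 0), (0, 10), (3, 10), (3, 0)]    # idealized upright silhouette
detected = [(1, 0), (1, 10), (4, 10), (4, 0)]    # same shape, shifted by 1 px
print(is_pedestrian(detected, template))  # True
```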
 Note that, in the present specification, the term "system" refers to an entire apparatus composed of a plurality of devices.
 Note that the effects described in this specification are merely examples and are not limiting; other effects may also be obtained.
 Note that embodiments of the present technology are not limited to the embodiments described above, and various modifications can be made without departing from the gist of the present technology.
 The present technology can also have the following configurations.
(1)
 An image pickup element including:
 an on-chip lens that condenses incident light;
 a photoelectric conversion unit that performs photoelectric conversion of the incident light;
 a waveguide that guides the incident light to the photoelectric conversion unit through an opening having substantially the same size as the condensed size of the incident light;
 a reflective film that reflects the incident light transmitted through the photoelectric conversion unit; and
 an uneven region having a plurality of irregularities on a side of the photoelectric conversion unit on which the incident light is incident.
(2)
 The image pickup element according to (1), in which the reflective film has an inclined surface.
(3)
 The image pickup element according to (2), in which, where θ is the inclination angle of the inclined surface, D is the size of the opening, and T is the depth of the photoelectric conversion unit,
 θ≧(1/2)arctan(D/2T)
 is satisfied.
(4)
 The image pickup element according to any one of (1) to (3), further including a second uneven region having a plurality of irregularities on a side of the photoelectric conversion unit opposite to the side on which the incident light is incident.
(5)
 The image pickup element according to any one of (1) to (4), in which the waveguide has an inclination.
(6)
 The image pickup element according to any one of (1) to (5), further including, inside the photoelectric conversion unit, a reflecting portion that reflects light.
(7)
 The image pickup element according to (6), in which a plurality of the reflecting portions are provided.
(8)
 The image pickup element according to any one of (1) to (7), in which a recess is formed on the opening side of the photoelectric conversion unit, and the material inside the recess and the material outside the recess differ in refractive index.
(9)
 The image pickup element according to any one of (2) to (8), in which the inclined surface of the reflective film is higher on a side where a charge storage unit that stores charges converted by the photoelectric conversion unit is provided, and lower on a side where the charge storage unit is not provided.
(10)
 The image pickup element according to any one of (1) to (9), further including a reflecting portion that reflects light in the vicinity of a charge storage unit that stores charges converted by the photoelectric conversion unit.
(11)
 The image pickup element according to (1), in which the reflective film is provided between the photoelectric conversion unit and a charge storage unit that stores charges converted by the photoelectric conversion unit.
(12)
 The image pickup element according to any one of (1) to (11), in which the waveguide is a diffraction grating.
(13)
 An image pickup apparatus including:
 an image pickup element including
  an on-chip lens that condenses incident light,
  a photoelectric conversion unit that performs photoelectric conversion of the incident light,
  a waveguide that guides the incident light to the photoelectric conversion unit through an opening having substantially the same size as the condensed size of the incident light,
  a reflective film that reflects the incident light transmitted through the photoelectric conversion unit, and
  an uneven region having a plurality of irregularities on a side of the photoelectric conversion unit on which the incident light is incident; and
 a processing unit that processes a signal from the image pickup element.
(14)
 An image pickup element including:
 an on-chip lens that condenses incident light;
 a photoelectric conversion unit that performs photoelectric conversion of the incident light;
 an opening having substantially the same size as the condensed size of the incident light;
 a reflective film that reflects the incident light transmitted through the photoelectric conversion unit, the reflective film having an inclined surface; and
 an uneven region having a plurality of irregularities on a side of the photoelectric conversion unit on which the incident light is incident.
(15)
 The image pickup element according to (14), further including a second uneven region having a plurality of irregularities on a side of the photoelectric conversion unit opposite to the side on which the incident light is incident.
(16)
 The image pickup element according to (14) or (15), further including, inside the photoelectric conversion unit, a reflecting portion that reflects light.
(17)
 The image pickup element according to (16), in which a plurality of the reflecting portions are provided.
(18)
 The image pickup element according to any one of (14) to (16), in which a recess is formed on the opening side of the photoelectric conversion unit, and the material inside the recess and the material outside the recess differ in refractive index.
(19)
 An image pickup apparatus including:
 an image pickup element including
  an on-chip lens that condenses incident light,
  a photoelectric conversion unit that performs photoelectric conversion of the incident light,
  an opening having substantially the same size as the condensed size of the incident light,
  a reflective film that reflects the incident light transmitted through the photoelectric conversion unit, the reflective film having an inclined surface, and
  an uneven region having a plurality of irregularities on a side of the photoelectric conversion unit on which the incident light is incident; and
 a processing unit that processes a signal from the image pickup element.
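Configuration (3) above bounds the tilt angle of the reflective film by θ≧(1/2)arctan(D/2T). A quick numeric check of that inequality, using a hypothetical 1 μm opening and a 3 μm deep photoelectric conversion unit (values not taken from the patent):

```python
import math

def min_tilt_deg(opening_d_um, depth_t_um):
    """Minimum tilt angle of the reflective film from configuration (3):
    theta >= (1/2) * arctan(D / (2 * T)), returned in degrees.
    D and T only need to share the same length unit."""
    return 0.5 * math.degrees(math.atan(opening_d_um / (2.0 * depth_t_um)))

# Hypothetical pixel: 1.0 um opening, 3.0 um deep photoelectric conversion unit
print(round(min_tilt_deg(1.0, 3.0), 2))  # 4.73
```

Intuitively, a deeper photoelectric conversion unit (larger T) tolerates a shallower tilt before reflected light escapes back through the opening.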
1 image pickup element, 10 pixel array unit, 11 vertical drive unit, 12 column signal processing unit, 13 control unit, 14, 15, 16 signal line, 30 pixel, 31 semiconductor substrate, 32 semiconductor region, 33 wiring region, 34 front-side reflective film, 35 back-side reflective film, 36 protective film, 37 on-chip lens, 38 separation region, 41 insulating layer, 42 wiring layer, 51 substrate back-surface scattering portion, 52 opening, 71, 72 incident light, 91 absorbing film, 101 front-side reflective film, 109 substrate back-surface scattering portion, 111, 112, 113, 114, 115 reflecting portion, 131 substrate front-surface scattering portion, 151 substrate back-surface scattering portion, 152 light-refracting structure portion, 201 waveguide, 221 color filter layer, 301 waveguide, 311, 312, 313 reflecting portion, 321 substrate back-surface scattering portion, 322 substrate front-surface scattering portion, 351, 361 reflecting portion

Claims (19)

  1.  An image pickup element comprising:
      an on-chip lens that condenses incident light;
      a photoelectric conversion unit that performs photoelectric conversion of the incident light;
      a waveguide that guides the incident light to the photoelectric conversion unit through an opening having substantially the same size as the condensed size of the incident light;
      a reflective film that reflects the incident light transmitted through the photoelectric conversion unit; and
      an uneven region having a plurality of irregularities on a side of the photoelectric conversion unit on which the incident light is incident.
  2.  The image pickup element according to claim 1, wherein the reflective film has an inclined surface.
  3.  The image pickup element according to claim 2, wherein, where θ is the inclination angle of the inclined surface, D is the size of the opening, and T is the depth of the photoelectric conversion unit,
      θ≧(1/2)arctan(D/2T)
      is satisfied.
  4.  The image pickup element according to claim 1, further comprising a second uneven region having a plurality of irregularities on a side of the photoelectric conversion unit opposite to the side on which the incident light is incident.
  5.  The image pickup element according to claim 1, wherein the waveguide has an inclination.
  6.  The image pickup element according to claim 1, further comprising, inside the photoelectric conversion unit, a reflecting portion that reflects light.
  7.  The image pickup element according to claim 6, wherein a plurality of the reflecting portions are provided.
  8.  The image pickup element according to claim 1, wherein a recess is formed on the opening side of the photoelectric conversion unit, and the material inside the recess and the material outside the recess differ in refractive index.
  9.  The image pickup element according to claim 2, wherein the inclined surface of the reflective film is higher on a side where a charge storage unit that stores charges converted by the photoelectric conversion unit is provided, and lower on a side where the charge storage unit is not provided.
  10.  The image pickup element according to claim 1, further comprising a reflecting portion that reflects light in the vicinity of a charge storage unit that stores charges converted by the photoelectric conversion unit.
  11.  The image pickup element according to claim 1, wherein the reflective film is provided between the photoelectric conversion unit and a charge storage unit that stores charges converted by the photoelectric conversion unit.
  12.  The image pickup element according to claim 1, wherein the waveguide is a diffraction grating.
  13.  An image pickup apparatus comprising:
      an image pickup element including
        an on-chip lens that condenses incident light,
        a photoelectric conversion unit that performs photoelectric conversion of the incident light,
        a waveguide that guides the incident light to the photoelectric conversion unit through an opening having substantially the same size as the condensed size of the incident light,
        a reflective film that reflects the incident light transmitted through the photoelectric conversion unit, and
        an uneven region having a plurality of irregularities on a side of the photoelectric conversion unit on which the incident light is incident; and
      a processing unit that processes a signal from the image pickup element.
  14.  An image pickup element comprising:
      an on-chip lens that condenses incident light;
      a photoelectric conversion unit that performs photoelectric conversion of the incident light;
      an opening having substantially the same size as the condensed size of the incident light;
      a reflective film that reflects the incident light transmitted through the photoelectric conversion unit, the reflective film having an inclined surface; and
      an uneven region having a plurality of irregularities on a side of the photoelectric conversion unit on which the incident light is incident.
  15.  The image pickup element according to claim 14, further comprising a second uneven region having a plurality of irregularities on a side of the photoelectric conversion unit opposite to the side on which the incident light is incident.
  16.  The image pickup element according to claim 14, further comprising, inside the photoelectric conversion unit, a reflecting portion that reflects light.
  17.  The image pickup element according to claim 16, wherein a plurality of the reflecting portions are provided.
  18.  The image pickup element according to claim 14, wherein a recess is formed on the opening side of the photoelectric conversion unit, and the material inside the recess and the material outside the recess differ in refractive index.
  19.  An image pickup apparatus comprising:
      an image pickup element including
        an on-chip lens that condenses incident light,
        a photoelectric conversion unit that performs photoelectric conversion of the incident light,
        an opening having substantially the same size as the condensed size of the incident light,
        a reflective film that reflects the incident light transmitted through the photoelectric conversion unit, the reflective film having an inclined surface, and
        an uneven region having a plurality of irregularities on a side of the photoelectric conversion unit on which the incident light is incident; and
      a processing unit that processes a signal from the image pickup element.
PCT/JP2020/043422 2019-12-06 2020-11-20 Imaging element and imaging device WO2021111904A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2019220859A JP2021090022A (en) 2019-12-06 2019-12-06 Image pickup device and imaging apparatus
JP2019-220859 2019-12-06

Publications (1)

Publication Number Publication Date
WO2021111904A1 true WO2021111904A1 (en) 2021-06-10

Family

ID=76220541

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2020/043422 WO2021111904A1 (en) 2019-12-06 2020-11-20 Imaging element and imaging device

Country Status (2)

Country Link
JP (1) JP2021090022A (en)
WO (1) WO2021111904A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023095491A1 (en) * 2021-11-29 2023-06-01 ソニーセミコンダクタソリューションズ株式会社 Light-receiving element and electronic device
WO2024018934A1 (en) * 2022-07-22 2024-01-25 ソニーセミコンダクタソリューションズ株式会社 Imaging device

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023166884A1 (en) * 2022-03-03 2023-09-07 ソニーセミコンダクタソリューションズ株式会社 Solid-state imaging device and method for manufacturing same
WO2023248388A1 (en) * 2022-06-22 2023-12-28 ソニーセミコンダクタソリューションズ株式会社 Light detection device and electronic apparatus

Citations (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2013505587A (en) * 2009-09-17 2013-02-14 サイオニクス, インコーポレイテッド Photosensitive imaging device and related method
JP2013069718A (en) * 2011-09-20 2013-04-18 Toshiba Corp Solid-state imaging device
JP2013098446A (en) * 2011-11-04 2013-05-20 Sony Corp Solid-state imaging element, method for manufacturing solid-state imaging element, and electronic device
JP2014027178A (en) * 2012-07-27 2014-02-06 Sharp Corp Solid state image sensor and electronic information equipment
JP2016001633A (en) * 2014-06-11 2016-01-07 ソニー株式会社 Solid state image sensor and electronic equipment
JP2016082133A (en) * 2014-10-20 2016-05-16 ソニー株式会社 Solid-state imaging device and electronic apparatus
JP2017108062A (en) * 2015-12-11 2017-06-15 ソニー株式会社 Solid state imaging device, imaging apparatus, and method of manufacturing solid state imaging device
US20180006072A1 (en) * 2016-06-30 2018-01-04 Stmicroelectronics (Crolles 2) Sas Backside illuminated photosensor element with light pipe and light mirror structures
WO2018079296A1 (en) * 2016-10-27 2018-05-03 ソニーセミコンダクタソリューションズ株式会社 Imaging element and electronic device
US20180151759A1 (en) * 2016-11-29 2018-05-31 Taiwan Semiconductor Manufacturing Co., Ltd. Qe approach by double-side, multi absorption structure
US20180182806A1 (en) * 2016-12-28 2018-06-28 Samsung Electronics Co., Ltd. Light sensor
JP2018186211A (en) * 2017-04-27 2018-11-22 ルネサスエレクトロニクス株式会社 Semiconductor device and manufacturing method thereof
US20190067357A1 (en) * 2017-08-30 2019-02-28 Taiwan Semiconductor Manufacturing Co., Ltd. Increased optical path for long wavelength light by grating structure
WO2019069556A1 (en) * 2017-10-03 2019-04-11 ソニーセミコンダクタソリューションズ株式会社 Solid state imaging element, solid state imaging element manufacturing method, and electronic apparatus
JP2019114642A (en) * 2017-12-22 2019-07-11 キヤノン株式会社 Solid state image sensor, electronic equipment and transport equipment
JP2019169668A (en) * 2018-03-26 2019-10-03 ソニーセミコンダクタソリューションズ株式会社 Imaging apparatus and manufacturing method therefor

Also Published As

Publication number Publication date
JP2021090022A (en) 2021-06-10

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20895404

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20895404

Country of ref document: EP

Kind code of ref document: A1