WO2021210060A1 - Élément d'imagerie à semi-conducteurs, dispositif d'imagerie, dispositif d'endoscope et dispositif de microscope pour opération - Google Patents


Info

Publication number
WO2021210060A1
WO2021210060A1 (PCT/JP2020/016407)
Authority
WO
WIPO (PCT)
Prior art keywords
light receiving
receiving element
light
signal
imaging lens
Prior art date
Application number
PCT/JP2020/016407
Other languages
English (en)
Japanese (ja)
Inventor
理 足立
Original Assignee
オリンパス株式会社
Priority date
Filing date
Publication date
Application filed by オリンパス株式会社 filed Critical オリンパス株式会社
Priority to PCT/JP2020/016407 priority Critical patent/WO2021210060A1/fr
Publication of WO2021210060A1 publication Critical patent/WO2021210060A1/fr

Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B7/00Mountings, adjusting means, or light-tight connections, for optical elements
    • G02B7/28Systems for automatic generation of focusing signals
    • G02B7/34Systems for automatic generation of focusing signals using different areas in a pupil plane
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B13/00Viewfinders; Focusing aids for cameras; Means for focusing for cameras; Autofocus systems for cameras
    • G03B13/32Means for focusing
    • G03B13/34Power focusing
    • G03B13/36Autofocus systems
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B15/00Special procedures for taking photographs; Apparatus therefor

Definitions

  • the present invention relates to a solid-state image sensor, an image pickup device, an endoscopic device, and a surgical microscope system.
  • AF accuracy is improved by executing AF control using image plane phase difference pixels or the like for detecting parallax on the imaging surface. Distance information can be obtained by using this function. Therefore, it is expected that this function will be applied to the measurement of uneven shapes using the image plane phase difference pixels.
  • FIGS. 13 and 14 show the principle of detecting parallax.
  • the light generated from the subject SB10 passes through the imaging lens 500 and the microlens 510 provided on the pixel array of the image pickup device.
  • the image of the pupil of the imaging lens 500 is projected onto the light receiving element (PD) of the pixel by the microlens 510.
  • Light generated from the same position on the subject SB10 is incident on the left region or the right region of the light receiving element depending on the position passing through the pupil of the imaging lens 500. If the light incident on the two regions can be separated, it is possible to reproduce two images with parallax as seen from the right half and the left half of the pupil. Therefore, by using one imaging lens 500, the same situation as when stereoscopic viewing is performed with two lenses can be realized. Parallax can be detected based on the signals in the two regions.
  • FIG. 13 shows a case where the F value of the imaging lens 500 is small. In this case, there are many components of light that have passed through the pupil position of the imaging lens 500 away from the center of the imaging lens 500.
  • FIG. 14 shows a case where the F value of the imaging lens 500 is large. In this case, there are many components of light that have passed through the pupil position near the center of the imaging lens 500.
  • when the F value of the imaging lens 500 is small as shown in FIG. 13, the situation is more suitable for parallax detection than when the F value is large as shown in FIG. 14.
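The parallax detection described above amounts to aligning the two sub-images seen through the left and right halves of the pupil. A minimal 1-D block-matching search can illustrate the idea (an illustrative sketch; the function name and signal values are assumptions, not from the patent):

```python
def find_disparity(left, right, max_shift):
    """Return the shift (in pixels) that best aligns the two
    pupil-separated signals, using a sum-of-squared-differences search."""
    best_shift, best_cost = 0, float("inf")
    n = len(left)
    for s in range(-max_shift, max_shift + 1):
        cost, count = 0.0, 0
        for i in range(n):
            j = i + s
            if 0 <= j < n:  # only compare the overlapping samples
                cost += (left[i] - right[j]) ** 2
                count += 1
        cost /= count  # normalize by the overlap size
        if cost < best_cost:
            best_cost, best_shift = cost, s
    return best_shift

# A bright edge seen through the left and right halves of the pupil;
# the right image is displaced by two pixels:
left = [0, 0, 1, 5, 9, 5, 1, 0, 0, 0]
right = [0, 0, 0, 0, 1, 5, 9, 5, 1, 0]
print(find_disparity(left, right, 4))  # → 2
```

A larger F value concentrates the light near the pupil center, shrinking the effective displacement between the two signals and making this search less reliable, which is the trade-off the patent addresses.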
  • in the first method, a pair of two pixels is used. The right half of one of the two pixels is shaded, and the left half of the other pixel is shaded.
  • in the second method, a microlens straddling two pixels is formed.
  • the F value of the imaging lens is designed to be as large as possible within a range in which a sense of resolution can be obtained.
  • when the F value is large, there are many components of light that have passed through a position close to the center of the imaging lens, and the accuracy of the distance information detected based on the image plane phase difference pixels is lowered. Therefore, it is difficult to achieve both a deep depth of field and highly accurate distance measurement.
  • in both the first method and the second method for realizing the pixels for detecting the image plane phase difference, when an imaging lens having a large F value is used, the accuracy of the distance information is lowered in the same manner as described above.
  • when the F value is small, the depth of field becomes shallow and it is difficult to perform imaging using pan focus.
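The trade-off between F value and depth of field can be made concrete with the standard thin-lens depth-of-field approximation (this formula and all numbers are illustrative assumptions, not taken from the patent):

```python
def depth_of_field(f_mm, N, c_mm, s_mm):
    """Total depth of field for focal length f, F-number N, circle of
    confusion c, and subject distance s (thin-lens approximation)."""
    H = f_mm * f_mm / (N * c_mm) + f_mm  # hyperfocal distance
    near = s_mm * (H - f_mm) / (H + s_mm - 2 * f_mm)
    far = s_mm * (H - f_mm) / (H - s_mm) if s_mm < H else float("inf")
    return far - near

# Illustrative endoscope-like numbers: 4 mm lens, 0.005 mm circle of
# confusion, subject at 50 mm. A larger F-number gives a deeper field.
print(round(depth_of_field(4.0, 2.0, 0.005, 50.0), 1))  # F/2
print(round(depth_of_field(4.0, 8.0, 0.005, 50.0), 1))  # F/8, several times deeper
```

The sketch shows why a designer would prefer a large F value for pan-focus imaging, even though, as noted above, that same choice degrades phase-difference distance measurement.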
  • the present invention aims to provide a solid-state imaging device, an imaging device, an endoscopic device, and a surgical microscope system capable of performing imaging with a deep depth of field and obtaining signal charges suitable for highly accurate distance measurement.
  • the solid-state image pickup device has a plurality of light receiving elements and a microlens.
  • the plurality of light receiving elements are arranged in a matrix.
  • the microlens is arranged on the plurality of light receiving elements with respect to the light receiving element group.
  • the light receiving element group includes one or more first light receiving elements included in the plurality of light receiving elements and two or more second light receiving elements included in the plurality of light receiving elements.
  • the first light receiving element receives light that has passed through an imaging lens that forms an optical image of a subject on the plurality of light receiving elements, and generates a first signal charge based on the received light.
  • the two or more second light receiving elements receive light that has passed through different pupil regions of the imaging lens, and generate a second signal charge based on the received light.
  • the first light receiving element receives light that has passed through the first region of the imaging lens including the center of the imaging lens.
  • the second light receiving element receives light that has passed through the second region of the imaging lens including a position outside the first region.
  • the solid-state image sensor may further include a light-shielding layer arranged between the microlens and the plurality of light-receiving elements.
  • a first aperture that allows light that has passed through the imaging lens and the microlens to enter the first light receiving element and a second aperture that allows light that has passed through the imaging lens and the microlens to enter the second light receiving element may be formed in the light-shielding layer. The second aperture may be larger than the first aperture.
  • the first light receiving element and the second light receiving element included in the light receiving element group may form an array in which the number of rows and the number of columns are each two or more.
  • the figure formed by the array may include a first diagonal line passing through the first light receiving element and a second diagonal line passing through the second light receiving element.
  • the width of the second opening in the direction parallel to the second diagonal may be larger than the width of the first opening in the direction parallel to the first diagonal.
  • the light receiving element group may have two or more of the first light receiving elements.
  • the solid-state imaging device may have a charge-voltage conversion element that mixes the first signal charges generated by the two or more first light receiving elements included in the light receiving element group and converts them into a voltage, and that converts the second signal charge generated by the second light receiving element included in the light receiving element group into a voltage.
  • the first light receiving element and the second light receiving element included in the light receiving element group may form an array in which one of the number of rows and the number of columns is 3 or more and the other is 1 or more.
  • the first light receiving element may be arranged in the central portion of the arrangement, and the second light receiving element may be arranged in the outer peripheral portion of the arrangement.
  • the image pickup apparatus includes the solid-state image pickup device and the image pickup lens.
  • the image pickup apparatus may further include a readout circuit that reads a first signal generated based on the first signal charge from the first light receiving element and reads a second signal generated based on the second signal charge from the second light receiving element.
  • the first light receiving element and the second light receiving element included in the light receiving element group may form an array in which the number of rows and the number of columns are each two or more.
  • the readout circuit may read out the first signal and the second signal within a period allocated to each row of the array so that the rate at which the first signal and the second signal are output is leveled across the rows of the array.
  • the imaging device may further include a light source device capable of switching between a first state of generating visible light and a second state of generating infrared light.
  • the solid-state image sensor may further include an infrared transmissive layer that is arranged between the microlens and the plurality of light receiving elements and that contains a material transmitting the infrared light that has passed through the imaging lens and the microlens while blocking the visible light that has passed through the imaging lens and the microlens.
  • the light receiving element group may have two or more light receiving elements.
  • An aperture may be formed in the infrared transmissive layer to allow light that has passed through the imaging lens and the microlens to enter the two or more light receiving elements.
  • when the state of the light source device is the first state, the two or more light receiving elements function as the first light receiving element, receive the visible light that has passed through the imaging lens, the microlens, and the aperture, and generate the first signal charge based on the received visible light.
  • when the state of the light source device is the second state, the two or more light receiving elements function as the second light receiving element, receive the infrared light that has passed through the imaging lens, the microlens, and the infrared transmissive layer, and generate the second signal charge based on the received infrared light.
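The time-division scheme above, in which the same light receiving elements produce either the first or the second signal charge depending on the light source state, can be sketched as follows (the enum and strings are illustrative assumptions, not names from the patent):

```python
from enum import Enum

class LightSourceState(Enum):
    VISIBLE = "first state"    # light source generates visible light
    INFRARED = "second state"  # light source generates infrared light

def role_of_shared_elements(state: LightSourceState) -> str:
    """Return the role the shared light receiving elements play.

    Visible light reaches them only through the aperture in the infrared
    transmissive layer (first role); infrared light also passes through
    the layer itself (second role).
    """
    if state is LightSourceState.VISIBLE:
        return "first light receiving element (color image signal)"
    return "second light receiving element (phase difference signal)"

print(role_of_shared_elements(LightSourceState.VISIBLE))
print(role_of_shared_elements(LightSourceState.INFRARED))
```

The design choice here is that one physical pixel serves both purposes, with the infrared transmissive layer acting as a wavelength-selective aperture instead of a fixed opaque mask.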
  • the endoscope device includes a scope in which the solid-state image sensor is arranged at the tip, and a control unit.
  • the control unit receives the first signal generated based on the first signal charge and the second signal generated based on the second signal charge, generates a video signal of the subject based on the first signal, and calculates the parallax between two second signals corresponding to the second signal charges generated by two different second light receiving elements.
  • the surgical microscope system includes the solid-state image sensor and the control unit.
  • the control unit receives the first signal generated based on the first signal charge and the second signal generated based on the second signal charge, generates a video signal of the subject based on the first signal, and calculates the parallax between two second signals corresponding to the second signal charges generated by two different second light receiving elements.
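Once the control unit has the parallax, converting it to a subject distance follows the classic stereo triangulation relation Z = f·B/d. A minimal sketch (the function name and all numbers are illustrative assumptions, not from the patent):

```python
def distance_from_parallax(focal_mm: float, baseline_mm: float,
                           parallax_mm: float) -> float:
    """Classic stereo triangulation: subject distance Z = f * B / d,
    where B is the effective baseline between the two pupil regions
    and d is the parallax measured on the sensor."""
    return focal_mm * baseline_mm / parallax_mm

# Illustrative values: 4 mm focal length, 1 mm effective pupil baseline,
# 0.02 mm measured parallax → subject at 200 mm.
print(distance_from_parallax(4.0, 1.0, 0.02))  # → 200.0
```

With a single imaging lens, the baseline B corresponds to the separation between the centroids of the two pupil regions, which is why the larger second region (far from the lens center) yields more accurate distances.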
  • the solid-state imaging device, imaging device, endoscopic device, and surgical microscope system can perform imaging with a deep depth of field and obtain signal charges suitable for highly accurate distance measurement.
  • FIG. 1 is a plan view of the image pickup device 1 (solid-state image pickup device) according to the first embodiment of the present invention.
  • the image pickup device 1 shown in FIG. 1 has a plurality of microlenses 10, a light-shielding layer 20, a plurality of light-receiving elements, and a plurality of charge-voltage conversion elements 40.
  • FIG. 1 shows a state in which the semiconductor substrate constituting the image pickup device 1 is viewed in a plane in a direction perpendicular to the main surface of the semiconductor substrate.
  • the image pickup device 1 has a structure in which a plurality of layers are laminated. The state of the plurality of layers is shown in FIG.
  • the light receiving element group PG1 includes a light receiving element 30r1, a light receiving element 30r2, a light receiving element 30r3, and a light receiving element 30r4.
  • the light receiving element group PG2 includes a light receiving element 30g1, a light receiving element 30g2, a light receiving element 30g3, and a light receiving element 30g4.
  • the light receiving element group PG3 includes a light receiving element 30b1, a light receiving element 30b2, a light receiving element 30b3, and a light receiving element 30b4. Each light receiving element corresponds to a pixel.
  • Each light receiving element group is arranged at a position corresponding to one microlens 10.
  • the four light receiving elements included in each light receiving element group are photodiodes (optical sensors) and are made of a semiconductor material.
  • the four light receiving elements included in the light receiving element group PG1 receive light in the red wavelength band (red light) and generate a signal charge based on the received red light.
  • the four light receiving elements included in the light receiving element group PG2 receive light in the green wavelength band (green light) and generate a signal charge based on the received green light.
  • the four light receiving elements included in the light receiving element group PG3 receive light in the blue wavelength band (blue light) and generate a signal charge based on the received blue light.
  • 16 light receiving elements are arranged in a matrix.
  • the number of rows and columns of the array of 16 light receiving elements is 4.
  • Each light receiving element group has four light receiving elements.
  • the number of rows and columns of the arrangement of the four light receiving elements included in each light receiving element group is 2.
  • Each of the 16 light receiving elements shown in FIG. 1 is included in any one of a plurality of light receiving element groups.
  • the arrangement of the four light receiving elements included in each light receiving element group forms a substantially square shape.
  • the square has a diagonal line L1 passing through the two vertices and a diagonal line L2 passing through the other two vertices.
  • the charge-voltage conversion element 40 is arranged in each light receiving element group.
  • the charge-voltage conversion element 40 converts the signal charge generated by the light-receiving element into a voltage.
  • the four light receiving elements included in each light receiving element group share one charge-voltage conversion element 40.
  • the light-shielding layer 20 is arranged between the plurality of microlenses 10 and the plurality of light-receiving elements.
  • the light-shielding layer 20 is a thin film made of metal.
  • the opening OP1 is formed at a position that overlaps with the microlens 10 and overlaps with each light receiving element group.
  • the four openings OP1 are shown in FIG. 1.
  • the shapes of the four openings OP1 are the same, and the sizes of the four openings OP1 are the same.
  • the width W2 of the opening OP1 in the direction parallel to the diagonal line L2 is larger than the width W1 of the opening OP1 in the direction parallel to the diagonal line L1.
  • the light-shielding layer 20 blocks a part of the light that has passed through the microlens 10. A part of the light that has passed through the microlens 10 passes through the aperture OP1 and is incident on the light receiving element.
  • the plurality of microlenses 10 are arranged between the imaging lens 50 shown in FIG. 2 and the plurality of light receiving elements.
  • the microlens 10 irradiates the light receiving element with a part of the light that has passed through the imaging lens 50.
  • Each light receiving element group has two first light receiving elements and two second light receiving elements.
  • the two first light receiving elements are arranged along the diagonal line L1.
  • the two second light receiving elements are arranged along the diagonal line L2.
  • the two first light receiving elements in the light receiving element group PG2 are a light receiving element 30g1 and a light receiving element 30g3.
  • the two second light receiving elements in the light receiving element group PG2 are a light receiving element 30g2 and a light receiving element 30g4.
  • the image sensor 1 needs to include one or more light receiving element groups.
  • Each light receiving element group needs to have one or more first light receiving elements and two or more second light receiving elements.
  • the number of light receiving element groups and the number of light receiving elements included in the light receiving element group are not limited to the example shown in FIG.
  • the opening OP1 is virtually divided into an opening OP1a, an opening OP1b, an opening OP1c, and an opening OP1d.
  • the opening OP1a and the opening OP1b are arranged along the diagonal line L1.
  • the opening OP1c and the opening OP1d are arranged along the diagonal line L2.
  • the light that has passed through the opening OP1a and the opening OP1b is incident on the first light receiving element.
  • the light that has passed through the openings OP1a and OP1b in the light receiving element group PG2 is incident on the light receiving element 30g1 and the light receiving element 30g3, respectively.
  • the region of the imaging lens 50 through which the light incident on the light receiving element 30g1 passes and the region of the imaging lens 50 through which the light incident on the light receiving element 30g3 passes are different from each other.
  • the light that has passed through the opening OP1c and the opening OP1d is incident on the second light receiving element.
  • the light that has passed through the opening OP1c and the opening OP1d in the light receiving element group PG2 is incident on the light receiving element 30g2 and the light receiving element 30g4, respectively.
  • the region of the imaging lens 50 through which the light incident on the light receiving element 30g2 passes and the region of the imaging lens 50 through which the light incident on the light receiving element 30g4 passes are different from each other.
  • the first opening and the second opening are formed in the light shielding layer 20.
  • the first aperture is the aperture OP1a and the aperture OP1b, and the light that has passed through the imaging lens 50 and the microlens 10 is incident on the first light receiving element.
  • the second aperture is the aperture OP1c and the aperture OP1d, and the light that has passed through the imaging lens 50 and the microlens 10 is incident on the second light receiving element. Since the width W2 of the opening OP1 is larger than the width W1 of the opening OP1, the opening OP1c and the opening OP1d are larger than the opening OP1a and the opening OP1b.
  • the first light receiving element and the second light receiving element included in each light receiving element group form an array in which the number of rows and the number of columns are each two or more.
  • the figure formed by the array has the diagonal line L1 (first diagonal line) passing through the first light receiving element and the diagonal line L2 (second diagonal line) passing through the second light receiving element.
  • the width of the opening OP1c and the opening OP1d (half the width W2) in the direction parallel to the diagonal line L2 is larger than the width of the opening OP1a and the opening OP1b (half the width W1) in the direction parallel to the diagonal line L1.
  • the light passes through the first region of the imaging lens 50 and the microlens 10, and passes through the aperture OP1a and the aperture OP1b. Further, the light passes through the second region of the imaging lens 50 and the microlens 10, and passes through the aperture OP1c and the aperture OP1d.
  • the size (area) of the opening OP1a and the opening OP1b and the size (area) of the opening OP1c and the opening OP1d are different from each other. Therefore, the size of the first region of the imaging lens 50 (pupil diameter) and the size of the second region of the imaging lens 50 (pupil diameter) are different from each other.
  • the second region of the imaging lens 50 is larger than the first region of the imaging lens 50.
  • the first light receiving element receives light that has passed through the openings OP1a and OP1b
  • the second light receiving element receives light that has passed through the openings OP1c and OP1d.
  • Imaging using the first light receiving element is substantially the same as imaging using an imaging lens having a large F value.
  • the signal charge obtained by the first light receiving element is not suitable for high-precision distance measurement, but the depth of field is deep in the imaging using the first light receiving element.
  • the signal charge obtained by the first light receiving element is used to generate a color image.
  • Imaging using the second light receiving element is substantially the same as imaging using an imaging lens having a small F value.
  • the depth of field is shallow, but the signal charge obtained by the second light receiving element is suitable for highly accurate distance measurement.
  • the signal charge obtained by the second light receiving element is used to detect the phase difference.
  • FIG. 2 is a cross-sectional view of the image pickup device 1 at the position of the line AA' shown in FIG. 1.
  • FIG. 2 shows the configuration of the image pickup element 1 in the cross section passing through the light receiving element 30r1, the light receiving element 30r3, the light receiving element 30b1, and the light receiving element 30b3.
  • the imaging lens 50 is shown in FIG. 2.
  • the image pickup lens 50 and the image pickup device 1 are included in the image pickup apparatus.
  • Each light receiving element shown in FIG. 2 is a first light receiving element that generates a first signal charge for generating a color image.
  • An insulating layer IL1 is formed on the surface of each light receiving element.
  • a light-shielding layer 20 is formed inside the insulating layer IL1.
  • the opening OP1 is formed at a position where it partially overlaps with the plurality of light receiving elements.
  • the width of the opening OP1 in the cross section shown in FIG. 2 is W1.
  • a color filter CFr and a color filter CFb are formed on the insulating layer IL1.
  • the color filter CFr and the color filter CFb are arranged between the plurality of microlenses 10 and the light-shielding layer 20.
  • the color filter CFr is arranged at a position where it overlaps with the light receiving element 30r1, the light receiving element 30r2, the light receiving element 30r3, and the light receiving element 30r4.
  • the color filter CFb is arranged at a position where it overlaps with the light receiving element 30b1, the light receiving element 30b2, the light receiving element 30b3, and the light receiving element 30b4.
  • Light that has passed through the imaging lens 50 and the microlens 10 is incident on the color filter CFr and the color filter CFb. Red light passes through the color filter CFr, and blue light passes through the color filter CFb. A part of the light transmitted through the color filter CFr passes through the opening OP1 and is incident on the light receiving element 30r1, the light receiving element 30r2, the light receiving element 30r3, and the light receiving element 30r4. A part of the light transmitted through the color filter CFb passes through the opening OP1 and is incident on the light receiving element 30b1, the light receiving element 30b2, the light receiving element 30b3, and the light receiving element 30b4.
  • the insulating layer IL2 is formed on the color filter CFr and the color filter CFb.
  • a plurality of microlenses 10 are formed on the surface of the insulating layer IL2.
  • An imaging lens 50 is arranged optically in front of the plurality of microlenses 10.
  • the light receiving element 30r1, the light receiving element 30r2, the light receiving element 30r3, and the light receiving element 30r4 receive light that has passed through the region R1 inside the positions P1 and P2, which are separated from the center C1 of the imaging lens 50 by a distance D1 in the direction perpendicular to the optical axis of the imaging lens 50.
  • Each light receiving element (first light receiving element) shown in FIG. 2 receives light that has passed through a position near the center C1 of the imaging lens 50. Therefore, the first light receiving element receives light that has passed through an imaging lens having a substantially large F value. As a result, imaging is performed with a deep depth of field.
  • the charge-voltage conversion element 40 mixes the first signal charges generated by the two or more first light-receiving elements included in the light-receiving element group and converts them into a voltage. As a result, the time for reading the signal is shortened, and a high S/N ratio is realized.
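The S/N benefit of mixing charges in the shared converter, rather than converting each photodiode separately and summing afterwards, can be illustrated with a simple read-noise model: summing in the charge domain incurs the conversion noise once instead of twice. All numbers below are illustrative assumptions, not values from the patent.

```python
import random
import statistics

random.seed(42)
SIGNAL = 100.0      # electrons per photodiode (illustrative)
READ_NOISE = 10.0   # electrons RMS added at each charge-to-voltage readout

def read(charge):
    """One charge-to-voltage conversion: adds read noise exactly once."""
    return charge + random.gauss(0.0, READ_NOISE)

trials = 20000
# Separate readout: each PD is converted on its own, then summed digitally,
# so the read noise is incurred twice per pair.
separate = [read(SIGNAL) + read(SIGNAL) for _ in range(trials)]
# Mixed readout: charges are summed in the shared converter, read once.
mixed = [read(2 * SIGNAL) for _ in range(trials)]

print(statistics.stdev(separate))  # ≈ READ_NOISE * sqrt(2)
print(statistics.stdev(mixed))     # ≈ READ_NOISE
```

The mixed readout also needs half as many conversion cycles per pair, which is the readout-time saving mentioned above.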
  • FIG. 3 is a cross-sectional view of the image pickup device 1 at the position of the line BB' shown in FIG. 1.
  • the configuration of the image pickup device 1 in the cross section passing through the two light receiving elements 30g2 and the two light receiving elements 30g4 is shown in FIG. Further, the imaging lens 50 is shown in FIG. The description of the same configuration as that shown in FIG. 2 will be omitted.
  • Each light receiving element shown in FIG. 3 is a second light receiving element that generates a second signal charge for detecting a phase difference.
  • the width of the opening OP1 in the cross section shown in FIG. 3 is W2.
  • a color filter CFg is formed on the insulating layer IL1.
  • the color filter CFg is arranged between the plurality of microlenses 10 and the light shielding layer 20.
  • the color filter CFg is arranged at a position where it overlaps with the light receiving element 30g1, the light receiving element 30g2, the light receiving element 30g3, and the light receiving element 30g4.
  • Light that has passed through the imaging lens 50 and the microlens 10 is incident on the color filter CFg. Green light passes through the color filter CFg. A part of the light transmitted through the color filter CFg passes through the opening OP1 and is incident on the light receiving element 30g1, the light receiving element 30g2, the light receiving element 30g3, and the light receiving element 30g4.
  • the light receiving element 30g1, the light receiving element 30g2, the light receiving element 30g3, and the light receiving element 30g4 receive light that has passed through the region R2 inside positions separated from the center C1 of the imaging lens 50 by a distance D2 in the direction perpendicular to the optical axis of the imaging lens 50.
  • the region R2 includes the region R1 shown in FIG. 2 and is larger than the region R1.
  • the region R2 includes a position outside the region R1.
  • Each light receiving element (second light receiving element) shown in FIG. 3 receives light that has passed through a position far from the center C1 of the imaging lens 50. Therefore, the second light receiving element receives light that has passed through the imaging lens having a substantially small F value. As a result, a signal charge suitable for highly accurate distance measurement can be obtained.
  • the charge-voltage conversion element 40 individually converts the second signal charge generated by the second light-receiving element included in the light-receiving element group into a voltage. As a result, saturation of the charge-voltage conversion element 40 is avoided, and a signal for performing highly accurate distance measurement can be obtained.
  • FIG. 4 shows a method of reading a signal from each light receiving element.
  • the image pickup apparatus has a readout circuit 60 shown in FIG.
  • the readout circuit 60 reads the first signal generated based on the first signal charge from the first light receiving element, and reads the second signal generated based on the second signal charge from the second light receiving element.
  • the reading circuit 60 reads the first signal from the light receiving element 30r1, the light receiving element 30r3, the light receiving element 30g1, the light receiving element 30g3, the light receiving element 30b1, and the light receiving element 30b3.
  • the reading circuit 60 reads the second signal from the light receiving element 30r2, the light receiving element 30r4, the light receiving element 30g2, the light receiving element 30g4, the light receiving element 30b2, and the light receiving element 30b4.
  • the readout circuit 60 reads out the first signal and the second signal within the period assigned to each row of the array of the plurality of light receiving elements so that the rate of outputting the first signal and the second signal is leveled across the rows of the array.
  • Character strings including numbers are shown on each light receiving element shown in FIG. Each character string indicates a signal generated based on the signal charge in each light receiving element.
  • the light receiving element corresponding to the odd number generates the first signal charge.
  • the charge-voltage conversion element 40 mixes the first signal charges generated by the two or more first light receiving elements included in the light receiving element group and converts them into the first signal.
  • the read circuit 60 reads the first signal. The first signal is used to generate a color image.
  • the light receiving element corresponding to the even number generates a second signal charge.
  • the charge-voltage conversion element 40 converts the second signal charge into a second signal.
  • the read circuit 60 reads out the second signal individually. The second signal is used to detect the phase difference.
  • the read circuit 60 reads the signal G4 of the light receiving element 30g4, the signal (G1 + G3) of the light receiving element 30g1 and the light receiving element 30g3, and the signal B4 of the light receiving element 30b4 in the first period assigned to the first line.
  • the reading circuit 60 reads out the signal G2 of the light receiving element 30g2, the signal (B1 + B3) of the light receiving element 30b1 and the light receiving element 30b3, and the signal B2 of the light receiving element 30b2 in the second period assigned to the second line.
  • the reading circuit 60 reads out the signal R4 of the light receiving element 30r4, the signal (R1 + R3) of the light receiving element 30r1 and the light receiving element 30r3, and the signal G4 of the light receiving element 30g4 in the third period assigned to the third line.
  • the reading circuit 60 reads out the signal R2 of the light receiving element 30r2, the signal of the light receiving element 30g1 and the light receiving element 30g3 (G1 + G3), and the signal G2 of the light receiving element 30g2 in the fourth period assigned to the fourth line.
  • the read circuit 60 reads out three signals in the period assigned to each line. One of the three signals corresponds to the signal charge generated by mixing the signal charges of the two first light receiving elements. Since the number of signals read out in each period is constant, the rate at which the signals are output is leveled.
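The four read periods above can be sketched as a small schedule table. The helper below only checks the leveling property described in the text: a constant three signals per period, exactly one of which is a mixed first signal. The dictionary layout and function names are illustrative, not an interface from the patent:

```python
# Sketch of the row-leveled readout schedule described above.
# Each row period reads exactly three signals: two individual
# second signals and one mixed first signal (marked with "+"),
# so the output rate per period is constant ("leveled").
SCHEDULE = {
    1: ["G4", "G1+G3", "B4"],
    2: ["G2", "B1+B3", "B2"],
    3: ["R4", "R1+R3", "G4"],
    4: ["R2", "G1+G3", "G2"],
}

def signals_per_period(schedule):
    """Return the number of signals read in each row period."""
    return {row: len(sigs) for row, sigs in schedule.items()}
```

Because every period carries the same number of reads, the output bandwidth needed by the readout circuit 60 stays constant from row to row.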
  • the image pickup element 1 has a plurality of light receiving elements (light receiving element 30r1, light receiving element 30g1, light receiving element 30b1, etc.) and a microlens 10.
  • the plurality of light receiving elements are arranged in a matrix.
  • the microlens 10 is arranged over the plurality of light receiving elements, one for each light receiving element group (light receiving element group PG1, light receiving element group PG2, and light receiving element group PG3).
  • the light receiving element group includes one or more first light receiving elements (light receiving element 30r1, light receiving element 30r3, etc.) included in the plurality of light receiving elements and two or more second light receiving elements (light receiving element 30r2, light receiving element 30r4, etc.) included in the plurality of light receiving elements.
  • the first light receiving element receives light that has passed through an imaging lens 50 that forms an optical image of a subject on a plurality of light receiving elements, and generates a first signal charge based on the received light.
  • the two or more second light receiving elements receive light that has passed through different regions of the imaging lens 50, and generate a second signal charge based on the received light.
  • the first light receiving element receives light that has passed through the first region (region R1) of the imaging lens 50 including the center C1 of the imaging lens 50.
  • the second light receiving element receives light that has passed through the second region (region R2) of the imaging lens 50 including the positions (positions P3 and P4) outside the first region.
  • the image sensor 1 can perform imaging at a deep depth of field and can obtain a signal charge suitable for highly accurate distance measurement.
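The depth-of-field benefit claimed above can be illustrated with a standard thin-lens approximation: a first light receiving element that samples only the central region of the imaging lens behaves as if the lens were stopped down to a larger F-number, which deepens the depth of field. A minimal sketch under that assumption (all function names and numbers are illustrative, not from the patent):

```python
import math

def effective_f_number(focal_length_mm, aperture_diameter_mm):
    """F-number of the (sub-)aperture actually sampled by an element."""
    return focal_length_mm / aperture_diameter_mm

def depth_of_field(f_number, coc_mm, focal_length_mm, subject_dist_mm):
    """Approximate total depth of field (thin-lens, moderate distance)."""
    # hyperfocal-style term: f^2 / (N * circle of confusion)
    h = focal_length_mm**2 / (f_number * coc_mm)
    near = h * subject_dist_mm / (h + subject_dist_mm)
    far = (h * subject_dist_mm / (h - subject_dist_mm)
           if subject_dist_mm < h else math.inf)
    return far - near
```

Sampling only the central region (a smaller effective aperture diameter) raises the effective F-number, and the depth-of-field estimate grows accordingly.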
  • FIG. 5 is a plan view of the image sensor 1a of the first modification of the first embodiment of the present invention.
  • the image pickup device 1a shown in FIG. 5 has a plurality of microlenses 10, a light-shielding layer 20, a plurality of light-receiving elements, and a plurality of charge-voltage conversion elements 40.
  • FIG. 5 shows a state in which the semiconductor substrate constituting the image pickup device 1a is viewed in a plane in a direction perpendicular to the main surface of the semiconductor substrate.
  • the microlens 10 and the light-shielding layer 20 are not shown in FIG.
  • a light receiving element group PG1, a light receiving element group PG2, a light receiving element group PG3, and a light receiving element group PG4 are arranged.
  • the description of the same light receiving element group as the light receiving element group shown in FIG. 1 will be omitted.
  • the light receiving element group PG4 includes a light receiving element 30c1, a light receiving element 30c2, a light receiving element 30c3, and a light receiving element 30c4.
  • the light receiving element group PG4 is arranged at predetermined intervals in the vertical direction and the horizontal direction.
  • the four light receiving elements included in the light receiving element group PG4 receive green light and blue light, and generate a signal charge based on the received green light and blue light.
  • the light receiving element 30c1 and the light receiving element 30c3 are first light receiving elements that generate a first signal charge for generating a color image.
  • the light receiving element 30c2 and the light receiving element 30c4 are second light receiving elements that generate a second signal charge for detecting the phase difference.
  • the plurality of light receiving elements of the image pickup element 1a receive light in a band wider than the band of light received by the plurality of light receiving elements of the image pickup element 1 shown in FIG. Therefore, the image sensor 1a can irradiate the subject with narrow band light different for each frame to obtain the spectral characteristics of the subject, or irradiate the subject with white light to perform high-sensitivity imaging.
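The frame-sequential narrow-band scheme mentioned above (a different narrow band per frame, accumulated into the spectral characteristics of the subject) can be sketched as follows. The wavelengths and helper names are illustrative assumptions, not part of the patent:

```python
# Frame-sequential spectral capture: illuminate the subject with one
# narrow band per frame and tag each captured frame with its
# wavelength, building up the subject's spectral response over
# successive frames.
def acquire_spectrum(wavelengths_nm, capture_frame):
    """capture_frame(wl) -> 2-D frame; returns {wavelength: frame}."""
    return {wl: capture_frame(wl) for wl in wavelengths_nm}
```

A usage example with a stand-in capture function:

```python
spectrum = acquire_spectrum([415, 540, 650], lambda wl: [[wl / 1000.0]])
```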
  • FIG. 6 is a plan view of the image sensor 1b of the second modification of the first embodiment of the present invention.
  • the image pickup device 1b shown in FIG. 6 has a plurality of microlenses 10, a light-shielding layer 20, a plurality of light-receiving elements, and a plurality of charge-voltage conversion elements 40.
  • FIG. 6 shows a state in which the semiconductor substrate constituting the image pickup device 1b is viewed in a plane in a direction perpendicular to the main surface of the semiconductor substrate.
  • the microlens 10 and the light-shielding layer 20 are not shown in FIG.
  • the light receiving element group PG11 includes a light receiving element 30r1, a light receiving element 30c2, a light receiving element 30r3, and a light receiving element 30c4.
  • the light receiving element group PG12 includes a light receiving element 30g1, a light receiving element 30c2, a light receiving element 30g3, and a light receiving element 30c4.
  • the light receiving element group PG13 includes a light receiving element 30b1, a light receiving element 30c2, a light receiving element 30b3, and a light receiving element 30c4. Each light receiving element is the same as the light receiving element shown in FIG. 1 or FIG.
  • the light receiving element 30r1, the light receiving element 30r3, the light receiving element 30g1, the light receiving element 30g3, the light receiving element 30b1, and the light receiving element 30b3 are first light receiving elements that generate a first signal charge for generating a color image.
  • the light receiving element 30c2 and the light receiving element 30c4 are second light receiving elements that generate a second signal charge for detecting the phase difference.
  • the light receiving element 30c2 and the light receiving element 30c4 may have sensitivity not only to green light and blue light but to the entire wavelength region.
  • the charge-voltage conversion element 40 individually converts the signal charge generated by the second light-receiving element into a voltage.
  • the second light receiving element receives light in a wavelength region wider than the wavelength region of the light received by the first light receiving element. Therefore, the signal charge generated by the second light receiving element is relatively large. As a result, a high S/N ratio is realized and the accuracy of distance measurement is improved.
  • the charge-voltage conversion element 40 mixes the signal charges generated by the two first light-receiving elements included in the light-receiving element group and converts them into a voltage. As a result, a high S/N ratio is realized and the image quality of the color image is improved.
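One way to see why mixing the two first signal charges before conversion improves the S/N ratio: summing the charges before a single read doubles the signal while shot noise grows only as the square root, and only one dose of readout noise is added. A hedged sketch of this standard noise model (the functions and numbers are illustrative, not from the patent):

```python
import math

def snr_shot(signal_e, read_noise_e):
    """SNR with shot noise sqrt(S) and readout noise in quadrature."""
    return signal_e / math.sqrt(signal_e + read_noise_e**2)

# Mixing two charges at the charge-voltage conversion node sums the
# signal before a single read, instead of performing two noisy reads.
def snr_mixed(signal_e, read_noise_e):
    return snr_shot(2 * signal_e, read_noise_e)
```

With zero readout noise the improvement is exactly a factor of sqrt(2); with nonzero readout noise the gain is larger, because only one read contributes noise.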
  • in NBI (Narrow Band Imaging), a blood vessel at a shallow position is detected based on a blue image, and a blood vessel at a deep position is detected based on a green image.
  • since the red color filter is not required in this imaging method, for example, the light receiving element 30r1 and the light receiving element 30r3 may be replaced with light receiving elements similar to the light receiving element 30g1 or the light receiving element 30b1.
  • FIG. 7 is a plan view of the image sensor 1c according to the second embodiment of the present invention.
  • the image pickup device 1c shown in FIG. 7 has a plurality of microlenses 10, a plurality of light receiving elements, and a plurality of charge-voltage conversion elements 40.
  • the image pickup device 1c does not have the light-shielding layer 20 shown in FIG.
  • FIG. 7 shows a state in which the semiconductor substrate constituting the image pickup device 1c is viewed in a plane in a direction perpendicular to the main surface of the semiconductor substrate.
  • the image pickup device 1c has a structure in which a plurality of layers are laminated. The state of the plurality of layers is shown in FIG.
  • the light receiving element group PG21 has nine light receiving elements 30r.
  • the light receiving element group PG22 has nine light receiving elements 30g.
  • the light receiving element group PG23 has nine light receiving elements 30b.
  • the light receiving element 30rc is at the center of the nine light receiving elements 30r included in the light receiving element group PG21.
  • the light receiving element 30gc is located at the center of the nine light receiving elements 30g included in the light receiving element group PG22.
  • the light receiving element 30bc is located at the center of the nine light receiving elements 30b included in the light receiving element group PG23.
  • the light receiving element 30rc, the light receiving element 30gc, and the light receiving element 30bc are first light receiving elements that generate a first signal charge for generating a color image.
  • the eight light receiving elements 30r surrounding the light receiving element 30rc, the eight light receiving elements 30g surrounding the light receiving element 30gc, and the eight light receiving elements 30b surrounding the light receiving element 30bc are second light receiving elements that generate a second signal charge for detecting the phase difference.
  • the charge-voltage conversion element 40 converts the signal charge generated by the first light-receiving element into a voltage. Since the first light receiving element receives light that has passed through a position close to the center of the imaging lens 50, imaging is performed at a deep depth of field.
  • the charge-voltage conversion element 40 individually converts the signal charge generated by the second light-receiving element into a voltage.
  • the signal is read from the eight second light receiving elements of each light receiving element group and averaged. As a result, a signal suitable for high-precision distance measurement can be obtained.
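The averaging step above reduces the random noise of the phase-detection signal by roughly the square root of the number of elements averaged. A minimal sketch under a Gaussian-noise assumption (the signal level, noise level, and function names are illustrative):

```python
import random
import statistics

def averaged_readout(true_signal, noise_sigma, n_elements, rng):
    """Average one noisy read from each of n_elements identical elements."""
    reads = [true_signal + rng.gauss(0.0, noise_sigma)
             for _ in range(n_elements)]
    return statistics.fmean(reads)

rng = random.Random(0)  # fixed seed for a repeatable illustration
single = [100.0 + rng.gauss(0.0, 8.0) for _ in range(2000)]
avg8 = [averaged_readout(100.0, 8.0, 8, rng) for _ in range(2000)]
```

Over many trials the eight-element average scatters about sqrt(8) ≈ 2.8 times less than a single element's read, which is the sense in which the averaged signal suits high-precision distance measurement.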
  • the first light receiving element and the second light receiving elements included in each light receiving element group need to form an array in which one of the number of rows and the number of columns is 3 or more and the other is 1 or more.
  • the first light receiving element is arranged at the center of the array, and the second light receiving element is arranged at the outer periphery of the arrangement.
  • the number of light receiving element groups and the number of light receiving elements included in the light receiving element group are not limited to the example shown in FIG.
  • the number of rows in the array of light receiving elements may be 1, and the number of columns in the array may be 3.
  • the three light receiving elements are arranged in a row.
  • One first light receiving element is arranged in the center of the array and two second light receiving elements are arranged on both sides of the first light receiving element.
  • the image sensor 1c can perform imaging at a deep depth of field and can obtain a signal charge suitable for highly accurate distance measurement.
  • FIG. 8 is a plan view of the image sensor 1d according to the third embodiment of the present invention.
  • the image pickup device 1d shown in FIG. 8 has a plurality of microlenses 10, a plurality of light receiving elements, a plurality of charge-voltage conversion elements 40, and an infrared transmission layer 70.
  • FIG. 8 shows a state in which the semiconductor substrate constituting the image pickup device 1d is viewed in a plane in a direction perpendicular to the main surface of the semiconductor substrate.
  • the image pickup device 1d has a structure in which a plurality of layers are laminated. The state of the plurality of layers is shown in FIG.
  • the light receiving element group PG31 includes a light receiving element 30r1, a light receiving element 30g2, a light receiving element 30b3, and a light receiving element 30g4.
  • each light receiving element functions both as a first light receiving element that generates a first signal charge for generating a color image and as a second light receiving element that generates a second signal charge for detecting a phase difference.
  • the infrared transmissive layer 70 is arranged between the plurality of microlenses 10 and the plurality of light receiving elements.
  • the infrared transmission layer 70 is a thin film formed of a material that transmits infrared light and blocks visible light.
  • the opening OP2 is formed at a position overlapping the microlens 10 and the light receiving element group PG31.
  • the opening OP2 overlaps the central portion of the light receiving element group PG31.
  • the opening OP2 overlaps a part of each of the four light receiving elements included in the light receiving element group PG31.
  • the infrared transmission layer 70 transmits infrared light that has passed through the microlens 10.
  • the infrared light transmitted through the infrared transmission layer 70 is incident on the four light receiving elements included in each light receiving element group PG31.
  • the infrared transmission layer 70 blocks a part of visible light that has passed through the microlens 10.
  • a part of the light that has passed through the microlens 10 passes through the aperture OP2 and is incident on the four light receiving elements included in each light receiving element group PG31.
  • the imaging lens 50, the imaging element 1d, and the light source device shown in FIG. 2 are included in the imaging device.
  • the light source device is not shown in FIG.
  • the light source device 101 shown in FIG. 10 in the fourth embodiment described later may be used.
  • the light source device can switch between a first state of generating visible light and a second state of generating infrared light.
  • the state of the light source device is one of a first state and a second state.
  • the infrared transmissive layer 70 includes a material that transmits infrared light that has passed through the imaging lens 50 and the microlens 10 and blocks visible light that has passed through the imaging lens 50 and the microlens 10.
  • the light receiving element group PG31 has two or more light receiving elements.
  • An aperture OP2 is formed in the infrared transmissive layer 70 so that light that has passed through the imaging lens 50 and the microlens 10 is incident on two or more light receiving elements.
  • when the state of the light source device is the first state, the two or more light receiving elements function as the first light receiving element. The two or more light receiving elements receive visible light that has passed through the imaging lens 50, the microlens 10, and the aperture OP2, and generate a first signal charge based on the received visible light.
  • when the state of the light source device is the second state, the two or more light receiving elements function as the second light receiving element. The two or more light receiving elements receive infrared light that has passed through the imaging lens 50, the microlens 10, and the infrared transmission layer 70, and generate a second signal charge based on the received infrared light.
  • the number of light receiving element groups and the number of light receiving elements included in the light receiving element group are not limited to the example shown in FIG.
  • FIG. 9 shows the respective timings of light emission and signal reading.
  • when the state of the light source device is the first state, the light source device generates visible light. For example, the light source device generates white light.
  • the infrared transmission layer 70 functions as a light-shielding layer. A part of the light that has passed through the imaging lens 50 and the microlens 10 is blocked by the infrared transmission layer 70. A part of the light that has passed through the imaging lens 50 and the microlens 10 passes through the aperture OP2 and is incident on the light receiving element 30r1, the light receiving element 30g2, the light receiving element 30b3, and the light receiving element 30g4. Each light receiving element produces a first signal charge.
  • each light receiving element receives light that has passed through a position near the center of the imaging lens 50. Therefore, each light receiving element receives light that has passed through an imaging lens having a substantially large F value. As a result, imaging is performed with a deep depth of field.
  • the state of the light source device becomes the second state, and the light source device generates infrared light.
  • the light source device generates near-infrared light having a wavelength near 850 nm.
  • the infrared light passes through the imaging lens 50, the microlens 10, and the infrared transmission layer 70, and is incident on the light receiving element 30r1, the light receiving element 30g2, the light receiving element 30b3, and the light receiving element 30g4.
  • Each light receiving element produces a second signal charge.
  • each light receiving element receives light that has passed through a position far from the center of the imaging lens 50. Therefore, each light receiving element receives light that has passed through an imaging lens having a substantially small F value. As a result, a signal charge suitable for highly accurate distance measurement can be obtained.
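The alternation described above — visible light for color imaging, infrared light for phase-difference charges — can be sketched as a simple frame plan matching the timing of FIG. 9. The frame labels are illustrative names, not terms from the patent:

```python
# Alternate the light source between the first state (visible light,
# read the first signal for the color image) and the second state
# (infrared light, read the second signal for phase detection).
def frame_plan(n_frames):
    plan = []
    for i in range(n_frames):
        state = "visible" if i % 2 == 0 else "infrared"
        signal = "first" if state == "visible" else "second"
        plan.append((state, signal))
    return plan
```

Each exposure is read out with the signal type that matches the current illumination, so color frames and ranging frames interleave without any change to the pixel array itself.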
  • the image sensor 1d can perform imaging at a deep depth of field and can obtain a signal charge suitable for highly accurate distance measurement.
  • FIG. 10 shows the configuration of the endoscope device 2 according to the fourth embodiment of the present invention.
  • the endoscope device 2 shown in FIG. 10 includes a light source unit 100, a scope 110, a control unit 120, and a monitor 130.
  • the light source unit 100 has a light source device 101 and a lens 102.
  • the light source device 101 generates light.
  • the light source device 101 may function as the light source device of the third embodiment that generates visible light and infrared light.
  • the lens 102 causes the light generated from the light source device 101 to enter the scope 110.
  • the scope 110 includes a light guide 111, an illumination lens 112, an imaging lens 113, and an image pickup element 114.
  • the illumination lens 112, the image pickup lens 113, and the image pickup element 114 are arranged at the tip 110a of the scope 110.
  • the light generated from the light source device 101 enters the light guide 111 via the lens 102.
  • the light guide 111 transmits the light generated from the light source device 101 to the tip 110a of the scope 110.
  • the light transmitted by the light guide 111 is applied to the subject SB1 by the illumination lens 112.
  • the imaging lens 113 is arranged adjacent to the illumination lens 112.
  • the light reflected by the subject SB1 is incident on the imaging lens 113.
  • the image pickup lens 113 causes the light from the subject SB1 to enter the image pickup element 114.
  • the image sensor 114 is the same as the image sensor of any one of the first to third embodiments.
  • the image pickup device 114 includes a first light receiving element that generates a first signal charge for generating a color image, and a second light receiving element that generates a second signal charge for detecting a phase difference.
  • the image sensor 114 generates a first signal based on the first signal charge and a second signal based on the second signal charge.
  • the control unit 120 has at least one of a processor and a logic circuit.
  • the processor is at least one of a CPU (Central Processing Unit), a DSP (Digital Signal Processor), and a GPU (Graphics Processing Unit).
  • the logic circuit is at least one of ASIC (Application Specific Integrated Circuit) and FPGA (Field-Programmable Gate Array).
  • the control unit 120 can include one or more processors.
  • the control unit 120 may include one or more logic circuits.
  • the control unit 120 receives the first signal and the second signal from the image sensor 114.
  • the control unit 120 generates a video signal of the subject SB1 based on the first signal.
  • the control unit 120 calculates the phase difference (parallax) between the two second signals corresponding to the second signal charges generated by the two second light receiving elements that are different from each other.
  • the control unit 120 calculates the distance to the subject SB1 based on the calculated phase difference, and executes AF control based on the calculated distance.
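The phase-difference step above can be sketched as a one-dimensional block match between the two second-signal profiles. The SSD matcher and signal values below are illustrative, and the conversion from the resulting shift to a physical distance (which needs the pupil baseline and lens geometry) is omitted:

```python
def phase_difference(sig_a, sig_b, max_shift):
    """Integer shift of sig_b that best matches sig_a (SSD criterion)."""
    def ssd(shift):
        # mean squared difference over the overlapping samples
        pairs = [(sig_a[i], sig_b[i + shift])
                 for i in range(len(sig_a))
                 if 0 <= i + shift < len(sig_b)]
        return sum((a - b) ** 2 for a, b in pairs) / len(pairs)
    return min(range(-max_shift, max_shift + 1), key=ssd)
```

For a subject feature displaced by two samples between the two pupil views, the matcher recovers that shift; the control unit would then map the shift to a subject distance and drive the AF actuator accordingly.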
  • the control unit 120 may measure the shape of the subject SB1 by calculating the distance to the subject SB1.
  • the monitor 130 is a liquid crystal display, an organic EL (Electroluminescence) display, or the like.
  • the monitor 130 displays a color image based on the video signal generated by the control unit 120.
  • the endoscope device 2 has the same image pickup device 114 as any one of the image pickup devices of the first to third embodiments. Therefore, the endoscope device 2 can perform imaging at a deep depth of field and can obtain a signal charge suitable for highly accurate distance measurement.
  • FIG. 11 shows the configuration of the surgical microscope system 3 according to the fifth embodiment of the present invention.
  • the surgical microscope system 3 shown in FIG. 11 includes a mirror body portion 150, a support portion 160, a monitor 170, a monitor arm 180, a control unit 190, and a foot switch 200.
  • the mirror body unit 150 is the main body of the imaging unit for photographing the subject SB2.
  • the support portion 160 supports the mirror body portion 150.
  • the monitor 170 displays a color image of the subject.
  • the monitor arm 180 supports the monitor 170 so that the monitor 170 can move.
  • the foot switch 200 is an operation unit operated by the user U1.
  • FIG. 12 shows the configuration of the mirror body portion 150 and the support portion 160.
  • the mirror body portion 150 includes an imaging lens 151 and an imaging element 152.
  • the image pickup lens 151 causes the light from the subject SB2 to enter the image pickup element 152.
  • the image sensor 152 is the same as the image sensor of any one of the first to third embodiments.
  • the image pickup device 152 includes a first light receiving element that generates a first signal charge for generating a color image, and a second light receiving element that generates a second signal charge for detecting a phase difference.
  • the image sensor 152 generates a first signal based on the first signal charge and generates a second signal based on the second signal charge.
  • a grip 210 is attached to the exterior of the mirror body 150 so that the user U1 can easily move the mirror body 150.
  • a push button type grip switch 211 is arranged on the grip 210. The user U1 holds the grip 210 and presses the grip switch 211 in order to move the mirror body portion 150. As a result, the user U1 moves the support portion 160. When the user U1 releases the grip switch 211, the support portion 160 is fixed in the current form. As a result, the mirror body portion 150 is fixed at the position and orientation desired by the user U1.
  • the support portion 160 has a pedestal 161 fixed at a predetermined position, a plurality of arm rods 162, and a plurality of joints 163 that flexibly connect the arm rods 162.
  • the arm rod 162 has an electromagnetic brake not shown in FIG.
  • the state of the electromagnetic brake is set to either the operating state or the stopped state. When the electromagnetic brake is in the operating state, the support portion 160 is fixed. When the electromagnetic brake is in the stopped state, each joint 163 becomes rotatable.
  • the control unit 190 is arranged on the pedestal 161.
  • the control unit 190 has at least one of a processor and a logic circuit.
  • the control unit 190 can include one or more processors.
  • the control unit 190 can include one or more logic circuits.
  • the control unit 190 receives the first signal and the second signal from the image sensor 152.
  • the control unit 190 generates a video signal of the subject SB2 based on the first signal.
  • the control unit 190 calculates the phase difference (parallax) between the two second signals corresponding to the second signal charges generated by the two second light receiving elements that are different from each other.
  • the control unit 190 calculates the distance to the subject SB2 based on the calculated phase difference, and executes AF control based on the calculated distance.
  • the control unit 190 may measure the shape of the subject SB2 by calculating the distance to the subject SB2.
  • the surgical microscope system 3 has the same image pickup device 152 as any one of the image pickup devices of the first to third embodiments. Therefore, the surgical microscope system 3 can perform imaging at a deep depth of field and can obtain a signal charge suitable for highly accurate distance measurement.
  • the solid-state image sensor, the image pickup device, the endoscope device, and the surgical microscope system can perform imaging at a deep depth of field and can obtain a signal charge suitable for highly accurate distance measurement.


Abstract

The present invention concerns a solid-state imaging element that comprises a plurality of light receiving elements and a microlens. A light receiving element group includes one or more first light receiving elements and two or more second light receiving elements. The first light receiving element receives light that has passed through an imaging lens for forming an optical image of a subject on the plurality of light receiving elements. The two or more second light receiving elements receive light that has passed through mutually different regions of the imaging lens. The first light receiving element receives light that has passed through a first region of the imaging lens, the first region including the center of the imaging lens. The second light receiving element receives light that has passed through a second region of the imaging lens, the second region including a position outside the first region.
PCT/JP2020/016407 2020-04-14 2020-04-14 Élément d'imagerie à semi-conducteurs, dispositif d'imagerie, dispositif d'endoscope et dispositif de microscope pour opération WO2021210060A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2020/016407 WO2021210060A1 (fr) 2020-04-14 2020-04-14 Élément d'imagerie à semi-conducteurs, dispositif d'imagerie, dispositif d'endoscope et dispositif de microscope pour opération

Publications (1)

Publication Number Publication Date
WO2021210060A1 true WO2021210060A1 (fr) 2021-10-21

Family

ID=78085271

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2020/016407 WO2021210060A1 (fr) 2020-04-14 2020-04-14 Élément d'imagerie à semi-conducteurs, dispositif d'imagerie, dispositif d'endoscope et dispositif de microscope pour opération

Country Status (1)

Country Link
WO (1) WO2021210060A1 (fr)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010268972A (ja) * 2009-05-21 2010-12-02 Hoya Corp 医療用観察システムおよびプロセッサ
JP2013145314A (ja) * 2012-01-13 2013-07-25 Canon Inc 画像処理装置、撮像装置、制御方法、及びプログラム
JP2017161512A (ja) * 2016-03-04 2017-09-14 キヤノン株式会社 測距装置及び移動体


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20931036

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20931036

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP