WO2023095518A1 - Light detection device, and electronic apparatus - Google Patents


Info

Publication number
WO2023095518A1
Authority
WO
WIPO (PCT)
Prior art keywords
pixel
pixels
imaging
phase difference
difference detection
Prior art date
Application number
PCT/JP2022/039816
Other languages
English (en)
Japanese (ja)
Inventor
貴大 矢田
Original Assignee
Sony Semiconductor Solutions Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Semiconductor Solutions Corporation
Publication of WO2023095518A1

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/70SSIS architectures; Circuits associated therewith

Definitions

  • the present disclosure relates to photodetection devices and electronic devices.
  • the amount of light mixed in from adjacent pixels differs between imaging pixels that are adjacent to the phase difference detection pixels and imaging pixels that are not, owing to the presence or absence of the light shielding film in the adjacent pixels.
  • the amount of light mixed in from adjacent pixels varies between imaging pixels.
  • One aspect of the present disclosure provides a photodetector and an electronic device capable of suppressing variations in the amount of light mixed in from adjacent pixels between imaging pixels.
  • A photodetector according to one aspect includes a plurality of pixels arranged two-dimensionally. The plurality of pixels includes an imaging pixel including a photoelectric conversion unit, and a phase difference detection pixel including a photoelectric conversion unit, a light shielding film that covers a part of the photoelectric conversion unit, and an opening that is a portion where the photoelectric conversion unit is not covered with the light shielding film. The imaging pixels include a first imaging pixel that is adjacent to another imaging pixel in a specific direction but is not adjacent to the phase difference detection pixel, and a second imaging pixel that is adjacent to the phase difference detection pixel such that the opening of the phase difference detection pixel adjoins it in the specific direction.
  • An electronic device according to one aspect is equipped with a photodetector. The photodetector includes a plurality of pixels arranged two-dimensionally. The plurality of pixels includes an imaging pixel including a photoelectric conversion unit, and a phase difference detection pixel including a photoelectric conversion unit and a light shielding film that partially covers the photoelectric conversion unit. The imaging pixels include a first imaging pixel that is adjacent to another imaging pixel in a specific direction but is not adjacent to the phase difference detection pixel, and a second imaging pixel that is adjacent to the phase difference detection pixel such that the opening of the phase difference detection pixel, the portion where the photoelectric conversion unit is not covered with the light shielding film, adjoins it in the specific direction.
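As a concrete illustration of the first/second imaging pixel distinction above, the following Python sketch classifies the imaging pixels of one pixel row by checking whether a neighboring phase difference detection pixel's opening faces them in the row direction. The encoding and function name are illustrative assumptions, not part of the application.

```python
# Hypothetical encoding (not from the application):
# 'I' = imaging pixel,
# 'A' = phase-difference pixel whose opening faces left (X-negative side),
# 'B' = phase-difference pixel whose opening faces right (X-positive side).
def classify_imaging_pixels(row):
    """Label each imaging pixel in one row as 'first' or 'second'."""
    labels = []
    for x, p in enumerate(row):
        if p != 'I':
            labels.append(None)  # phase-difference pixels are not imaging pixels
            continue
        left = row[x - 1] if x > 0 else None
        right = row[x + 1] if x + 1 < len(row) else None
        # A 'B' on the left or an 'A' on the right has its opening facing
        # this pixel, so this pixel receives the one-sided leakage.
        if left == 'B' or right == 'A':
            labels.append('second')
        else:
            labels.append('first')
    return labels

# Row patterned after FIG. 4: 23B, imaging pixel 22, 23A (openings face pixel 22).
print(classify_imaging_pixels(['I', 'B', 'I', 'A', 'I']))
# → ['first', None, 'second', None, 'first']
```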
  • FIG. 1 is a diagram showing an example of a schematic configuration of a photodetector according to an embodiment
  • FIG. 4 is a diagram showing an example of a schematic configuration of a pixel region
  • FIG. 3 is a diagram schematically showing a cross section taken along line III-III in FIG. 2
  • FIG. 3 is a diagram schematically showing a cross section taken along line IV-IV of FIG. 2
  • FIG. 6 is a diagram schematically showing a cross section taken along line VI-VI of FIG. 5
  • It is a diagram showing a modification of the pixel arrangement
  • FIG. 14 is a diagram schematically showing a cross section taken along line XIV-XIV in FIG. 13
  • It is a diagram showing a modification of the pixel arrangement
  • FIG. 16 is a diagram schematically showing a cross section along line XVI-XVI of FIG. 15
  • It is a diagram showing a modification of the pixel arrangement
  • FIG. 1 is a block diagram showing an example of a schematic configuration of a vehicle control system
  • FIG. 4 is an explanatory diagram showing an example of installation positions of an outside information detection unit and an imaging unit
  • 1 is a diagram showing an example of a schematic configuration of an endoscopic surgery system
  • FIG. 3 is a block diagram showing an example of functional configurations of a camera head and a CCU;
  • FIG. 1 is a diagram showing an example of a schematic configuration of a photodetector according to an embodiment.
  • the illustrated photodetector 1 is a solid-state imaging device, and each element is provided on a substrate 11 (for example, a silicon substrate or the like).
  • the photodetector 1 includes a pixel region 3 , a vertical drive circuit 4 , a column signal processing circuit 5 , a horizontal drive circuit 6 , an output circuit 7 , a control circuit 8 , vertical signal lines 9 , horizontal signal lines 10 , and an input/output terminal 12 .
  • the pixel region 3 includes a plurality of pixels 2.
  • a plurality of pixels 2 are arranged in a two-dimensional array.
  • An XYZ coordinate system for the pixel area 3 is illustrated.
  • the X-axis direction corresponds to the row direction of the array, for example, the lateral direction (horizontal direction) of the photodetector 1 .
  • the Y-axis direction corresponds to the column direction of the array, for example, the longitudinal direction (perpendicular direction) of the photodetector 1 .
  • the Z-axis direction corresponds to, for example, the front-rear direction of the photodetector 1 .
  • the photodetector 1 particularly detects light from the front (light traveling along the negative direction of the Z axis).
  • the pixel 2 includes a photoelectric conversion unit (corresponding to a photoelectric conversion unit 201 described later), a pixel transistor, and the like.
  • the photoelectric conversion unit includes a photoelectric conversion element such as a photodiode (PD).
  • the pixel transistor is, for example, a MOS (Metal Oxide Semiconductor) transistor.
  • a plurality of pixel transistors may be provided in one pixel 2 . Examples of the plurality of pixel transistors are transfer transistors, reset transistors, amplification transistors, selection transistors, and the like.
  • transfer transistors are used to transfer charges generated in photoelectric conversion elements.
  • a reset transistor is used to reset the charge.
  • Amplification transistors are used to generate pixel signals corresponding to charges.
  • the select transistor is used to make the pixel signal appear on the signal line.
  • Various known circuit configurations including these pixel transistors may be provided in each pixel 2 .
  • a shared pixel structure may be adopted.
  • For example, several pixels 2 may each have their own photoelectric conversion unit and transfer transistor while sharing one floating diffusion and one each of the other pixel transistors.
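The shared pixel structure described above can be illustrated with a small behavioral sketch (a simplified model, not circuitry from the application): four photodiodes transfer their charge in turn into one shared floating diffusion, which is reset between transfers.

```python
# Simplified behavioral model of a shared-pixel unit (illustrative only).
class SharedPixelUnit:
    def __init__(self, n_photodiodes=4):
        self.pd = [0] * n_photodiodes  # charge accumulated per photodiode
        self.fd = 0                    # the single shared floating diffusion

    def expose(self, charges):
        """Accumulate photo-generated charge in each photodiode."""
        for i, q in enumerate(charges):
            self.pd[i] += q

    def read_all(self):
        """Reset, transfer, and read each photodiode in turn through the shared FD."""
        signals = []
        for i in range(len(self.pd)):
            self.fd = 0              # reset transistor clears the floating diffusion
            self.fd += self.pd[i]    # transfer transistor moves this pixel's charge
            self.pd[i] = 0
            signals.append(self.fd)  # amplification/selection put it on the line
        return signals

unit = SharedPixelUnit()
unit.expose([10, 20, 30, 40])
print(unit.read_all())  # one signal per photodiode despite the shared FD
```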
  • the vertical drive circuit 4 , column signal processing circuit 5 , horizontal drive circuit 6 , output circuit 7 and control circuit 8 constitute a peripheral circuit section of the pixel region 3 .
  • the control circuit 8 will be explained first.
  • the control circuit 8 receives an input clock and data instructing the operation mode, etc., and outputs data such as internal information of the photodetector 1 .
  • the control circuit 8 generates a clock signal and a control signal that serve as references for operations of the vertical drive circuit 4, the column signal processing circuit 5, the horizontal drive circuit 6, etc., based on the vertical synchronization signal, horizontal synchronization signal, and master clock.
  • the control circuit 8 inputs (supplies) the generated signals to the vertical driving circuit 4, the column signal processing circuit 5, the horizontal driving circuit 6, and the like.
  • the vertical drive circuit 4 includes, for example, a shift register.
  • the vertical drive circuit 4 selects a pixel drive wiring, supplies a pulse for driving the pixels to the selected pixel drive wiring, and drives the pixels 2 row by row.
  • the vertical drive circuit 4 sequentially selectively scans the pixels 2 in the vertical direction row by row, and supplies pixel signals from the pixels 2 to the column signal processing circuit 5 through vertical signal lines 9 .
  • the column signal processing circuit 5 is arranged, for example, for each pixel column, and performs signal processing such as noise removal on pixel signals from the pixels 2 of one row for each pixel column.
  • the column signal processing circuit 5 performs signal processing such as CDS (Correlated Double Sampling) for removing fixed pattern noise unique to the pixels 2, signal amplification, and AD (Analog to Digital) conversion.
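As a rough numerical illustration of how the CDS mentioned above removes a pixel's fixed-pattern offset: the column circuit samples the pixel's reset level and its signal level, and because both carry the same per-pixel offset, the difference cancels it. The values and function below are invented for illustration.

```python
def cds(reset_sample, signal_sample):
    """Correlated double sampling: the per-pixel fixed offset appears in both
    samples and cancels in the difference."""
    return reset_sample - signal_sample

# A pixel with a fixed-pattern offset of 7: the reset level is 107, and the
# transferred photo-charge pulls the signal level down by 50.
offset = 7
reset_level = 100 + offset
signal_level = 100 + offset - 50
print(cds(reset_level, signal_level))  # the offset cancels, leaving 50
```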
  • a horizontal selection switch (not shown) is connected between the output stage of the column signal processing circuit 5 and the horizontal signal line 10 .
  • the horizontal driving circuit 6 includes, for example, a shift register. By sequentially outputting horizontal scanning pulses, the horizontal drive circuit 6 sequentially selects each of the column signal processing circuits 5 and causes each of the column signal processing circuits 5 to output a pixel signal to the horizontal signal line 10 .
  • the output circuit 7 performs signal processing on pixel signals sequentially supplied from each of the column signal processing circuits 5 through the horizontal signal line 10 and outputs the processed signals. For example, it performs buffering, black level adjustment, column variation correction, various digital signal processing, and the like.
  • the input/output terminal 12 exchanges signals with the outside.
  • FIG. 2 is a diagram showing an example of a schematic configuration of a pixel region.
  • One pixel 2 corresponds to one color. The three primary colors red R, green G, and blue B are taken as an example for explanation.
  • Each of the plurality of pixels 2 corresponds to one of red R, green G and blue B.
  • Each pixel 2 is illustrated with different hatching so that it can be distinguished by color.
  • a plurality of pixels 2 are arranged in a Bayer array with 1 ⁇ 1 pixels as a basic pattern.
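The 1×1-basic-pattern Bayer array referred to above can be sketched as a repeating 2×2 color tile. The tile orientation below is an assumption for illustration; the application's figure may order the colors differently.

```python
def bayer_color(x, y):
    """Color of pixel (x, y) in a repeating 2x2 Bayer tile (assumed layout)."""
    tile = [['G', 'R'],   # even rows: G R G R ...
            ['B', 'G']]   # odd rows:  B G B G ...
    return tile[y % 2][x % 2]

row0 = [bayer_color(x, 0) for x in range(4)]
row1 = [bayer_color(x, 1) for x in range(4)]
print(row0, row1)  # → ['G', 'R', 'G', 'R'] ['B', 'G', 'B', 'G']
```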
  • the plurality of pixels 2 includes imaging pixels 21 and 22 and phase difference detection pixels 23 .
  • the imaging pixel 21 and the imaging pixel 22 are normal pixels including photoelectric conversion units and the like. Pixel signals of the imaging pixels 21 and 22 form a captured image or the like.
  • the imaging pixel 21 is a first imaging pixel that is adjacent to another imaging pixel (another imaging pixel 21 in this example) but not to a phase difference detection pixel 23 in a specific direction (the X-axis direction in this example). Description will also be made with reference to FIG. 3.
  • FIG. 3 is a diagram schematically showing a cross section along line III-III in FIG.
  • the imaging pixel 21 includes a photoelectric conversion unit 201, a color filter 202, a lens 203, and a light shielding wall 204.
  • Illustrations of other elements included in the imaging pixel 21, such as pixel transistors, are omitted. Materials for each element may be appropriately selected within the scope of common general technical knowledge, unless otherwise specified.
  • the photoelectric conversion unit 201 is, for example, a photoelectric conversion element such as the photodiode described above.
  • a color filter 202 is provided for the photoelectric conversion unit 201 .
  • the color filter 202 is provided in front of the photoelectric conversion unit 201 (positive side of the Z-axis) and allows light of the color corresponding to the imaging pixel 21 to pass therethrough.
  • the color filter 202 that transmits green G light is labeled with the letter "G”.
  • the color filter 202 that transmits red R light is labeled with the letter “R”.
  • the lens 203 is a micro-lens (on-chip lens) provided on the color filter 202, in front of it (on the Z-axis positive side), so as to condense light onto the photoelectric conversion unit 201.
  • a light shielding wall 204 is provided between the photoelectric conversion unit 201 and the color filter 202 so as to block light from the adjacent imaging pixels 21 .
  • the light shielding wall 204 may be shared between adjacent pixels 2 .
  • the imaging pixel 21 is adjacent to another imaging pixel 21 in a specific direction.
  • the imaging pixel 21 corresponding to green G, the imaging pixel 21 corresponding to red R, and the imaging pixel 21 corresponding to green G are arranged adjacently in this order in the positive direction of the X axis.
  • part of the light that has passed through the lens 203 and the color filter 202 of the imaging pixel 21 mixes into the photoelectric conversion unit 201 of the adjacent imaging pixel 21.
  • For example, light from the imaging pixel 21 corresponding to green G mixes into the photoelectric conversion unit 201 of the imaging pixel 21 corresponding to red R, so color mixture occurs in the imaging pixels 21.
  • the imaging pixel 22 is a second imaging pixel adjacent to the phase difference detection pixel 23 so that the opening op (described later) of the phase difference detection pixel 23 is adjacent in a specific direction.
  • Pixel signals from the phase difference detection pixels 23 are used, for example, to obtain an image plane phase difference AF function.
  • the phase difference detection pixels 23 are used as a pair.
  • One phase difference detection pixel 23 of the pair is illustrated as a phase difference detection pixel 23A.
  • the other phase difference detection pixel 23 of the pair is illustrated as a phase difference detection pixel 23B. Description will also be made with reference to FIG. 4.
  • FIG. 4 is a diagram schematically showing a cross section along line IV-IV in FIG.
  • a phase difference detection pixel 23B corresponding to green G, an imaging pixel 22 corresponding to red R, and a phase difference detection pixel 23A corresponding to green G are arranged adjacent to each other in this order in the positive direction of the X axis.
  • the color filters 202 of the phase difference detection pixel 23A and the phase difference detection pixel 23B pass light of a color different from the color of light passed by the color filter 202 of the imaging pixel 22 .
  • the phase difference detection pixel 23A and the phase difference detection pixel 23B differ from the imaging pixel 21 (FIG. 3) and the imaging pixel 22 in that they include the light shielding film 205 and the opening op.
  • the light shielding film 205 is provided between the photoelectric conversion unit 201 and the color filter 202 (for example, on the photoelectric conversion unit 201) so as to partially cover the photoelectric conversion unit 201.
  • An example of the material of the light shielding film 205 is tungsten (W).
  • the opening op is a portion of the phase difference detection pixel 23 where the photoelectric conversion unit 201 is not covered with the light shielding film 205 .
  • the opening op exposes a portion of the photoelectric conversion unit 201 on the imaging pixel 22 side. That is, the phase difference detection pixel 23 is arranged so that the opening op is located on the imaging pixel 22 side.
  • the light shielding film 205 and the opening op of the phase difference detection pixel 23A are referred to as the light shielding film 205A and the opening opA.
  • the opening opA of the phase difference detection pixel 23A exposes one side portion of the photoelectric conversion section 201 in a specific direction (in this example, the portion on the X-axis negative direction side).
  • the phase difference detection pixel 23A is arranged such that the opening opA is located on the imaging pixel 22 side (in this example, on the X-axis negative direction side).
  • the phase difference detection pixel 23A has a structure in which the left side is opened and the right side is shielded from light.
  • the light shielding film 205 and the opening op of the phase difference detection pixel 23B are shown as a light shielding film 205B and an opening opB.
  • the opening opB of the phase difference detection pixel 23B exposes the other side portion of the photoelectric conversion unit 201 in the specific direction (in this example, the portion on the X-axis positive direction side).
  • the phase difference detection pixel 23B is arranged such that the opening opB is located on the imaging pixel 22 side (in this example, on the X-axis positive direction side).
  • the phase difference detection pixel 23B has a structure in which the right side is opened and the left side is shielded from light.
  • since phase difference detection using the phase difference detection pixels 23A and 23B as described above and the specific processing related to image plane phase difference AF are well known, detailed description thereof is omitted.
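Since the application treats the phase difference computation as well known, the following is only a generic sketch of the idea, not the application's method: the 1-D signals gathered from left-open (23A-type) and right-open (23B-type) pixels are slid against each other, and the shift with the smallest mean absolute difference indicates the defocus-induced phase difference that drives the lens during image plane phase difference AF.

```python
def estimate_shift(left, right, max_shift=3):
    """Return the shift (in pixels) that best aligns `right` onto `left`."""
    best_shift, best_cost = 0, float('inf')
    for s in range(-max_shift, max_shift + 1):
        cost, count = 0.0, 0
        for i in range(len(left)):
            j = i + s
            if 0 <= j < len(right):
                cost += abs(left[i] - right[j])
                count += 1
        cost /= count  # normalize by overlap so large shifts are not penalized
        if cost < best_cost:
            best_shift, best_cost = s, cost
    return best_shift

# The same edge seen through the left-open and right-open pixel groups,
# displaced by 2 pixels when the scene is out of focus (invented values).
left = [0, 1, 5, 9, 5, 1, 0, 0]
right = [0, 0, 0, 1, 5, 9, 5, 1]
print(estimate_shift(left, right))  # → 2
```

In-focus would correspond to an estimated shift of zero; the AF loop moves the lens until that condition is reached.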
  • part of the light that has passed through the lens 203 and the color filter 202 of the phase difference detection pixel 23 passes through the opening op and mixes into the photoelectric conversion unit 201 of the adjacent imaging pixel 22.
  • For example, light from the phase difference detection pixel 23 corresponding to green G mixes into the photoelectric conversion unit 201 of the imaging pixel 22 corresponding to red R, so color mixture occurs in the imaging pixel 22.
  • As described above, in the photodetector 1, light from adjacent pixels mixes into the imaging pixel 22 in the same manner as into the imaging pixel 21. Variation between the imaging pixel 21 and the imaging pixel 22 (between imaging pixels) in the amount of light mixed in from adjacent pixels (variation in sensitivity), and hence variation in the degree of color mixture, can be suppressed, which makes it possible to suppress the deterioration of image quality caused by such variation.
  • a comparative example is also used for explanation.
  • FIG. 5 is a diagram showing a comparative example.
  • a phase difference detection pixel 23A corresponding to green G, an imaging pixel 22E corresponding to red R, and a phase difference detection pixel 23B corresponding to green G are arranged adjacent to each other in this order in the positive direction of the X axis.
  • FIG. 6 is a diagram schematically showing a cross section taken along line VI-VI in FIG. 5. Part of the light that has passed through the lens 203 and the color filter 202 of the phase difference detection pixel 23 is blocked by the light shielding film 205, as schematically indicated by the outline arrows. Specifically, in the phase difference detection pixel 23A, light is blocked by the light shielding film 205A, and in the phase difference detection pixel 23B, by the light shielding film 205B. Mixing of light from the phase difference detection pixels 23 into the imaging pixel 22E is thus suppressed, so the variation in the amount of light mixed in from adjacent pixels increases between the imaging pixel 21 and the imaging pixel 22E. This problem is addressed by the photodetector 1 according to the embodiment, as described above.
  • Note that when an imaging pixel is adjacent to the light-shielded side of a phase difference detection pixel 23, the light from that adjacent phase difference detection pixel 23 is blocked by its light shielding film 205 and does not mix in. Even in such a pixel 2, however, it is only the light from the adjacent pixel on one side that does not mix in.
  • The amount of light mixed in from adjacent pixels can therefore be brought closer to the amount mixed into the imaging pixel 21 than in the above comparative example, in which the light from the adjacent pixels on both sides does not mix in. Since the influence of the difference in the mixed amount is small (only about half that of the comparative example), it can easily be dealt with by signal processing such as correction.
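The correction by signal processing mentioned above could, for instance, add back the one-sided leakage that such a pixel is missing relative to a first imaging pixel. This is an illustrative sketch; the crosstalk ratio is an assumed calibration value, not a figure from the application.

```python
# Assumed, illustrative fraction of a neighbor's light that leaks into a pixel.
# A real sensor would calibrate this per color and per position.
CROSSTALK_RATIO = 0.02

def correct_one_sided_pixel(signal, missing_neighbor_signal):
    """Compensate a pixel that receives leakage from only one side by adding
    back the estimated leakage from the shielded side, so it matches an
    imaging pixel that receives leakage from both neighbors."""
    return signal + CROSSTALK_RATIO * missing_neighbor_signal

# Pixel reads 1000.0; its shielded-side neighbor would have contributed
# leakage proportional to its own signal of 800.0.
print(correct_one_sided_pixel(1000.0, 800.0))  # → 1016.0
```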
  • the Bayer array of 1 ⁇ 1 pixels has been described as an example of the pixel array.
  • various pixel arrangements other than this may be adopted.
  • the case where the color corresponding to the imaging pixel 22 is red R has been described as an example.
  • the color corresponding to the imaging pixel 22 may of course be a color other than red R. Description will be made with reference to FIGS. 7 to 12.
  • FIG. 7 is a diagram showing a modification of the pixel array.
  • the plurality of pixels 2 are arranged with 3 ⁇ 3 pixels as a unit pattern.
  • the unit pattern is composed of two pixels 2 corresponding to red R, two pixels 2 corresponding to blue B, and five pixels 2 corresponding to green G.
  • A phase difference detection pixel 23B corresponding to green G, an imaging pixel 22 corresponding to blue B, and a phase difference detection pixel 23A corresponding to green G are arranged adjacent to each other in this order.
  • the phase difference detection pixel 23 is arranged such that the opening op is positioned on the imaging pixel 22 side.
  • the image pickup pixel 22 is mixed with light from adjacent pixels to the same extent as the image pickup pixel 21 .
  • the paired phase difference detection pixel 23A and phase difference detection pixel 23B may be any combination: they may be two phase difference detection pixels 23 located on opposite sides of the imaging pixel 22, or two phase difference detection pixels 23 arranged with the imaging pixels 22 between them.
  • FIG. 8 is a diagram showing a modification of the pixel array.
  • the plurality of pixels 2 are arranged in a Bayer array (quad Bayer array) using 2 ⁇ 2 pixels 2 corresponding to the same color as a unit pattern.
  • the pixel signals of each pixel 2 in the unit pattern are added, for example, and read out.
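The addition of pixel signals within a quad-Bayer unit pattern can be sketched as 2×2 binning. This is a simplified model that assumes an even-sized frame of raw signal values.

```python
def bin_quad_bayer(frame):
    """Sum each non-overlapping 2x2 block of a 2-D list of pixel signals,
    producing one value per same-color unit pattern (even dimensions assumed)."""
    h, w = len(frame), len(frame[0])
    return [[frame[y][x] + frame[y][x + 1] + frame[y + 1][x] + frame[y + 1][x + 1]
             for x in range(0, w, 2)]
            for y in range(0, h, 2)]

# Two quad-Bayer unit patterns side by side in each row pair (invented values).
frame = [[1, 2, 10, 20],
         [3, 4, 30, 40],
         [5, 6, 50, 60],
         [7, 8, 70, 80]]
print(bin_quad_bayer(frame))  # → [[10, 100], [26, 260]]
```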
  • the phase difference detection pixels 23 may be arranged relatively densely as shown in FIG. 8(A), or relatively sparsely as shown in FIG. 8(B). There is no particular restriction on the density of the phase difference detection pixels 23, and they may be arranged at any period.
  • A phase difference detection pixel 23B corresponding to green G, two imaging pixels 22 corresponding to red R, and a phase difference detection pixel 23A corresponding to green G are arranged adjacent to each other in this order.
  • the phase difference detection pixel 23 is arranged such that the opening op is located on the imaging pixel 22 side.
  • the image pickup pixel 22 is mixed with light from adjacent pixels to the same extent as the image pickup pixel 21 .
  • a comparative example is also used for explanation.
  • FIG. 9 is a diagram showing a comparative example.
  • a phase difference detection pixel 23A corresponding to green G, two imaging pixels 22E corresponding to red R, and a phase difference detection pixel 23B corresponding to green G are arranged adjacent to each other in this order.
  • the phase difference detection pixel 23 is arranged such that the opening op is located on the side opposite the imaging pixel 22E. Light from the phase difference detection pixel 23 is blocked by the light shielding film 205 and does not enter the imaging pixel 22E. As a result, the amount of light mixed in from adjacent pixels varies greatly between the imaging pixel 21 and the imaging pixel 22E. This may affect the above-described added pixel signals, leading to deterioration of image quality and the like. This problem is addressed by the pixel arrangement of FIG. 8 described above.
  • FIG. 10 is a diagram showing a modification of the pixel array.
  • the plurality of pixels 2 are arranged in a Bayer array with 3 ⁇ 3 pixels 2 corresponding to the same color as a basic pattern (3 ⁇ 3 Bayer array).
  • the phase difference detection pixels 23 may be arranged relatively densely as shown in FIG. 10(A), or relatively sparsely as shown in FIG. 10(B).
  • A phase difference detection pixel 23B corresponding to green G, an imaging pixel 22 corresponding to red R, an imaging pixel 21 corresponding to red R, an imaging pixel 22 corresponding to red R, and a phase difference detection pixel 23A corresponding to green G are arranged adjacently in this order.
  • the phase difference detection pixel 23 is arranged so that the opening op of the light shielding film 205 is positioned on the imaging pixel 22 side.
  • the image pickup pixel 22 is mixed with light from adjacent pixels to the same extent as the image pickup pixel 21 .
  • a comparative example is also used for explanation.
  • FIG. 11 is a diagram showing a comparative example.
  • In the positive direction of the X-axis, a phase difference detection pixel 23A corresponding to green G, an imaging pixel 22E corresponding to red R, an imaging pixel 21 corresponding to red R, an imaging pixel 22E corresponding to red R, and a phase difference detection pixel 23B corresponding to green G are arranged adjacent to each other in this order.
  • the phase difference detection pixel 23 is arranged such that the opening op of the light shielding film 205 is located on the side opposite the imaging pixel 22E. Light from the adjacent pixels, that is, from the phase difference detection pixels 23, is blocked by the light shielding film 205 and does not enter the imaging pixels 22E. The variation in the amount of light mixed in from adjacent pixels therefore increases between the imaging pixel 21 and the imaging pixel 22E. This problem is addressed by the pixel arrangement of FIG. 10 described above.
  • FIG. 12 is a diagram showing a modification of the pixel array.
  • the plurality of pixels 2 are arranged in a Bayer array with 2 ⁇ 1 pixels 2 corresponding to the same color as a basic pattern (2 ⁇ 1 Bayer array).
  • the phase difference detection pixel 23B corresponding to green G, the two imaging pixels 22 corresponding to red R, and the phase difference detection pixel 23A corresponding to green G are arranged adjacent to each other in this order.
  • the phase difference detection pixel 23 is arranged such that the opening op is positioned on the imaging pixel 22 side.
  • the image pickup pixel 22 is mixed with light from adjacent pixels to the same extent as the image pickup pixel 21 .
  • the specific direction in which the imaging pixels 22 and the phase difference detection pixels 23 are arranged is the X-axis direction (row direction).
  • the specific direction may be a direction other than the X-axis direction.
  • the colors (types of the color filters 202) corresponding to the imaging pixels 22 and the phase difference detection pixels 23 may be the same. Description will be made with reference to FIGS. 13 to 16.
  • FIG. 13 is a diagram showing a modification of the pixel array.
  • the imaging pixels 22 and the phase difference detection pixels 23 are arranged in the Y-axis direction (column direction).
  • the phase difference detection pixels 23 arranged in this way are referred to as a phase difference detection pixel 23A2 and a phase difference detection pixel 23B2.
  • Taking the pixels 2 labeled in FIG. 13 as an example, a phase difference detection pixel 23B2 corresponding to green G, two imaging pixels 22 corresponding to blue B, and a phase difference detection pixel 23A2 corresponding to green G are arranged adjacent to each other in this order.
  • FIG. 14 is a diagram schematically showing a cross section along line XIV-XIV in FIG.
  • the light shielding film 205A2 of the phase difference detection pixel 23A2 exposes one side portion of the photoelectric conversion unit 201 in the Y-axis direction (the portion on the Y-axis positive direction side in this example).
  • the light shielding film 205B2 of the phase difference detection pixel 23B2 exposes the other side portion of the photoelectric conversion unit 201 in the Y-axis direction (the portion on the Y-axis negative direction side in this example).
  • FIG. 15 is a diagram showing a modification of the pixel array.
  • the imaging pixels 22 and the phase difference detection pixels 23 are arranged in an intermediate direction (diagonal direction) between the X-axis direction (row direction) and the Y-axis direction (column direction).
  • the phase difference detection pixels 23 arranged in this way are referred to as a phase difference detection pixel 23A3 and a phase difference detection pixel 23B3.
  • A phase difference detection pixel 23B3 corresponding to green G, an imaging pixel 22 corresponding to green G, and a phase difference detection pixel 23A3 corresponding to green G are arranged adjacent to each other in this order.
  • FIG. 16 is a diagram schematically showing a cross section along line XVI-XVI in FIG.
  • the light-shielding film 205A3 of the phase difference detection pixel 23A3 exposes one side portion of the photoelectric conversion unit 201 in the oblique direction (in this example, the portion on the intermediate side between the positive direction of the X-axis and the positive direction of the Y-axis).
  • the light shielding film 205B3 of the phase difference detection pixel 23B3 exposes the other side portion of the photoelectric conversion unit 201 in the oblique direction (in this example, the portion on the intermediate side of the X-axis negative direction and the Y-axis negative direction).
  • In the examples above, the phase difference detection pixels 23 correspond to green G.
  • However, the phase difference detection pixels 23 may be pixels 2 corresponding to colors other than green G. Description will be made with reference to FIG. 17.
  • FIG. 17 is a diagram showing a modification of the pixel array.
  • the phase difference detection pixel 23B corresponding to red R, the two imaging pixels 22 corresponding to green G, and the phase difference detection pixel 23A corresponding to red R are arranged adjacent to each other in this order. Also in this case, it is possible to suppress variations in the amount of light mixed in from adjacent pixels between the imaging pixels 21 and 22 .
  • the phase difference detection pixels 23A and the phase difference detection pixels 23B may be the phase difference detection pixels 23 corresponding to blue B.
  • phase difference detection pixels 23 include the color filters 202 .
  • the phase difference detection pixels 23 may instead be configured without the color filters 202. This avoids the loss of light caused by the color filter 202 and can, for example, improve the sensitivity of the phase difference detection pixels 23.
  • red R, green G, and blue B are exemplified as the colors corresponding to the pixels 2 .
  • the pixels 2 may correspond to colors other than these.
  • FIG. 18 is a diagram showing an example of a schematic configuration of an electronic device.
  • Examples of the electronic device 101 are imaging devices such as digital still cameras and digital video cameras, and mobile phones having such imaging functions.
  • the electronic device 101 includes an optical system 102, a photodetector 103, and a DSP (Digital Signal Processor) 104.
  • The DSP 104, a display device 105, an operation system 106, a memory 108, a recording device 109, and a power supply system 110 are connected via a bus 107. The electronic device 101 can capture still images and moving images, for example.
  • the optical system 102 includes one or more lenses, guides image light (incident light) from a subject to the photodetector 103, and forms an image on the light receiving surface (sensor section) of the photodetector 103. Electrons are accumulated in the photodetector 103 for a certain period of time according to the image formed on the light receiving surface via the optical system 102, and a signal corresponding to the electrons accumulated in the pixels 2 of the photodetector 103 is supplied to the DSP 104.
  • the DSP 104 performs various signal processing on the signal from the photodetector 103 to obtain an image, and temporarily stores the image data in the memory 108 .
  • the image data stored in the memory 108 is recorded in the recording device 109 or supplied to the display device 105 to display the image.
  • the operation system 106 receives various operations by the user and supplies operation signals to each block of the electronic device 101 , and the power supply system 110 supplies power required to drive each block of the electronic device 101 .
  • since the photodetector 103 (corresponding to the photodetector 1 described above) is mounted, it is possible to suppress deterioration of image quality, for example.
  • the photodetector 1 includes a plurality of pixels 2 arranged two-dimensionally.
  • the plurality of pixels 2 includes imaging pixels 21 and 22, each including a photoelectric conversion unit 201, and a phase difference detection pixel 23 including a photoelectric conversion unit 201, a light shielding film 205 that covers a part of the photoelectric conversion unit 201, and an opening op, which is the portion of the photoelectric conversion unit 201 not covered with the light shielding film 205.
  • the imaging pixel 21 is a first imaging pixel that is adjacent to other imaging pixels (the imaging pixel 21 and the imaging pixel 22) but not adjacent to the phase difference detection pixel 23 in a specific direction (for example, the X-axis direction).
  • the imaging pixel 22 is a second imaging pixel adjacent to the phase difference detection pixel 23 in the specific direction such that the opening op of the phase difference detection pixel 23 is adjacent to the imaging pixel 22.
  • the imaging pixel 22 is adjacent to the phase difference detecting pixel 23 in a specific direction such that the opening op of the phase difference detecting pixel 23 is located on the imaging pixel 22 side. That is, the phase difference detection pixel 23 is arranged adjacent to the imaging pixel 22 so that the opening op is located on the imaging pixel 22 side.
  • in the imaging pixel 22, as in the imaging pixel 21, light from adjacent pixels is mixed in. Therefore, it is possible to suppress variations in the amount of light mixed in from adjacent pixels between the imaging pixels 21 and 22 (between the imaging pixels). For example, when the photodetector 1 is a solid-state imaging device, image quality deterioration can be suppressed.
  • the phase difference detection pixel 23, one or more imaging pixels 22 (second imaging pixels), and the phase difference detection pixel 23 may be arranged adjacently in this order. For example, as described above with reference to the drawings, the opening op of the phase difference detection pixel 23, one imaging pixel 22, and the opening op of the phase difference detection pixel 23 may be adjacent in this order.
  • the phase difference detection pixel 23, one or more imaging pixels 22 (second imaging pixels), one or more imaging pixels 21 (first imaging pixels), one or more imaging pixels 22 (second imaging pixels), and the phase difference detection pixel 23 may be arranged adjacent to each other in this order. By arranging the imaging pixels 22 between the phase difference detection pixels 23 and the imaging pixels 21 in this manner, light from adjacent pixels is mixed into the imaging pixels 22 in the same manner as into the imaging pixels 21.
  • the phase difference detection pixels 23 are a pair of phase difference detection pixels 23. The opening opA of one phase difference detection pixel 23A of the pair exposes one side portion of the photoelectric conversion unit 201 in the specific direction, and the opening opB of the other phase difference detection pixel 23B of the pair exposes the other side portion of the photoelectric conversion unit 201 in the specific direction (for example, the portion on the X-axis positive direction side).
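  • the arrangement described above can be expressed as a simple model. The following sketch is illustrative only: the row layout, the aperture orientations of the pair, and the stray-light counting are assumptions for explanation, not taken from the publication.

```python
# Illustrative model (not from the publication): a one-dimensional row of
# pixels in which each phase difference detection pixel's aperture ("op")
# faces either left or right. An imaging pixel receives stray light from a
# neighbor only through that neighbor's unshielded side, so pointing each
# aperture toward the adjacent imaging pixel 22 gives every imaging pixel
# the same number of open neighbors as the imaging pixels 21 have.

ROW = ["23A", "22", "21", "21", "22", "23B"]  # hypothetical arrangement
# assumption: 23A's aperture is on its right side, 23B's on its left side,
# so each aperture faces the neighboring imaging pixel 22.

def open_sides(kind):
    """Return (left_open, right_open): which sides of a pixel are unshielded."""
    if kind in ("21", "22"):   # imaging pixels have no light shielding film
        return (True, True)
    if kind == "23A":
        return (False, True)
    return (True, False)       # "23B"

def stray_light_sources(row, i):
    """Count the neighbors of pixel i whose side facing pixel i is open."""
    count = 0
    if i > 0 and open_sides(row[i - 1])[1]:
        count += 1
    if i < len(row) - 1 and open_sides(row[i + 1])[0]:
        count += 1
    return count

counts = {i: stray_light_sources(ROW, i)
          for i, kind in enumerate(ROW) if kind in ("21", "22")}
```

In this toy model every imaging pixel, whether a first imaging pixel 21 or a second imaging pixel 22, ends up with the same count of open neighbors, which corresponds to suppressing variation in the amount of mixed-in light between imaging pixels.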
  • an image plane phase difference AF function can be obtained.
  • the imaging pixels 21 and 22 may include the color filters 202 provided for the photoelectric conversion units 201 .
  • the phase difference detection pixels 23 may also include color filters 202 provided for the photoelectric conversion units 201 .
  • the color filter 202 of the phase difference detection pixel 23 may pass light of a color different from the color of light passed by the color filter 202 of the imaging pixel 22 (second imaging pixel). With such a color filter 202, color mixture occurs in the imaging pixel 22 in the same manner as in the imaging pixel 21, so it is possible to suppress variations in the degree of color mixture between the imaging pixels 21 and the imaging pixels 22. For example, deterioration of image quality can be suppressed.
  • the specific direction includes at least one of the row direction (X-axis direction) of the plurality of pixels 2, the column direction (Y-axis direction) of the plurality of pixels 2, and the intermediate direction (oblique direction) between the row direction and the column direction of the plurality of pixels 2.
  • the imaging pixels 22 and the phase difference detection pixels 23 can be adjacent in such various directions.
  • the electronic device 101 described with reference to FIG. 18 and the like is also one of the disclosed technologies.
  • a photodetector 103 is mounted on the electronic device 101 .
  • the photodetector 103 has a configuration similar to that of the photodetector 1 described above. With such an electronic device 101 as well, as described above, it is possible to suppress variation in the amount of light mixed in from adjacent pixels between the imaging pixels 21 and 22 . For example, deterioration of image quality can be suppressed.
  • the technology according to the present disclosure can be applied to various products.
  • the technology according to the present disclosure may be realized as a device mounted on any type of moving body such as automobiles, electric vehicles, hybrid electric vehicles, motorcycles, bicycles, personal mobility vehicles, airplanes, drones, ships, and robots.
  • FIG. 19 is a block diagram showing a schematic configuration example of a vehicle control system, which is an example of a mobile control system to which the technology according to the present disclosure can be applied.
  • a vehicle control system 12000 includes a plurality of electronic control units connected via a communication network 12001.
  • the vehicle control system 12000 includes a drive system control unit 12010, a body system control unit 12020, an exterior information detection unit 12030, an interior information detection unit 12040, and an integrated control unit 12050.
  • as the functional configuration of the integrated control unit 12050, a microcomputer 12051, an audio/image output unit 12052, and an in-vehicle network I/F (Interface) 12053 are illustrated.
  • the drive system control unit 12010 controls the operation of devices related to the drive system of the vehicle according to various programs.
  • for example, the drive system control unit 12010 functions as a control device for a driving force generator for generating the driving force of the vehicle, such as an internal combustion engine or a driving motor, a driving force transmission mechanism for transmitting the driving force to the wheels, a steering mechanism for adjusting the steering angle of the vehicle, and a braking device for generating the braking force of the vehicle.
  • the body system control unit 12020 controls the operation of various devices equipped on the vehicle body according to various programs.
  • the body system control unit 12020 functions as a keyless entry system, a smart key system, a power window device, or a control device for various lamps such as headlamps, back lamps, brake lamps, winkers or fog lamps.
  • body system control unit 12020 can receive radio waves transmitted from a portable device that substitutes for a key or signals from various switches.
  • the body system control unit 12020 receives the input of these radio waves or signals and controls the door lock device, power window device, lamps, etc. of the vehicle.
  • the vehicle exterior information detection unit 12030 detects information outside the vehicle in which the vehicle control system 12000 is installed.
  • the vehicle exterior information detection unit 12030 is connected with an imaging section 12031 .
  • the vehicle exterior information detection unit 12030 causes the imaging unit 12031 to capture an image of the exterior of the vehicle, and receives the captured image.
  • the vehicle exterior information detection unit 12030 may perform detection processing for objects such as people, vehicles, obstacles, signs, or characters on the road surface, or distance detection processing, based on the received image.
  • the imaging unit 12031 is an optical sensor that receives light and outputs an electrical signal according to the amount of received light.
  • the imaging unit 12031 can output the electric signal as an image, and can also output it as distance measurement information.
  • the light received by the imaging unit 12031 may be visible light or non-visible light such as infrared rays.
  • the in-vehicle information detection unit 12040 detects in-vehicle information.
  • the in-vehicle information detection unit 12040 is connected to, for example, a driver state detection section 12041 that detects the state of the driver.
  • the driver state detection section 12041 includes, for example, a camera that captures an image of the driver, and the in-vehicle information detection unit 12040 may calculate the degree of fatigue or concentration of the driver, or may determine whether the driver is dozing off, based on the detection information input from the driver state detection section 12041.
  • the microcomputer 12051 can calculate control target values for the driving force generator, the steering mechanism, or the braking device based on the information inside and outside the vehicle acquired by the vehicle exterior information detection unit 12030 or the vehicle interior information detection unit 12040, and can output a control command to the drive system control unit 12010.
  • for example, the microcomputer 12051 can perform cooperative control for the purpose of realizing ADAS (Advanced Driver Assistance System) functions including collision avoidance or shock mitigation, follow-up driving based on inter-vehicle distance, vehicle speed maintenance driving, vehicle collision warning, or vehicle lane deviation warning.
  • the microcomputer 12051 controls the driving force generator, the steering mechanism, the braking device, etc. based on the information about the vehicle surroundings acquired by the vehicle exterior information detection unit 12030 or the vehicle interior information detection unit 12040, so that the driver's Cooperative control can be performed for the purpose of autonomous driving, etc., in which vehicles autonomously travel without depending on operation.
  • the microcomputer 12051 can output a control command to the body system control unit 12020 based on the information outside the vehicle acquired by the vehicle exterior information detection unit 12030.
  • for example, the microcomputer 12051 can perform cooperative control aimed at anti-glare, such as controlling the headlamps according to the position of a preceding vehicle or an oncoming vehicle detected by the vehicle exterior information detection unit 12030 and switching from high beam to low beam.
  • the audio/image output unit 12052 transmits an output signal of at least one of audio and image to an output device capable of visually or audibly notifying the passengers of the vehicle or the outside of the vehicle of information.
  • an audio speaker 12061, a display unit 12062, and an instrument panel 12063 are illustrated as output devices.
  • the display unit 12062 may include at least one of an on-board display and a head-up display, for example.
  • FIG. 20 is a diagram showing an example of the installation position of the imaging unit 12031.
  • the imaging unit 12031 has imaging units 12101, 12102, 12103, 12104, and 12105.
  • the imaging units 12101, 12102, 12103, 12104, and 12105 are provided at positions such as the front nose, side mirrors, rear bumper, back door, and windshield of the vehicle 12100, for example.
  • An image pickup unit 12101 provided in the front nose and an image pickup unit 12105 provided above the windshield in the passenger compartment mainly acquire images in front of the vehicle 12100 .
  • Imaging units 12102 and 12103 provided in the side mirrors mainly acquire side images of the vehicle 12100 .
  • An imaging unit 12104 provided in the rear bumper or back door mainly acquires an image behind the vehicle 12100 .
  • the imaging unit 12105 provided above the windshield in the passenger compartment is mainly used for detecting preceding vehicles, pedestrians, obstacles, traffic lights, traffic signs, lanes, and the like.
  • FIG. 20 shows an example of the imaging range of the imaging units 12101 to 12104.
  • the imaging range 12111 indicates the imaging range of the imaging unit 12101 provided in the front nose, the imaging ranges 12112 and 12113 indicate the imaging ranges of the imaging units 12102 and 12103 provided in the side mirrors, respectively, and the imaging range 12114 indicates the imaging range of the imaging unit 12104 provided in the rear bumper or back door. For example, by superimposing the image data captured by the imaging units 12101 to 12104, a bird's-eye view image of the vehicle 12100 viewed from above can be obtained.
  • At least one of the imaging units 12101 to 12104 may have a function of acquiring distance information.
  • at least one of the imaging units 12101 to 12104 may be a stereo camera composed of a plurality of imaging elements, or may be an imaging element having pixels for phase difference detection.
  • for example, based on the distance information obtained from the imaging units 12101 to 12104, the microcomputer 12051 determines the distance to each three-dimensional object within the imaging ranges 12111 to 12114 and the change in this distance over time (relative velocity with respect to the vehicle 12100), and can thereby extract, as a preceding vehicle, the closest three-dimensional object on the traveling path of the vehicle 12100 that is traveling at a predetermined speed (for example, 0 km/h or more) in substantially the same direction as the vehicle 12100. Furthermore, the microcomputer 12051 can set an inter-vehicle distance to be secured in advance in front of the preceding vehicle, and can perform automatic brake control (including following stop control) and automatic acceleration control (including following start control). In this way, cooperative control can be performed for the purpose of automatic driving in which the vehicle travels autonomously without relying on the operation of the driver.
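  • the preceding-vehicle extraction described above can be expressed roughly as follows. The data model (per-object distance, relative velocity, on-path flag) and the helper names are assumptions for illustration, not the patent's implementation.

```python
# Rough sketch (assumed data model): from distance samples the microcomputer
# estimates each object's relative velocity, keeps objects on the own travel
# path moving at or above a minimum speed in substantially the same
# direction, and picks the closest one as the preceding vehicle.

def relative_velocity(dist_before_m, dist_after_m, dt_s):
    """Change of distance per second; positive means the object pulls away."""
    return (dist_after_m - dist_before_m) / dt_s

def pick_preceding_vehicle(objects, own_speed_kmh, min_speed_kmh=0.0):
    """objects: dicts with 'distance_m', 'rel_vel_ms', 'on_path' (assumed keys)."""
    candidates = []
    for obj in objects:
        if not obj["on_path"]:          # only objects on the traveling path
            continue
        # absolute speed of the object, derived from the own speed and the
        # relative velocity (m/s converted to km/h)
        speed_kmh = own_speed_kmh + obj["rel_vel_ms"] * 3.6
        if speed_kmh >= min_speed_kmh:  # "predetermined speed (e.g. 0 km/h or more)"
            candidates.append(obj)
    return min(candidates, key=lambda o: o["distance_m"], default=None)
```

For example, among two on-path objects at 40 m and 25 m, both moving with the traffic, the object at 25 m would be selected as the preceding vehicle; an empty scene yields no selection.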
  • for example, based on the distance information obtained from the imaging units 12101 to 12104, the microcomputer 12051 can classify three-dimensional object data related to three-dimensional objects into motorcycles, ordinary vehicles, large vehicles, pedestrians, utility poles, and other three-dimensional objects, extract them, and use them for automatic avoidance of obstacles. For example, the microcomputer 12051 distinguishes obstacles around the vehicle 12100 into obstacles that are visible to the driver of the vehicle 12100 and obstacles that are difficult to see. Then, the microcomputer 12051 determines the collision risk indicating the degree of danger of collision with each obstacle, and in a situation where the collision risk is equal to or higher than a set value and there is a possibility of collision, driving support for collision avoidance can be performed by outputting an alarm to the driver via the audio speaker 12061 or the display unit 12062, or by performing forced deceleration or avoidance steering via the drive system control unit 12010.
  • At least one of the imaging units 12101 to 12104 may be an infrared camera that detects infrared rays.
  • the microcomputer 12051 can recognize a pedestrian by determining whether or not the pedestrian exists in the captured images of the imaging units 12101 to 12104 .
  • recognition of a pedestrian is performed by, for example, a procedure for extracting feature points in the images captured by the imaging units 12101 to 12104 as infrared cameras, and a procedure for performing pattern matching processing on a series of feature points indicating the outline of an object to determine whether or not the object is a pedestrian.
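  • the second step of this procedure, pattern matching on a series of contour feature points, might be sketched as follows. The score function, the template, and the threshold value are hypothetical simplifications, not the actual matching used by the system.

```python
# Hypothetical sketch of contour pattern matching: a series of feature
# points forming an object's outline is compared against a pedestrian
# template by a simple nearest-point score.

def match_score(contour, template):
    """Mean distance from each template point to its nearest contour point."""
    total = 0.0
    for tx, ty in template:
        total += min(((tx - cx) ** 2 + (ty - cy) ** 2) ** 0.5
                     for cx, cy in contour)
    return total / len(template)

def is_pedestrian(contour, template, threshold=2.0):
    """Accept the contour as a pedestrian when the match score is small."""
    return match_score(contour, template) <= threshold
```

A contour identical to the template scores zero and is accepted; a contour far from every template point exceeds the threshold and is rejected.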
  • when a pedestrian is recognized, the audio/image output unit 12052 controls the display unit 12062 so that a rectangular contour line for emphasis is superimposed on the recognized pedestrian. The audio/image output unit 12052 may also control the display unit 12062 to display an icon or the like indicating the pedestrian at a desired position.
  • the technology according to the present disclosure can be applied to, for example, the imaging unit 12031 among the configurations described above.
  • the photodetector 1 can be applied to the imaging unit 12031 .
  • FIG. 21 is a diagram showing an example of a schematic configuration of an endoscopic surgery system to which the technology according to the present disclosure (this technology) can be applied.
  • FIG. 21 illustrates a situation in which an operator (doctor) 11131 is performing surgery on a patient 11132 on a patient bed 11133 using an endoscopic surgery system 11000 .
  • the endoscopic surgery system 11000 includes an endoscope 11100, other surgical instruments 11110 such as a pneumoperitoneum tube 11111 and an energy treatment instrument 11112, a support arm device 11120 that supports the endoscope 11100, and a cart 11200 on which various devices for endoscopic surgery are mounted.
  • An endoscope 11100 is composed of a lens barrel 11101 whose distal end is inserted into the body cavity of a patient 11132 and a camera head 11102 connected to the proximal end of the lens barrel 11101 .
  • in the illustrated example, the endoscope 11100 is configured as a so-called rigid scope having a rigid lens barrel 11101, but the endoscope 11100 may be configured as a so-called flexible scope having a flexible lens barrel.
  • the tip of the lens barrel 11101 is provided with an opening into which the objective lens is fitted.
  • a light source device 11203 is connected to the endoscope 11100; light generated by the light source device 11203 is guided to the tip of the lens barrel 11101 by a light guide extending inside the lens barrel 11101, and is irradiated through the objective lens toward the observation target inside the body cavity of the patient 11132.
  • the endoscope 11100 may be a straight scope, a perspective scope, or a side scope.
  • An optical system and an imaging element are provided inside the camera head 11102, and the reflected light (observation light) from the observation target is focused on the imaging element by the optical system.
  • the imaging device photoelectrically converts the observation light to generate an electrical signal corresponding to the observation light, that is, an image signal corresponding to the observation image.
  • the image signal is transmitted to a camera control unit (CCU: Camera Control Unit) 11201 as RAW data.
  • the CCU 11201 is composed of a CPU (Central Processing Unit), a GPU (Graphics Processing Unit), etc., and controls the operations of the endoscope 11100 and the display device 11202 in an integrated manner. Further, the CCU 11201 receives an image signal from the camera head 11102 and performs various image processing such as development processing (demosaicing) for displaying an image based on the image signal.
  • the display device 11202 displays an image based on an image signal subjected to image processing by the CCU 11201 under the control of the CCU 11201 .
  • the light source device 11203 is composed of a light source such as an LED (light emitting diode), for example, and supplies the endoscope 11100 with irradiation light for imaging a surgical site or the like.
  • the input device 11204 is an input interface for the endoscopic surgery system 11000.
  • the user can input various information and instructions to the endoscopic surgery system 11000 via the input device 11204 .
  • the user inputs an instruction or the like to change imaging conditions (type of irradiation light, magnification, focal length, etc.) by the endoscope 11100 .
  • the treatment instrument control device 11205 controls driving of the energy treatment instrument 11112 for tissue cauterization, incision, blood vessel sealing, or the like.
  • in order to secure the visual field of the endoscope 11100 and the working space of the operator, the pneumoperitoneum device 11206 sends gas into the body cavity of the patient 11132 through the pneumoperitoneum tube 11111 to inflate the body cavity.
  • the recorder 11207 is a device capable of recording various types of information regarding surgery.
  • the printer 11208 is a device capable of printing various types of information regarding surgery in various formats such as text, images, and graphs.
  • the light source device 11203 that supplies the endoscope 11100 with irradiation light for photographing the surgical site can be composed of, for example, a white light source composed of an LED, a laser light source, or a combination thereof.
  • in the case where a white light source is configured by a combination of RGB laser light sources, the output intensity and output timing of each color (each wavelength) can be controlled with high accuracy, so the white balance of the captured image can be adjusted in the light source device 11203. In this case, by irradiating the observation target with laser light from each of the RGB laser light sources in a time-division manner and controlling the drive of the imaging element of the camera head 11102 in synchronization with the irradiation timing, images corresponding to each of RGB can be captured in a time-division manner. According to this method, a color image can be obtained without providing a color filter in the imaging element.
  • the driving of the light source device 11203 may be controlled so as to change the intensity of the output light every predetermined time.
  • by controlling the drive of the imaging element of the camera head 11102 in synchronization with the timing of the change in the light intensity to acquire images in a time-division manner and synthesizing the images, an image with a high dynamic range can be generated.
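  • the time-division synthesis described above can be sketched as follows, under the assumptions of 8-bit samples and known relative light intensities for each frame; the merging rule is an illustrative simplification, not the device's actual processing.

```python
# Illustrative HDR synthesis: each output pixel takes the largest
# non-saturated sample after normalizing by its frame's relative light
# intensity, so bright frames fill in shadows while dim frames preserve
# highlights that would otherwise be clipped.

def merge_hdr(frames, intensities, saturation=255):
    """frames: one list of pixel values per frame, all the same length."""
    merged = []
    for samples in zip(*frames):
        best = None
        for value, gain in zip(samples, intensities):
            if value < saturation:        # ignore blown-out samples
                radiance = value / gain   # normalize to a common scale
                if best is None or radiance > best:
                    best = radiance
        if best is None:                  # every frame saturated here
            best = saturation / min(intensities)
        merged.append(best)
    return merged
```

For example, a pixel that is clipped at 255 in the bright frame but reads 254 in the half-intensity frame is recovered as 127 on the normalized scale.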
  • the light source device 11203 may be configured to be able to supply light in a predetermined wavelength band corresponding to special light observation.
  • in special light observation, for example, so-called narrow band imaging is performed, in which, by utilizing the wavelength dependence of light absorption in body tissue, light in a band narrower than the irradiation light during normal observation (that is, white light) is irradiated, and a predetermined tissue such as a blood vessel in the mucosal surface layer is imaged with high contrast.
  • fluorescence observation may be performed in which an image is obtained from fluorescence generated by irradiation with excitation light.
  • in fluorescence observation, the body tissue can be irradiated with excitation light and the fluorescence from the body tissue can be observed (autofluorescence observation), or a reagent such as indocyanine green (ICG) can be locally injected into the body tissue and the body tissue can be irradiated with excitation light corresponding to the fluorescence wavelength of the reagent to obtain a fluorescence image.
  • the light source device 11203 can be configured to supply narrowband light and/or excitation light corresponding to such special light observation.
  • FIG. 22 is a block diagram showing an example of functional configurations of the camera head 11102 and CCU 11201 shown in FIG.
  • the camera head 11102 has a lens unit 11401, an imaging section 11402, a drive section 11403, a communication section 11404, and a camera head control section 11405.
  • the CCU 11201 has a communication section 11411 , an image processing section 11412 and a control section 11413 .
  • the camera head 11102 and the CCU 11201 are communicably connected to each other via a transmission cable 11400 .
  • a lens unit 11401 is an optical system provided at a connection with the lens barrel 11101 . Observation light captured from the tip of the lens barrel 11101 is guided to the camera head 11102 and enters the lens unit 11401 .
  • a lens unit 11401 is configured by combining a plurality of lenses including a zoom lens and a focus lens.
  • the number of imaging elements constituting the imaging unit 11402 may be one (so-called single-plate type) or plural (so-called multi-plate type).
  • image signals corresponding to RGB may be generated by each image pickup element, and a color image may be obtained by synthesizing the image signals.
  • the imaging unit 11402 may be configured to have a pair of imaging elements for respectively acquiring right-eye and left-eye image signals corresponding to 3D (dimensional) display.
  • the 3D display enables the operator 11131 to more accurately grasp the depth of the living tissue in the surgical site.
  • a plurality of systems of lens units 11401 may be provided corresponding to each imaging element.
  • the imaging unit 11402 does not necessarily have to be provided in the camera head 11102 .
  • the imaging unit 11402 may be provided inside the lens barrel 11101 immediately after the objective lens.
  • the drive unit 11403 is configured by an actuator, and moves the zoom lens and focus lens of the lens unit 11401 by a predetermined distance along the optical axis under control from the camera head control unit 11405 . Thereby, the magnification and focus of the image captured by the imaging unit 11402 can be appropriately adjusted.
  • the communication unit 11404 is composed of a communication device for transmitting and receiving various information to and from the CCU 11201.
  • the communication unit 11404 transmits the image signal obtained from the imaging unit 11402 as RAW data to the CCU 11201 via the transmission cable 11400 .
  • the communication unit 11404 receives a control signal for controlling driving of the camera head 11102 from the CCU 11201 and supplies it to the camera head control unit 11405 .
  • the control signal includes information about imaging conditions, such as information specifying the frame rate of the captured image, information specifying the exposure value at the time of imaging, and/or information specifying the magnification and focus of the captured image.
  • the imaging conditions such as the frame rate, exposure value, magnification, and focus may be appropriately designated by the user, or may be automatically set by the control unit 11413 of the CCU 11201 based on the acquired image signal.
  • the endoscope 11100 is equipped with so-called AE (Auto Exposure) function, AF (Auto Focus) function and AWB (Auto White Balance) function.
  • the camera head control unit 11405 controls driving of the camera head 11102 based on the control signal from the CCU 11201 received via the communication unit 11404.
  • the communication unit 11411 is composed of a communication device for transmitting and receiving various information to and from the camera head 11102 .
  • the communication unit 11411 receives image signals transmitted from the camera head 11102 via the transmission cable 11400 .
  • the communication unit 11411 transmits a control signal for controlling driving of the camera head 11102 to the camera head 11102 .
  • Image signals and control signals can be transmitted by electrical communication, optical communication, or the like.
  • the image processing unit 11412 performs various types of image processing on the image signal, which is RAW data transmitted from the camera head 11102 .
  • the control unit 11413 performs various controls related to imaging of the surgical site and the like by the endoscope 11100 and display of the captured image obtained by imaging the surgical site and the like. For example, the control unit 11413 generates control signals for controlling driving of the camera head 11102 .
  • control unit 11413 causes the display device 11202 to display a captured image showing the surgical site and the like based on the image signal that has undergone image processing by the image processing unit 11412 .
  • the control unit 11413 may recognize various objects in the captured image using various image recognition techniques. For example, by detecting the shape, color, and the like of the edges of objects included in the captured image, the control unit 11413 can recognize surgical instruments such as forceps, specific body parts, bleeding, mist during use of the energy treatment instrument 11112, and the like.
  • the control unit 11413 may use the recognition result to display various types of surgical assistance information superimposed on the image of the surgical site. By superimposing and presenting the surgery support information to the operator 11131, the burden on the operator 11131 can be reduced and the operator 11131 can proceed with the surgery reliably.
  • a transmission cable 11400 connecting the camera head 11102 and the CCU 11201 is an electrical signal cable compatible with electrical signal communication, an optical fiber compatible with optical communication, or a composite cable of these.
  • wired communication is performed using the transmission cable 11400, but communication between the camera head 11102 and the CCU 11201 may be performed wirelessly.
  • the technology according to the present disclosure can be applied to, for example, the imaging unit 11402 of the camera head 11102 among the configurations described above.
  • the photodetector 1 can be applied to the imaging unit 11402.
  • the technology according to the present disclosure may also be applied to, for example, a microsurgery system.
  • the present technology can also take the following configuration.
  • (1) A photodetector comprising a plurality of pixels arranged two-dimensionally, wherein the plurality of pixels includes an imaging pixel including a photoelectric conversion unit, and a phase difference detection pixel including a photoelectric conversion unit, a light shielding film that partially covers the photoelectric conversion unit, and an opening that is a portion of the photoelectric conversion unit that is not covered with the light shielding film; and the imaging pixels include a first imaging pixel adjacent to another imaging pixel in a specific direction but not adjacent to the phase difference detection pixel, and a second imaging pixel adjacent to the phase difference detection pixel such that the opening of the phase difference detection pixel is adjacent in the specific direction.
  • (2) The photodetector according to (1), wherein the phase difference detection pixel, one or more of the second imaging pixels, and the phase difference detection pixel are arranged adjacently in this order.
  • (3) The photodetector according to (2), wherein the opening of the phase difference detection pixel, one of the second imaging pixels, and the opening of the phase difference detection pixel are adjacent in this order.
  • (4) The photodetector according to (1), wherein the phase difference detection pixel, one or more of the second imaging pixels, one or more of the first imaging pixels, one or more of the second imaging pixels, and the phase difference detection pixel are arranged adjacently in this order.
  • (5) The photodetector according to any one of (1) to (4), wherein the phase difference detection pixels form a pair of phase difference detection pixels, the opening of one phase difference detection pixel of the pair exposes one side portion of the photoelectric conversion unit in the specific direction, and the opening of the other phase difference detection pixel of the pair exposes the other side portion of the photoelectric conversion unit in the specific direction.
  • (6) The photodetector according to any one of (1) to (5), wherein the imaging pixel includes a color filter provided for the photoelectric conversion unit.
  • (7) The photodetector according to (6), wherein the phase difference detection pixel includes a color filter provided for the photoelectric conversion unit.
  • (8) The photodetector according to (7), wherein the color filter of the phase difference detection pixel passes light of a color different from the color of light passed by the color filter of the second imaging pixel.
  • (9) The photodetector according to any one of (1) to (8), wherein the specific direction includes at least one of a row direction of the plurality of pixels, a column direction of the plurality of pixels, and a direction intermediate between the row direction and the column direction of the plurality of pixels.
  • (10) An electronic device equipped with a photodetector, the photodetector comprising a plurality of pixels arranged two-dimensionally, wherein the plurality of pixels include: an imaging pixel including a photoelectric conversion unit; and a phase difference detection pixel including a photoelectric conversion unit and a light shielding film covering a part of the photoelectric conversion unit, and wherein the imaging pixels include: a plurality of first imaging pixels adjacent to other imaging pixels in a specific direction but not adjacent to the phase difference detection pixels; and a second imaging pixel that is adjacent to the phase difference detection pixel in the specific direction and adjacent to an opening that is a portion of the phase difference detection pixel in which the photoelectric conversion unit is not covered with the light shielding film.
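The pixel classification defined in configurations (1)–(4) can be sketched in code. This is a hypothetical illustration, not part of the patent: the type codes (`'L'`, `'R'`, `'I'`), the example row, and the `classify` helper are all invented here, and only a single row direction is modeled.

```python
# Hypothetical sketch (not from the patent): classifying the imaging pixels in
# one pixel row into "first" and "second" imaging pixels, following the
# definitions in configurations (1)-(4). Type codes and the row layout are
# invented for illustration.
#
# 'L' / 'R' = phase difference detection pixels whose opening exposes the
# left / right side portion of the photoelectric conversion unit;
# 'I' = imaging pixel.
row = ['I', 'I', 'R', 'I', 'I', 'I', 'L', 'I', 'I']

def classify(row):
    """Label each pixel: 'pd' for phase difference detection pixels,
    'second' for imaging pixels facing a phase-detection opening,
    'first' for imaging pixels adjacent only to imaging pixels,
    'other' for imaging pixels adjacent to the shielded side."""
    labels = []
    for i, p in enumerate(row):
        if p != 'I':
            labels.append('pd')
            continue
        left = row[i - 1] if i > 0 else None
        right = row[i + 1] if i + 1 < len(row) else None
        # The opening of an 'R' pixel faces its right-hand neighbor,
        # and the opening of an 'L' pixel faces its left-hand neighbor.
        if left == 'R' or right == 'L':
            labels.append('second')
        elif left in ('L', 'R') or right in ('L', 'R'):
            labels.append('other')
        else:
            labels.append('first')
    return labels

print(classify(row))
```

With the row `R I I I L`, the labels come out as `pd, second, first, second, pd`, matching the arrangement order recited in configuration (4): phase difference detection pixel, second imaging pixel(s), first imaging pixel(s), second imaging pixel(s), phase difference detection pixel.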

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Solid State Image Pick-Up Elements (AREA)
  • Transforming Light Signals Into Electric Signals (AREA)

Abstract

A light detection device (1) is provided with a plurality of pixels (2) arranged two-dimensionally. The plurality of pixels (2) include: imaging pixels (21, 22) including a photoelectric conversion unit (201); and a phase difference detection pixel (23) including a photoelectric conversion unit (201), a light shielding film (205) covering a part of the photoelectric conversion unit (201), and an opening (op) that is a portion where the photoelectric conversion unit (201) is not covered with the light shielding film (205). The imaging pixels (21, 22) include: a first imaging pixel (21) that is adjacent to other imaging pixels (21, 22) in a specific direction but is not adjacent to the phase difference detection pixel (23); and a second imaging pixel (22) that is adjacent to the phase difference detection pixel (23) such that the opening (op) of the phase difference detection pixel (23) is adjacent in the specific direction.
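The paired left/right-opening phase difference detection pixels described above produce two signals whose relative shift grows with defocus. The following is a hypothetical sketch, not part of the patent: the signal values, the `phase_shift` helper, and the minimum sum-of-absolute-differences matching are all invented here to illustrate how such a pair of signals could be compared.

```python
# Hypothetical sketch (not from the patent): estimating the image shift between
# the signals of paired phase difference detection pixels. The matching method
# and signal values are invented for illustration.

def phase_shift(left, right, max_shift=3):
    """Return the shift (in pixels, within +/- max_shift) that best aligns
    the two 1-D signals, using a minimum sum-of-absolute-differences search."""
    best_shift, best_cost = 0, float('inf')
    for s in range(-max_shift, max_shift + 1):
        # Compare the overlapping portions of the two signals at shift s.
        if s >= 0:
            a, b = left[s:], right[:len(right) - s]
        else:
            a, b = left[:s], right[-s:]
        cost = sum(abs(x - y) for x, y in zip(a, b))
        if cost < best_cost:
            best_cost, best_shift = cost, s
    return best_shift

# A right-opening signal that is the left-opening signal displaced by two
# pixels; the sign of the result follows this function's shift convention.
left = [0, 0, 10, 50, 10, 0, 0, 0]
right = [0, 0, 0, 0, 10, 50, 10, 0]
print(phase_shift(left, right))
```

An in-focus scene would give identical left and right signals and a shift of zero; a larger magnitude of shift would indicate a larger defocus, which is the quantity an autofocus system derives from such pixel pairs.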
PCT/JP2022/039816 2021-11-25 2022-10-26 Light detection device, and electronic apparatus WO2023095518A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2021-191032 2021-11-25
JP2021191032A JP2023077673A (ja) Photodetection device and electronic apparatus

Publications (1)

Publication Number Publication Date
WO2023095518A1 (fr) 2023-06-01

Family

ID=86539358

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/039816 WO2023095518A1 (fr) 2021-11-25 2022-10-26 Light detection device, and electronic apparatus

Country Status (2)

Country Link
JP (1) JP2023077673A (fr)
WO (1) WO2023095518A1 (fr)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2014046037A1 (fr) * 2012-09-19 2014-03-27 Fujifilm Corporation Imaging device and method for controlling same
WO2018003501A1 (fr) * 2016-06-28 2018-01-04 Sony Corporation Solid-state imaging device, electronic apparatus, lens control method, and vehicle


Also Published As

Publication number Publication date
JP2023077673A (ja) 2023-06-06

Similar Documents

Publication Publication Date Title
US11847759B2 (en) Image processing device, image processing method, and image processing program
JP2018200980A (ja) Imaging device, solid-state imaging element, and electronic apparatus
US11889206B2 (en) Solid-state imaging device and electronic equipment
WO2019220696A1 (fr) Imaging element and imaging device
TW202133413A (zh) Solid-state imaging device and electronic apparatus
WO2019207978A1 (fr) Image capturing element and method for manufacturing image capturing element
WO2018173793A1 (fr) Solid-state image capturing element and electronic device
US20240321917A1 (en) Imaging device
WO2021171796A1 (fr) Solid-state imaging device and electronic apparatus
WO2023095518A1 (fr) Light detection device, and electronic apparatus
WO2021100338A1 (fr) Solid-state image capturing element
WO2023162496A1 (fr) Imaging device
WO2023080011A1 (fr) Imaging device and electronic apparatus
WO2023013156A1 (fr) Imaging element and electronic device
US20240347557A1 (en) Imaging device
WO2022158170A1 (fr) Photodetector and electronic device
WO2023195316A1 (fr) Light detection device
WO2021090663A1 (fr) Image capturing device and method for correcting pixel signal in image capturing device
WO2021186911A1 (fr) Imaging device and electronic apparatus
WO2023195315A1 (fr) Light detection device
WO2024029408A1 (fr) Imaging device
EP4415047A1 (fr) Imaging device
WO2023132137A1 (fr) Imaging element and electronic apparatus
WO2024095832A1 (fr) Photodetector, electronic apparatus, and optical element
KR20240152856A (ko) Imaging device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
    Ref document number: 22898302
    Country of ref document: EP
    Kind code of ref document: A1
NENP Non-entry into the national phase
    Ref country code: DE