WO2022190867A1 - Imaging device and ranging system - Google Patents

Imaging device and ranging system

Info

Publication number
WO2022190867A1
WO2022190867A1 (PCT/JP2022/007398)
Authority
WO
WIPO (PCT)
Prior art keywords
light
pixel array
pixel
semiconductor substrate
imaging device
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/JP2022/007398
Other languages
English (en)
French (fr)
Japanese (ja)
Inventor
秀晃 富樫
香織 瀧本
雅博 瀬上
慶 中川
信宏 河合
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sony Semiconductor Solutions Corp
Sony Group Corp
Original Assignee
Sony Semiconductor Solutions Corp
Sony Group Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Semiconductor Solutions Corp, Sony Group Corp filed Critical Sony Semiconductor Solutions Corp
Priority to EP22766828.2A priority Critical patent/EP4307378A4/en
Priority to DE112022001551.9T priority patent/DE112022001551T5/de
Priority to US18/546,684 priority patent/US12237348B2/en
Priority to JP2023505277A priority patent/JPWO2022190867A1/ja
Priority to CN202280019409.0A priority patent/CN116941040A/zh
Priority to KR1020237029303A priority patent/KR20230156694A/ko
Publication of WO2022190867A1 publication Critical patent/WO2022190867A1/ja
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current


Classifications

    • H — ELECTRICITY
      • H04N — PICTORIAL COMMUNICATION, e.g. TELEVISION
        • H04N25/00 — Circuitry of solid-state image sensors [SSIS]; Control thereof
          • H04N25/10 — Circuitry for transforming different wavelengths into image signals
            • H04N25/131 — Arrangement of colour filter arrays [CFA] including elements passing infrared wavelengths
            • H04N25/17 — Colour separation based on photon absorption depth, e.g. full colour resolution obtained simultaneously at each pixel location
          • H04N25/70 — SSIS architectures; Circuits associated therewith
            • H04N25/705 — Pixels for depth measurement, e.g. RGBZ
            • H04N25/707 — Pixels for event detection
            • H04N25/76 — Addressed sensors, e.g. MOS or CMOS sensors
              • H04N25/77 — Pixel circuitry, e.g. memories, A/D converters, pixel amplifiers, shared circuits or shared components
      • H10F — INORGANIC SEMICONDUCTOR DEVICES SENSITIVE TO INFRARED RADIATION, LIGHT, ELECTROMAGNETIC RADIATION OF SHORTER WAVELENGTH OR CORPUSCULAR RADIATION
        • H10F39/00 — Integrated devices, or assemblies of multiple devices, comprising at least one element covered by group H10F30/00, e.g. radiation detectors comprising photodiode arrays
          • H10F39/12 — Image sensors
            • H10F39/18 — Complementary metal-oxide-semiconductor [CMOS] image sensors; Photodiode array image sensors
              • H10F39/182 — Colour image sensors
              • H10F39/184 — Infrared image sensors
            • H10F39/191 — Photoconductor image sensors
              • H10F39/193 — Infrared image sensors
          • H10F39/80 — Constructional details of image sensors
            • H10F39/802 — Geometry or disposition of elements in pixels, e.g. address-lines or gate electrodes
              • H10F39/8023 — Disposition of the elements in pixels, e.g. smaller elements in the centre of the imager compared to larger elements at the periphery
            • H10F39/803 — Pixels having integrated switching, control, storage or amplification elements
            • H10F39/805 — Coatings
              • H10F39/8053 — Colour filters

Definitions

  • the present disclosure relates to imaging devices and ranging systems.
  • there is known an imaging device that stacks a pixel array that captures a visible light image and a pixel array that captures an infrared light image.
  • among such imaging devices, there is one that performs imaging control of the visible light image using information of the infrared light image (see, for example, Patent Document 1).
  • the present disclosure proposes an imaging device and a ranging system that can effectively use visible light image information.
  • an imaging device has a semiconductor substrate, a first pixel array, a second pixel array, and a controller.
  • the first pixel array is provided on the semiconductor substrate, has a laminated structure in which a first electrode, a photoelectric conversion layer, and a second electrode are laminated in order, and first light-receiving pixels that photoelectrically convert light in a first wavelength region including the visible light region are arranged in it two-dimensionally.
  • the second pixel array is provided in the semiconductor substrate at a position overlapping the first light-receiving pixels in the thickness direction of the semiconductor substrate, and second light-receiving pixels that photoelectrically convert light in a second wavelength region including the infrared light region are arranged in it two-dimensionally.
  • the control unit drives and controls the second pixel array based on the signals photoelectrically converted by the first pixel array.
  • FIG. 1 is an explanatory diagram showing a configuration example of a ranging system according to the present disclosure.
  • FIG. 2 is an explanatory diagram showing a configuration example of an imaging device according to the present disclosure.
  • FIG. 2 is a cross-sectional explanatory diagram of a pixel array according to the present disclosure
  • 1 is a circuit diagram showing an example of a first readout circuit according to the present disclosure
  • FIG. 4 is an explanatory diagram showing an example arrangement of pixels in a plan view according to the present disclosure
  • FIG. 4 is an explanatory diagram showing an example arrangement of pixels in a plan view according to the present disclosure
  • FIG. 4 is an explanatory diagram showing an example arrangement of pixels in a plan view according to the present disclosure
  • FIG. 4 is an explanatory diagram showing an example arrangement of pixels in a plan view according to the present disclosure
  • FIG. 4 is an explanatory diagram showing an example arrangement of pixels in a plan view according to the present disclosure
  • FIG. 4 is an explanatory diagram showing an example arrangement of pixels in a plan view according to the present disclosure
  • FIG. 4 is an explanatory diagram showing an example arrangement of pixels in a plan view according to the present disclosure
  • FIG. 4 is an explanatory diagram showing an operation example of an imaging device according to the present disclosure
  • 4 is a flowchart showing an example of processing executed by an imaging device according to the present disclosure
  • 7 is a flowchart showing a modified example of processing executed by an imaging device according to the present disclosure
  • FIG. 11 is an explanatory diagram showing a configuration example of a ranging system according to a modification of the present disclosure
  • FIG. 11 is a flowchart showing an example of processing executed by an imaging device according to a modification of the present disclosure
  • FIG. 1 is an explanatory diagram showing a configuration example of a ranging system according to the present disclosure.
  • the distance measurement system 100 shown in FIG. 1 captures an infrared light image (hereinafter sometimes referred to as a "distance image") of a subject irradiated with infrared light, and is a system that realizes a dToF (direct Time of Flight) sensor that measures the distance to the subject based on the time of flight of the light.
  • the ranging system 100 may instead be a system that implements iToF (indirect Time of Flight).
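Whether direct or indirect, the underlying relation is the same: distance follows from how long the emitted infrared light takes to return. As an illustrative sketch (not part of the patent text), the dToF case reduces to a one-line formula:

```python
# Illustrative dToF sketch: distance from the round-trip time of a pulse.
C = 299_792_458.0  # speed of light [m/s]

def dtof_distance(round_trip_time_s: float) -> float:
    """Distance from emission-to-detection delay: d = c * t / 2."""
    return C * round_trip_time_s / 2.0

# A pulse returning after 20 ns corresponds to roughly 3 m.
print(dtof_distance(20e-9))  # ≈ 2.998
```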
  • the ranging system 100 includes a light source 101, an imaging optical system 102, an imaging device 1, an ISP (Image Signal Processor) 103, an input device 104, a display device 105, and a data storage unit 106.
  • a light source 101 is an infrared light laser that emits infrared light toward a subject to be distance-measured.
  • the light source 101 is driven and controlled by the ISP 103, and emits infrared light while blinking at a predetermined high frequency.
  • the imaging optical system 102 includes a lens or the like that forms an image of the infrared light reflected by the subject on the light receiving unit of the imaging device 1 .
  • the imaging device 1 is a device that captures a distance image, which is an infrared light image of a subject, and outputs image data of the distance image to the ISP 103 .
  • the imaging device 1 can capture not only a distance image but also a visible light image. When capturing a visible light image, the imaging device 1 outputs image data of the visible light image to the ISP 103 as necessary.
  • the ISP 103 drives and controls the light source 101, measures the distance to the subject based on the time of flight of the infrared light, using the image data of the distance image input from the imaging device 1 and the phase shift information of the received light, and outputs the measurement result and the distance image to the display device 105 and the data storage unit 106. Further, when image data of a visible light image is input from the imaging device 1, the ISP 103 outputs the image data of the visible light image to the display device 105 and the data storage unit 106 as necessary. Furthermore, the ISP 103 can output colored three-dimensional information obtained by synthesizing the distance information and RGB information to the display device 105 and the data storage unit 106 as needed.
  • the display device 105 is, for example, a liquid crystal display, and displays the distance measurement result measured by the ISP 103, an infrared light image, or a visible light image.
  • the data storage unit 106 is, for example, a memory, and stores the distance measurement result measured by the ISP 103, the image data of the infrared light image, or the image data of the visible light image.
  • the input device 104 receives user operations for performing various settings of the ranging system 100 and user operations for causing the ranging system 100 to perform ranging, and outputs signals corresponding to the user operations to the ISP 103 .
  • the imaging device 1 also includes a pixel array 10 , an AD (Analog to Digital) converter 20 , and a data processor 30 . Note that FIG. 1 shows part of the components of the imaging apparatus 1 . A specific configuration example of the imaging device 1 will be described later with reference to FIG.
  • the pixel array 10 photoelectrically converts the incident light from the imaging optical system 102 into a range image pixel signal and a visible light image pixel signal, and outputs the pixel signals to the AD conversion unit 20 .
  • the AD conversion unit 20 AD-converts the pixel signals of the distance image and the visible light image and outputs them to the data processing unit 30 .
  • the data processing unit 30 performs various types of image processing and image analysis on the image data of the range image and the visible light image, and outputs the image data after image processing and image analysis to the ISP 103 .
  • when the distance measuring system 100 is an iToF sensor, for example, it is necessary to calculate a distance image (distance data) from phase data captured four times.
  • the phase data for the four captures are temporarily stored in a memory and then used to compute one distance image (distance data).
  • the data storage unit 106 is, for example, flash memory.
  • a memory unit, for example SRAM (Static Random Access Memory), temporarily stores the phase data.
  • SRAM allows information to be written and read at high speed, so it is suitable for the distance calculation processing that must be performed at high speed. However, such memory can hold information only while the power is on.
  • in the case of an iToF sensor, the ranging system 100 uses the memory built into the ISP 103 for temporary storage of the phase data.
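The four-phase computation referred to above can be sketched as follows. This is the textbook iToF recipe, not code from the patent, and the ordering and sign convention of the phase samples varies between sensors:

```python
# Hedged sketch of 4-phase iToF: four correlation samples taken at
# 0°, 90°, 180° and 270° are combined into one distance value.
import math

C = 299_792_458.0  # speed of light [m/s]

def itof_distance(q0, q90, q180, q270, f_mod_hz):
    """Distance from four phase samples: phi = atan2(q90 - q270, q0 - q180)."""
    phi = math.atan2(q90 - q270, q0 - q180) % (2 * math.pi)
    # Phase maps to distance over one modulation period: d = c * phi / (4 * pi * f)
    return C * phi / (4 * math.pi * f_mod_hz)
```

With a 20 MHz modulation frequency the unambiguous range is c / (2·f) ≈ 7.5 m; larger distances alias back into that interval.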
  • the distance measurement system 100 needs to operate the light source 101 and the pixel array 10 that captures the distance image at high frequencies, and thus consumes more power than a general visible light sensor. Therefore, the ranging system 100 effectively utilizes visible light image data in the imaging device 1 to reduce power consumption.
  • the imaging device 1 first captures a visible light image that consumes relatively little power for imaging, and does not capture a range image until a moving object is detected in the visible light image. After that, when the imaging device 1 detects a moving object in the visible light image, the imaging device 1 causes the light source 101 to emit light, captures a distance image that consumes relatively large power for imaging, and performs distance measurement. Thereby, the ranging system 100 can appropriately reduce power consumption.
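The motion-gated flow described above can be sketched as a simple loop. All the callables here (`detect_motion`, `emit_light`, `capture_range_image`) are hypothetical stand-ins for the actual hardware interfaces, which the patent does not specify at this level:

```python
# Sketch of the power-saving control flow: keep only the low-power
# visible capture running, and trigger the light source and the
# power-hungry range capture only once motion is detected.
def ranging_loop(frames, detect_motion, emit_light, capture_range_image):
    """frames: successive visible-light frames; the other args are device stand-ins."""
    results = []
    prev = None
    for frame in frames:
        if prev is not None and detect_motion(prev, frame):
            emit_light()                           # fire the high-power IR source
            results.append(capture_range_image())  # capture a distance image
        prev = frame
    return results
```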
  • FIG. 2 is an explanatory diagram showing a configuration example of an imaging device according to the present disclosure.
  • the imaging device 1 includes a pixel array 10, a global control circuit 41, a first row control circuit 42, a second row control circuit 43, a first readout circuit 21, a first data processing section 22, a second readout circuit 23, and a second data processing section 24.
  • Global control circuit 41, first row control circuit 42, and second row control circuit 43 are included in control unit 40 shown in FIG.
  • the pixel array 10 includes a first pixel array 11 and a second pixel array 12 .
  • in the first pixel array 11, a plurality of first light-receiving pixels (hereinafter referred to as "first pixels") 13 corresponding to the pixels of the visible light image are arranged two-dimensionally (in a matrix).
  • in the second pixel array 12, a plurality of second light-receiving pixels (hereinafter referred to as "second pixels") 14 corresponding to the pixels of the infrared light image are arranged two-dimensionally (in a matrix).
  • the first pixel 13 photoelectrically converts light in the first wavelength range including the visible light range into signal charges according to the amount of received light.
  • the first pixel 13 photoelectrically converts red light, green light, and blue light.
  • the second pixel 14 photoelectrically converts light in the second wavelength range including the infrared light range.
  • the second pixels 14 photoelectrically convert infrared light into signal charges corresponding to the amount of received light.
  • the first pixel array 11 is laminated on the second pixel array 12 .
  • the first pixel array 11 photoelectrically converts incident visible light to capture a visible light image.
  • the second pixel array 12 photoelectrically converts the infrared light transmitted through the first pixel array to capture an infrared light image.
  • the imaging device 1 can capture a visible light image and an infrared light image with one pixel array 10 .
  • the global control circuit 41 controls the first row control circuit 42 and the ranging pulse generator 50 based on the control signal from the ISP 103 .
  • the first row control circuit 42 drives and controls the first pixel array 11 to capture a visible light image.
  • the first pixel array 11 photoelectrically converts visible light into signal charges and accumulates them.
  • the first readout circuit 21 reads out a signal corresponding to the signal charge from the first pixel array 11 and outputs it to the first data processing section 22 .
  • the first data processing unit 22 AD-converts the signal input from the first reading circuit 21 to acquire image data of a visible light image, and performs predetermined image processing and image analysis on the image data.
  • the first data processing unit 22 outputs image data after image processing and image analysis to the ISP 103 as necessary.
  • the first data processing unit 22 outputs the result of image analysis to the second row control circuit 43 to operate the second row control circuit 43 .
  • in the image analysis, the first data processing unit 22 determines, for example, whether the subject in the visible light image is moving, and operates the second row control circuit 43 only when the subject is moving.
  • the first data processing unit 22 detects the changed first pixels 13 from the image data captured by the first pixel array 11 .
  • a first pixel 13 having a change here means, for example, a first pixel 13 for which the difference between the signal charge amount accumulated in the image data of the previous frame and that accumulated in the image data of the current frame exceeds a threshold value.
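A minimal sketch of this frame-difference test (pure Python, with an assumed 2-D list representation of the accumulated charge per pixel):

```python
# Illustrative frame-difference test: a first pixel "has a change" when
# the absolute difference between its value in the previous frame and in
# the current frame exceeds a threshold.
def changed_pixels(prev_frame, curr_frame, threshold):
    """Return a per-pixel boolean mask of |curr - prev| > threshold."""
    return [
        [abs(c - p) > threshold for p, c in zip(prev_row, curr_row)]
        for prev_row, curr_row in zip(prev_frame, curr_frame)
    ]

prev = [[10, 10], [10, 10]]
curr = [[10, 50], [12, 10]]
print(changed_pixels(prev, curr, threshold=5))  # [[False, True], [False, False]]
```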
  • the second row control circuit 43 starts driving the second pixel array 12 when the first data processing unit 22 detects the changed first pixels 13 . Thereby, the imaging device 1 can appropriately reduce the power consumption required for driving the second pixel array 12 .
  • the first data processing unit 22 may be configured to control the second readout circuit 23 so as to read out signals only from the second pixels 14 corresponding to the changing first pixels 13 . Thereby, the imaging device 1 can also reduce the power consumption of the second readout circuit 23 .
  • the first data processing unit 22 can cause the second row control circuit 43 to drive only the second pixels 14 corresponding to the first pixels 13 having a change in the first pixel array 11 .
  • the imaging device 1 can further reduce power consumption required for driving the second pixel array 12 .
  • instead of the binary decision of whether the subject is moving, the first data processing unit 22 can also set a threshold for the amount of change in the first pixel array 11, and cause the second row control circuit 43 to drive only the second pixels 14 corresponding to first pixels 13 whose amount of change due to the movement of the subject exceeds the threshold. Thereby, the imaging device 1 can minimize the number of second pixels 14 to be driven.
  • the first data processing section 22 may be configured so that the operating frequency of the second row control circuit 43 can be changed according to changes in the first pixel array 11 .
  • the first data processing unit 22 increases the operating frequency of the second row control circuit 43 when the change in the first pixel array 11 (moving speed of the object) is large.
  • the imaging device 1 can capture a clear image of a subject moving at a high speed using the second pixel array.
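One way to realize the variable operating frequency described above is a simple monotone mapping from change magnitude to drive rate. The base and maximum frequencies here are illustrative assumptions, not values from the patent:

```python
# Illustrative mapping: larger inter-frame change (a proxy for subject
# speed) -> faster second-row drive frequency, clamped to a maximum.
def select_drive_frequency_hz(change_magnitude, base_hz=30.0, max_hz=240.0):
    """Scale the drive frequency with the amount of change, up to max_hz."""
    return min(max_hz, base_hz * (1.0 + change_magnitude))
```

A static scene thus keeps the second pixel array at the low base rate, while fast motion pushes it toward the maximum for a sharper capture.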
  • the second row control circuit 43 drives and controls the second pixel array 12 to capture an infrared light image.
  • the second pixel array 12 photoelectrically converts infrared light into signal charges and accumulates them.
  • the second readout circuit 23 reads a signal corresponding to the signal charge from the second pixel array 12 and outputs it to the second data processing section 24 .
  • the second data processing unit 24 AD-converts the signal input from the second readout circuit 23 to obtain image data of a distance image, which is an infrared light image, performs predetermined image processing on the image data, and outputs it to the ISP 103.
  • the range of the second pixels 14 to be driven may be determined by the control unit 40 (second row control circuit 43), by a dedicated determination circuit, or by the ISP 103.
  • the second row control circuit 43, determination circuit, or ISP 103 acquires the image data of the visible light image from the first data processing unit 22, and detects the first pixels 13 with changes from the image data. Then, the second row control circuit 43, the determination circuit, or the ISP 103 determines the range of the second pixels 14 arranged at the position overlapping the detected first pixels 13 as the range of the second pixels 14 to be driven.
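The range determination described above can be sketched as taking the bounding box of the changed-pixel mask; since the second pixels overlap the first pixels in the thickness direction, the same coordinates identify which second pixels to drive. This is a pure-Python illustration with an assumed boolean-mask input:

```python
# Illustrative bounding box of changed pixels: the row/column extent of
# the mask gives the range of second pixels to drive.
def drive_range(changed_mask):
    """(row_min, row_max, col_min, col_max) covering all changed pixels, or None."""
    coords = [
        (r, c)
        for r, row in enumerate(changed_mask)
        for c, changed in enumerate(row)
        if changed
    ]
    if not coords:
        return None  # nothing changed: leave the second pixel array idle
    rows = [r for r, _ in coords]
    cols = [c for _, c in coords]
    return min(rows), max(rows), min(cols), max(cols)

mask = [[False] * 4 for _ in range(4)]
mask[1][2] = mask[2][3] = True
print(drive_range(mask))  # (1, 2, 2, 3)
```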
  • when the distance measurement system 100 cannot calculate the distance from a single distance image, it is also possible to calculate the distance to the subject from a plurality of internally stored distance images.
  • FIG. 3 is a cross-sectional explanatory diagram of a pixel array according to the present disclosure.
  • FIG. 3 schematically shows a cross section of the pixel array 10 corresponding to one second pixel 14 .
  • in the orthogonal coordinate system shown in FIG. 3, the positive direction of the Z-axis is referred to as "up" and the negative direction as "down" for convenience.
  • the pixel array 10 includes a so-called longitudinal spectral imaging element having a structure in which the second pixels 14 and the first pixels 13 are stacked in the Z-axis direction, which is the thickness direction.
  • the pixel array 10 includes an intermediate layer 60 provided between the second pixels 14 and the first pixels 13, and a multilayer wiring layer 61 provided on the opposite side of the first pixels 13 as viewed from the second pixels 14. Prepare.
  • a sealing film 62 is arranged on the side opposite to the intermediate layer 60 when viewed from the first pixels 13 .
  • the second pixel 14 is the pixel of an indirect TOF (hereinafter referred to as "iTOF") sensor that acquires a distance image (distance information) by TOF.
  • the second pixel 14 has a semiconductor substrate 66, a photoelectric conversion region 67, a fixed charge layer 68, a pair of gate electrodes 69A and 69B, floating diffusions 70A and 70B, and a through electrode 71.
  • the semiconductor substrate 66 is, for example, an n-type silicon substrate, and has a P-well in a predetermined internal region.
  • a lower surface of the semiconductor substrate 66 faces the multilayer wiring layer 61 .
  • the upper surface of the semiconductor substrate 66 faces the intermediate layer 60 and has a fine uneven structure. As a result, the upper surface of the semiconductor substrate 66 appropriately scatters the incident infrared light, thereby increasing the optical path length.
  • the semiconductor substrate 66 may also have a fine uneven structure formed on the bottom surface.
  • the photoelectric conversion region 67 is a photoelectric conversion element composed of a PIN (Positive Intrinsic Negative) type photodiode.
  • the photoelectric conversion region 67 mainly receives light having wavelengths in the infrared region (infrared light) among the light incident on the pixel array 10, photoelectrically converts the light into signal charges according to the amount of received light, and accumulates the signal charges.
  • the fixed charge layer 68 is provided so as to cover the top surface and side surfaces of the semiconductor substrate 66 .
  • the fixed charge layer 68 contains negative fixed charges that suppress the generation of dark current due to the interface state of the upper surface of the semiconductor substrate 66 that is the light receiving surface.
  • the fixed charge layer 68 forms a hole accumulation layer in the vicinity of the upper surface of the semiconductor substrate 66, and suppresses generation of electrons from the upper surface of the semiconductor substrate 66 by the hole accumulation layer.
  • the fixed charge layer 68 also extends between the inter-pixel area light shielding wall 72 and the photoelectric conversion area 67 .
  • the gate electrodes 69A and 69B extend from the lower surface of the semiconductor substrate 66 to a position reaching the photoelectric conversion region 67.
  • the gate electrodes 69A, 69B transfer the signal charge accumulated in the photoelectric conversion region 67 to the floating diffusions 70A, 70B when a predetermined voltage is applied.
  • the floating diffusions 70A and 70B are floating diffusion regions that temporarily hold the signal charges transferred from the photoelectric conversion region 67.
  • the signal charges held in the floating diffusions 70A and 70B are read out as pixel signals by the second readout circuit 23 (see FIG. 2).
  • a wiring 74 is provided inside an insulating layer 73 in the multilayer wiring layer 61 .
  • the insulating layer 73 is made of, for example, silicon oxide.
  • the wiring 74 is made of metal such as copper or gold, for example.
  • a first readout circuit 21 and a second readout circuit 23 are also provided inside the insulating layer 73 .
  • the intermediate layer 60 has an optical filter 75 embedded in the insulating layer 73 and an inter-pixel area light shielding film 76 .
  • the optical filter 75 is made of, for example, an organic material, and mainly selectively transmits light with frequencies in the infrared region.
  • the inter-pixel region light shielding film 76 reduces color mixture between adjacent pixels.
  • the first pixel 13 has a first electrode 77, a semiconductor layer 78, a photoelectric conversion layer 79, and a second electrode 80, which are stacked in order from a position closer to the photoelectric conversion region 67. Furthermore, the first pixel 13 has a charge storage electrode 82 provided below the semiconductor layer 78 so as to face the semiconductor layer 78 with the insulating film 81 interposed therebetween.
  • the charge storage electrode 82 and the first electrode 77 are separated from each other, and are provided on the same layer, for example.
  • the first electrode 77 is connected to the upper end of the through electrode 71, for example.
  • the first electrode 77, the second electrode 80, and the charge storage electrode 82 are formed of, for example, a light-transmitting conductive film such as ITO (indium tin oxide) or IZO (indium zinc oxide).
  • the photoelectric conversion layer 79 converts light energy into electrical energy, and is formed by including two or more kinds of organic materials that function as p-type semiconductors and n-type semiconductors, for example.
  • a p-type semiconductor functions as an electron donor.
  • the n-type semiconductor functions as an electron acceptor.
  • the photoelectric conversion layer 79 has a bulk heterojunction structure.
  • a bulk heterojunction structure is a p/n junction formed by mixing a p-type semiconductor and an n-type semiconductor.
  • the photoelectric conversion layer 79 separates incident light into electrons and holes at the p/n junction.
  • the charge storage electrode 82 forms a capacitor together with the insulating film 81 and the semiconductor layer 78, and accumulates the signal charge generated in the photoelectric conversion layer 79 in the region of the semiconductor layer 78 that faces the charge storage electrode 82 across the insulating film 81.
  • the charge storage electrode 82 is provided at a position corresponding to each color filter 63 and on-chip lens 65 .
  • when light enters the photoelectric conversion layer 79, excitons are generated; the excitons move to the interface between the electron donor and the electron acceptor that constitute the photoelectric conversion layer 79 and are separated into electrons and holes.
  • the electrons and holes generated here move to the second electrode 80 or the semiconductor layer 78 and are accumulated due to the difference in carrier concentration and the internal electric field due to the potential difference between the first electrode 77 and the second electrode 80 .
  • the first electrode 77 is set at a positive potential and the second electrode 80 is set at a negative potential.
  • holes generated in the photoelectric conversion layer 79 move to the second electrode 80 side, while electrons generated in the photoelectric conversion layer 79 are attracted to the charge storage electrode 82 and stored, via the insulating film 81, in the region of the semiconductor layer 78 corresponding to the charge storage electrode 82.
  • the electrons accumulated in the region of the semiconductor layer 78 corresponding to the charge storage electrode 82 via the insulating film 81 are read as follows.
  • the potential V1 is applied to the first electrode 77 and the potential V2 is applied to the charge storage electrode 82 .
  • the potential V1 is set higher than the potential V2.
  • the electrons accumulated in the region of the semiconductor layer 78 corresponding to the charge accumulation electrode 82 via the insulating film 81 are transferred to the first electrode 77 and read out.
  • the semiconductor layer 78 is provided below the photoelectric conversion layer 79, and charges (e.g., electrons) are accumulated in regions of the semiconductor layer 78 corresponding to the charge accumulation electrodes 82 via the insulating film 81.
  • the following effect is obtained: compared with the case where charges (for example, electrons) are accumulated in the photoelectric conversion layer 79 without providing the semiconductor layer 78, recombination of holes and electrons during charge accumulation is prevented, and the transfer efficiency of the accumulated charges (for example, electrons) to the first electrode 77 can be increased.
  • although the reading of electrons has been described here, holes may be read instead. When reading holes, the relationship between the potentials V1 and V2 described above is reversed.
  • FIG. 4 is a circuit diagram illustrating an example of a first readout circuit according to the present disclosure.
  • the first readout circuit 21 has, for example, a floating diffusion FD, a reset transistor RST, an amplification transistor AMP, and a selection transistor SEL.
  • a floating diffusion FD is connected between the first electrode 77 and the amplification transistor AMP.
  • the floating diffusion FD converts the signal charge transferred by the first electrode 77 into a voltage signal and outputs the voltage signal to the amplification transistor AMP.
  • the reset transistor RST is connected between the floating diffusion FD and the power supply.
  • when a drive signal is applied to the gate electrode of the reset transistor RST and the reset transistor RST turns on, the potential of the floating diffusion FD is reset to the power supply level.
  • the amplification transistor AMP has a gate electrode connected to the floating diffusion FD and a drain electrode connected to a power supply.
  • a source electrode of the amplification transistor AMP is connected to the vertical signal line via the selection transistor SEL.
  • the selection transistor SEL is connected between the source electrode of the amplification transistor AMP and the vertical signal line.
  • a drive signal is applied to the gate electrode of the selection transistor SEL, and when the selection transistor SEL is turned on, the pixel signal output from the amplification transistor AMP is output to the AD conversion section 20 via the selection transistor SEL and the vertical signal line.
  • the AD conversion section 20 AD-converts the pixel signal based on the control signal input from the control section 40 and outputs the result to the data processing section 30 .
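The signal chain just described (the floating diffusion converts charge to a voltage, the amplification transistor buffers it, and the AD conversion section 20 quantizes it) can be illustrated with a toy numeric model. This is a sketch under assumed constants; the capacitance, gain, reference voltage, and bit depth are illustrative, not values from this disclosure:

```python
# Numeric sketch of the pixel readout chain. All constants are assumptions.

ELEMENTARY_CHARGE = 1.602e-19  # coulombs per electron

def fd_voltage(electrons, fd_capacitance=1.6e-15):
    """Floating diffusion FD: accumulated electrons -> voltage (Q = C*V)."""
    return electrons * ELEMENTARY_CHARGE / fd_capacitance

def source_follower(v_in, gain=0.85):
    """Amplification transistor AMP modeled as a source follower (gain < 1)."""
    return v_in * gain

def ad_convert(v_in, v_ref=1.0, bits=10):
    """AD conversion section: clamp and quantize the pixel voltage."""
    code = round(v_in / v_ref * (2 ** bits - 1))
    return max(0, min(2 ** bits - 1, code))

# 5000 electrons of signal charge pass through the whole chain.
code = ad_convert(source_follower(fd_voltage(5000)))
```

Running the chain with 5000 electrons yields a mid-scale digital code, showing how the charge-domain signal becomes the pixel signal handed to the data processing units.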
  • FIGS. 5 to 8 are explanatory diagrams showing layout examples of pixels according to the present disclosure in plan view.
  • a plurality of on-chip lenses 65 are arranged two-dimensionally (in rows and columns).
  • a plurality of color filters 63 are two-dimensionally (in rows and columns) arranged in a layer below the layer where the on-chip lens 65 is arranged, as shown in FIG.
  • the color filter 63 includes a filter R selectively transmitting red light, a filter G selectively transmitting green light, and a filter B selectively transmitting blue light.
  • Each color filter 63 (one filter R, G, B) is provided at a position corresponding to one on-chip lens 65 .
  • the corresponding positions here are, for example, positions that overlap each other in the Z-axis direction.
  • the filters R, G, and B are arranged, for example, according to a color arrangement method called Bayer arrangement. Note that the arrangement of the filters R, G, and B is not limited to the Bayer arrangement.
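The Bayer arrangement mentioned above tiles a 2×2 pattern containing two green filters per tile. A minimal sketch, assuming the common RGGB tile orientation (the text does not fix which of the equivalent 2×2 orientations is used):

```python
def bayer_pattern(rows, cols):
    """Return the filter letter (R, G, or B) at each pixel position for a
    Bayer color arrangement built from a repeating 2x2 RGGB tile."""
    tile = [["R", "G"],
            ["G", "B"]]
    return [[tile[r % 2][c % 2] for c in range(cols)] for r in range(rows)]

# A 4x4 block of filters, matching the 4x4 grouping shown later in FIG. 8.
pattern = bayer_pattern(4, 4)
```

In any 2×2 block of the result, green appears twice as often as red or blue, reflecting the arrangement's weighting toward luminance sensitivity.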
  • each color filter 63 is indicated by a dashed line in order to clarify the positional relationship between each charge storage electrode 82 and each color filter 63 .
  • Each charge storage electrode 82 is provided at a position corresponding to one filter R, G, B, respectively. The corresponding positions here are, for example, positions that overlap each other in the Z-axis direction.
  • the plurality of photoelectric conversion regions 67 in the second pixel array 12 are arranged two-dimensionally (in a matrix) in a layer below the layer in which the charge storage electrodes 82 in the first pixel array 11 are arranged. A plurality of through electrodes 71 are provided around each photoelectric conversion region 67.
  • in the drawing, each color filter 63 is indicated by a broken line, and each on-chip lens 65 is indicated by a dotted line.
  • Each photoelectric conversion region 67 is provided at a position corresponding to 16 on-chip lenses 65 and 16 color filters 63 arranged in a 4 ⁇ 4 matrix. The corresponding positions here are, for example, positions that overlap each other in the Z-axis direction.
  • in this example, one photoelectric conversion region 67 is provided for 16 on-chip lenses 65 and 16 color filters 63, but this is merely an example. The pixel array 10 may instead be provided with one photoelectric conversion region 67 for four on-chip lenses 65 and four color filters 63, or with one photoelectric conversion region 67 for one on-chip lens 65 and one color filter 63.
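The grouping described above, in which the on-chip lenses 65 and color filters 63 of a 4×4 block share one photoelectric conversion region 67, amounts to integer division of pixel coordinates. A hypothetical sketch with the group size as a parameter, since the text notes other ratios are possible:

```python
def photoelectric_region_index(row, col, group=4):
    """Map a first-pixel (row, col) coordinate to the index of the
    photoelectric conversion region 67 it shares. With group=4, each
    4x4 block of 16 on-chip lenses / color filters maps to one region."""
    return (row // group, col // group)
```

All 16 first pixels of the top-left 4×4 block map to region (0, 0); with `group=1` the mapping degenerates to one region per pixel, matching the one-to-one variant mentioned above.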
  • FIG. 9 is an explanatory diagram showing an operation example of the imaging device according to the present disclosure.
  • the imaging apparatus 1 first performs first pixel control by the control unit 40 (step S1), and drives the first pixel array 11 (step S2).
  • the first pixel array 11 photoelectrically converts incident light into signal charges by the first pixels 13 and accumulates the signal charges.
  • the imaging device 1 reads out pixel signals corresponding to the signal charges from the first pixel array 11 by the first readout circuit 21 (step S3), and AD-converts the pixel signals by the first data processing unit 22 .
  • the first data processing unit 22 outputs the AD-converted pixel signal (image data of the visible light image) as necessary (step S4).
  • the imaging device 1 evaluates the visible light image by the first data processing unit 22 (step S5).
  • the first data processing unit 22 detects the first pixels 13 that have changed in the visible light image.
  • the first data processing unit 22 detects the first pixels 13 in which the difference between the signal charge amount accumulated in the image data of the previous frame and the signal charge amount accumulated in the image data of the current frame exceeds a threshold.
  • the threshold is set by the ISP 103, for example (step S6).
  • the first data processing unit 22 outputs the detection result of the changed first pixel 13 to the second row control circuit 43 and the ISP 103 .
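The change detection in steps S5 and S6 compares, for each first pixel 13, the accumulated signal of the previous frame with that of the current frame against a threshold. A minimal sketch, with frames as plain nested lists and an illustrative threshold value:

```python
def detect_changed_pixels(prev_frame, curr_frame, threshold):
    """Return the (row, col) coordinates of pixels whose value changed
    by more than `threshold` between the previous and current frames."""
    changed = []
    for r, (prev_row, curr_row) in enumerate(zip(prev_frame, curr_frame)):
        for c, (p, q) in enumerate(zip(prev_row, curr_row)):
            if abs(q - p) > threshold:
                changed.append((r, c))
    return changed

prev = [[10, 10], [10, 10]]
curr = [[12, 10], [60, 10]]
# Only the pixel at (1, 0) changed by more than the threshold of 20.
changed = detect_changed_pixels(prev, curr, threshold=20)
```

The resulting coordinate list plays the role of the detection result output to the second row control circuit 43 and the ISP 103.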
  • when the ISP 103 detects a changed first pixel 13, it performs sensor control (step S7) and drives the light source 101 (step S8). As a result, the light source 101 blinks at a high frequency and irradiates the subject 110 with infrared light. In the distance measuring system 100, the light source 101 thus emits light only after a changed first pixel 13 is detected, so the power consumption of the light source 101 can be reduced compared with the case where the light source 101 emits light all the time.
  • alternatively, the ISP 103 can cause the light source 101 to emit light at a relatively low luminance before a changed first pixel 13 is detected, and raise the emission intensity of the light source 101 so that it emits light at a high luminance once a changed first pixel 13 is detected.
  • the imaging device 1 performs second pixel control by the second row control circuit 43 (step S9), and drives the second pixel array 12 (step S10).
  • the second pixel array 12 photoelectrically converts the incident infrared light transmitted through the first pixel array 11 into signal charges by the second pixels 14, and accumulates the signal charges.
  • the second row control circuit 43 drives the second pixel array 12 only after a changed first pixel 13 is detected, so the power consumption of the second pixel array 12 can be reduced.
  • the second row control circuit 43 can drive all the second pixels 14, but can also selectively drive the second pixels 14 corresponding to the first pixels 13 that have changed.
  • the second row control circuit 43 can reduce power consumption by selectively driving the second pixels 14 corresponding to the changed first pixels 13 .
  • when the amount of change in a changed first pixel 13 is large, for example, when the amount of change exceeds a threshold value, the second row control circuit 43 can make the drive frequency of the corresponding second pixels 14 higher than normal.
  • the imaging device 1 reads out pixel signals corresponding to the signal charges from the second pixel array 12 by the second readout circuit 23 (step S11), AD-converts the pixel signals by the second data processing unit 24, and outputs the AD-converted pixel signals (image data of an infrared light image) (step S12).
  • the second data processing unit 24 can read pixel signals from all the second pixels 14, AD-convert only the pixel signals of the second pixels 14 corresponding to the changed first pixels 13, and output the AD-converted pixel signals (image data of an infrared light image). Thereby, the imaging device 1 can reduce the power consumption required for AD conversion.
  • alternatively, the second data processing unit 24 can read pixel signals only from the second pixels 14 corresponding to the changed first pixels 13, AD-convert the read pixel signals, and output the AD-converted pixel signals (image data of an infrared light image). As a result, the imaging device 1 can reduce the power consumption required for reading pixel signals.
  • in this way, the imaging device 1 effectively uses the information of the visible light image and drives and controls the second pixel array 12 based on the signals photoelectrically converted by the first pixel array 11, so the power consumption of the light source 101 and the imaging device 1 can be reduced.
  • FIG. 10 is a flowchart illustrating an example of processing executed by an imaging device according to the present disclosure.
  • the imaging device 1 performs various settings, drives the first pixel array 11 under imaging conditions based on the external light conditions, and captures a visible light image as a first image (step S101).
  • the imaging device 1 photoelectrically converts visible light by the first pixel array 11 and accumulates signal charges (step S102). Subsequently, the imaging device 1 reads signal charges (signals) from the first pixel array 11 and AD-converts the signals (step S103). Subsequently, the imaging device 1 outputs a first image that is a visible light image. At this time, the imaging device 1 can also thin out and output the first pixels 13 from the first pixel array 11 .
  • the imaging device 1 analyzes the information of the first image and determines imaging conditions for the second image, which is an infrared light image (step S105). Specifically, the imaging device 1 detects the first pixel 13 with a change, and determines the coordinates (X1, Y1 to X2, Y2) of the scanning range of the second pixel 14 .
  • the scanning range determined at this time may be a single pixel, an aggregated area of a plurality of pixels, or a combination thereof.
  • for example, the imaging device 1 can detect the position of the subject's face in the visible light image with the first data processing unit 22 and determine only that portion of the image as the scanning range. In addition, since the position of the subject may change over time, the imaging device 1 can track the position of the subject in the visible light image and set a correspondingly larger scanning range.
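One way to realize the scanning-range determination of step S105 is to take the bounding box of the changed first pixels 13 and pad it with a margin so that a moving subject stays inside the range. The margin and clamping below are illustrative assumptions, not the disclosure's specific method:

```python
def scanning_range(changed_pixels, margin=0, max_x=None, max_y=None):
    """Bounding box (x1, y1, x2, y2) of the changed pixel coordinates,
    padded by `margin` and clamped to the array bounds if given."""
    if not changed_pixels:
        return None  # nothing changed: no second-pixel scan needed
    xs = [x for x, _ in changed_pixels]
    ys = [y for _, y in changed_pixels]
    x1, y1 = max(min(xs) - margin, 0), max(min(ys) - margin, 0)
    x2, y2 = max(xs) + margin, max(ys) + margin
    if max_x is not None:
        x2 = min(x2, max_x)
    if max_y is not None:
        y2 = min(y2, max_y)
    return (x1, y1, x2, y2)

# Two changed pixels produce one padded scanning range (X1, Y1 to X2, Y2).
roi = scanning_range([(5, 7), (9, 4)], margin=2, max_x=15, max_y=15)
```

A single changed pixel yields a single-pixel range grown by the margin; clusters of changed pixels are aggregated into one rectangle, mirroring the "single pixel, aggregated area, or combination" options mentioned above.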
  • the imaging device 1 performs various settings, drives the second pixel array 12 under imaging conditions based on the external light conditions, and captures an infrared light image (step S106). After that, the imaging device 1 photoelectrically converts the infrared light by the second pixel array 12 and accumulates signal charges (step S107).
  • the imaging device 1 reads signal charges (signals) from the second pixels 14 and AD-converts the signals (step S108). Subsequently, the imaging device 1 outputs the signals of the first pixels 13 and/or the second pixels 14 corresponding to the scanning range (X1, Y1 to X2, Y2) to the ISP 103 (step S109), and ends the process.
  • FIG. 11 is a flowchart illustrating a modification of processing executed by an imaging device according to the present disclosure. As shown in FIG. 11, in the process according to the modification, the processes of steps S208 and S209 differ from the processes of steps S108 and S109 shown in FIG. 10. Since the processing of steps S101 to S107 shown in FIG. 11 is the same as the processing of steps S101 to S107 shown in FIG. 10, redundant description is omitted here.
  • the imaging device 1 accumulates signal charges (signals) in the second pixels 14 (step S107), and then reads out and AD-converts the signals of the second pixels 14 in the scanning range (X1, Y1 to X2, Y2) determined in step S105 (step S208).
  • the imaging device 1 outputs the signal of the first pixel 13 and/or the second pixel 14 to the ISP 103 (step S209), and ends the process.
  • FIG. 12 is an explanatory diagram showing a configuration example of a ranging system according to a modification of the present disclosure.
  • the distance measuring system 100a according to the modification differs from the distance measuring system 100 in that the control unit 40a of the imaging device 1a also drives and controls the light source 101, and is otherwise similar to the distance measuring system 100. Therefore, only the drive control of the light source 101 by the control unit 40a is described here, and redundant description of the other configurations is omitted.
  • the control unit 40a controls the distance measurement pulse generation unit 50 so as to raise the emission intensity of the light source 101 above normal, or to raise the operating frequency of the light source 101 above normal.
  • the control unit 40 a also adjusts the drive frequency of the second pixel array 12 in accordance with the change in the operating frequency of the light source 101 .
  • when the image analysis by the first data processing unit 22 reveals that the subject of the visible light image is a face and that the face is located closer to the imaging optical system 102 than a preset position, the control unit 40a makes the emission intensity of the light source 101 lower than usual.
  • thereby, the control unit 40a can irradiate the subject with appropriate infrared light according to the moving speed of the subject and the distance to the subject, so the second pixel array 12 can capture an infrared light image appropriate for distance measurement.
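The drive control described above can be summarized as a decision rule: lower the emission intensity when a face is closer than a preset distance, and raise the intensity and operating frequency when the detected change is large. The thresholds and discrete levels below are illustrative assumptions, not values from this disclosure:

```python
def light_source_settings(change_amount, face_distance_m,
                          change_threshold=50, near_distance_m=0.3):
    """Choose (emission intensity, operating frequency) labels from the
    results of analyzing the visible light image."""
    if face_distance_m is not None and face_distance_m < near_distance_m:
        # Subject's face is close to the imaging optical system:
        # emit at lower-than-usual intensity.
        return ("low", "normal")
    if change_amount > change_threshold:
        # Large frame-to-frame change: raise intensity and drive frequency.
        return ("high", "high")
    return ("normal", "normal")

settings = light_source_settings(change_amount=80, face_distance_m=1.0)
```

A fast-changing distant subject gets a brighter, faster-blinking illumination; a nearby face gets a dimmer one, matching the two cases described in the text.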
  • FIG. 13 is a flowchart illustrating an example of processing executed by an imaging device according to a modification of the present disclosure.
  • as shown in FIG. 13, the processing of steps S305 to S310 differs from the processing of steps S105 to S109 shown in FIG. 10. Since the processing of steps S101 to S104 shown in FIG. 13 is the same as the processing of steps S101 to S104 shown in FIG. 10, redundant description is omitted here.
  • the imaging device 1a causes the first pixel array 11 to accumulate signal charges (signals) (step S104), and then analyzes the information of the first image to determine the imaging conditions for the second image, which is an infrared light image.
  • the imaging device 1a detects the first pixels 13 with changes, and determines the coordinates (X1, Y1 to X2, Y2) of the scanning range of the second pixels 14. Further, the imaging device 1a determines the emission intensity and emission time (operating frequency) of the light source 101 (step S305). After that, the imaging device 1a causes the light source 101 to emit light with the determined emission intensity and emission time (operating frequency) (step S306).
  • the imaging device 1a opens the gate of the second pixel 14 at a timing corresponding to the determined light emission time (operating frequency) (step S307), closes the gate of the second pixel 14 (step S308), and photoelectrically converts the infrared light received by the second pixels 14 to accumulate charges.
  • the imaging device 1a reads the signals of the pixels corresponding to the scanning range (X1, Y1 to X2, Y2) and AD-converts them (step S309). Subsequently, the imaging device 1a outputs the signal of the first pixel 13 and/or the second pixel 14 to the ISP 103 (step S310), and ends the process.
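Steps S306 to S309 describe a gated capture synchronized with the light emission time. In an indirect time-of-flight scheme, which is one common way such gated measurements are used for ranging, the charge ratio of two gate windows yields the round-trip delay and hence the distance. The two-tap model below is a textbook formulation, not the specific circuit of this disclosure:

```python
# Two-tap indirect time-of-flight sketch (illustrative, not the patent's circuit).

C = 299_792_458.0  # speed of light in m/s

def itof_distance(q_in_phase, q_out_phase, pulse_width_s):
    """Distance from the charges collected in two gate windows: one in
    phase with the emitted pulse and one delayed by the pulse width."""
    total = q_in_phase + q_out_phase
    if total == 0:
        return None  # no reflected light detected
    delay_s = pulse_width_s * q_out_phase / total  # round-trip delay
    return C * delay_s / 2.0  # halve for the round trip

# Equal charge in both taps means a delay of half a pulse width.
d = itof_distance(500, 500, pulse_width_s=30e-9)
```

With a 30 ns pulse and equal charge in both taps, the round-trip delay is 15 ns, giving a distance of roughly 2.25 m.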
  • as another modification, the imaging device may omit the optical filter 75 shown in FIG. 3. If the optical filter 75 is not provided, both the first pixel array 11 and the second pixel array 12 capture visible light images.
  • if the photoelectric conversion layer 79 is made thinner than the thickness shown in FIG. 3, the light receiving sensitivity of the first pixel array 11 is lowered. In this case, the imaging device captures a low-sensitivity first image with the first pixel array 11, captures a second image with a higher sensitivity than the first image with the second pixel array 12, and outputs the first image and the second image to the ISP 103.
  • the ISP 103 can generate an HDR image by HDR (High Dynamic Range) synthesis of the first image and the second image.
  • the imaging device can capture a low-sensitivity image and a high-sensitivity image having the same number of pixels. Further, if a filter that transmits infrared light is added to the imaging optical system 102, it becomes possible to generate an infrared light HDR image.
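The HDR synthesis mentioned above can be sketched as a per-pixel merge of the low-sensitivity first image and the high-sensitivity second image: use the high-sensitivity sample, scaled to a common exposure, unless it is saturated. The gain ratio and saturation level here are illustrative assumptions:

```python
def hdr_merge(low_img, high_img, gain_ratio=8.0, saturation=1023):
    """Per-pixel HDR merge: prefer the high-sensitivity sample (scaled to
    the common exposure) unless it is saturated, otherwise fall back to
    the low-sensitivity sample."""
    merged = []
    for low_row, high_row in zip(low_img, high_img):
        row = []
        for low, high in zip(low_row, high_row):
            if high < saturation:
                row.append(high / gain_ratio)  # valid high-sensitivity sample
            else:
                row.append(float(low))         # saturated: use low-sensitivity
        merged.append(row)
    return merged

low = [[100, 900]]    # low-sensitivity first image
high = [[800, 1023]]  # high-sensitivity second image; second pixel saturated
merged = hdr_merge(low, high)
```

Because the two arrays are stacked, each low/high pixel pair sees the same point of the scene, so no registration step is needed before the merge.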
  • in this modification as well, the imaging apparatus can effectively use the information of the visible light image and appropriately reduce power consumption by performing the same control as described above with the control units 40 and 40a.
  • the imaging device 1 has a semiconductor substrate 66 , a first pixel array 11 , a second pixel array 12 and a control section 40 .
  • the first pixel array 11 is provided on the semiconductor substrate 66; in it, first light-receiving pixels 13, each having a laminated structure in which a first electrode 77, a photoelectric conversion layer 79, and a second electrode 80 are laminated in order, photoelectrically convert light in a first wavelength region including a visible light region and are arranged two-dimensionally.
  • the second pixel array 12 is provided in the semiconductor substrate 66 at a position overlapping the first light-receiving pixels 13 in the thickness direction of the semiconductor substrate 66; in it, second light-receiving pixels 14 that photoelectrically convert light in a second wavelength region including an infrared light region are arranged two-dimensionally.
  • the control unit 40 drives and controls the second pixel array 12 based on the signals photoelectrically converted by the first pixel array 11 . Thereby, the imaging device 1 can effectively use the information of the visible light image.
  • the imaging device 1 includes a data processing unit 30 that detects the first pixels 13 with changes from the image data captured by the first pixel array 11 .
  • the control unit 40 outputs a signal photoelectrically converted by the second light-receiving pixel 14 corresponding to the first pixel 13 having a change detected by the data processing unit 30 .
  • thereby, the imaging device 1 outputs only the signals of the second light-receiving pixels 14 corresponding to the ROI (Region Of Interest) in the visible light image, so the power consumption required for outputting information can be appropriately reduced.
  • the control unit 40 outputs the signal read from the second light-receiving pixel 14 corresponding to the first pixel 13 with a change among all the signals read from the second pixel array 12 .
  • the imaging device 1 outputs only the signals of the second light-receiving pixels 14 corresponding to the ROI in the visible light image, so power consumption required for outputting information can be appropriately reduced.
  • the control unit 40 reads out and outputs photoelectrically converted signals from the second light receiving pixels 14 corresponding to the first pixels 13 with change.
  • the imaging device 1 reads out and outputs, from the second pixel array 12, only the second light-receiving pixels 14 corresponding to the ROI in the visible light image, so the power consumption required for reading information can be appropriately reduced.
  • the control unit 40 starts driving the second pixel array 12 when the data processing unit 30 detects the changed first pixels 13 .
  • the imaging device 1 does not drive the second pixel array 12 until the changed first pixels 13 are detected, so that the power consumption of the second pixel array 12 can be reduced.
  • the control unit 40 thins out some of the first light-receiving pixels 13 to output photoelectrically converted signals from the first pixel array 11 .
  • the imaging device 1 can reduce power consumption required for outputting signals photoelectrically converted by the first pixel array 11 .
  • the imaging device 1 has a semiconductor substrate 66 , a first pixel array 11 , a second pixel array 12 and a control section 40 .
  • the first pixel array 11 is provided on the semiconductor substrate 66; in it, first light-receiving pixels 13, each having a laminated structure in which a first electrode 77, a photoelectric conversion layer 79, and a second electrode 80 are laminated in order, photoelectrically convert light in a first wavelength region including a visible light region and are arranged two-dimensionally.
  • the second pixel array 12 is provided in the semiconductor substrate 66 at a position overlapping the first light-receiving pixels 13 in the thickness direction of the semiconductor substrate 66; in it, second light-receiving pixels 14 that have a light-receiving sensitivity different from that of the first light-receiving pixels 13 and that photoelectrically convert the light in the first wavelength region that has passed through the first pixel array 11 are arranged two-dimensionally.
  • the control unit 40 drives and controls the second pixel array 12 based on the signals photoelectrically converted by the first pixel array 11 . Thereby, the imaging device 1 can capture a high-sensitivity image and a low-sensitivity image with one pixel array 10 .
  • the imaging device 1 has a semiconductor substrate 66 , a first pixel array 11 , a second pixel array 12 and a control section 40 .
  • the first pixel array 11 is provided on the semiconductor substrate 66; in it, first light-receiving pixels 13, each having a laminated structure in which a first electrode 77, a photoelectric conversion layer 79, and a second electrode 80 are laminated in order, photoelectrically convert light in a first wavelength region including a visible light region and are arranged two-dimensionally.
  • the second pixel array 12 is provided on the same plane as the first pixel array 11, and the second light receiving pixels 14 that photoelectrically convert light in the second wavelength region including the infrared light region are arranged two-dimensionally.
  • the control unit 40 drives and controls the second pixel array 12 based on the signals photoelectrically converted by the first pixel array 11 .
  • the imaging device 1 can effectively use the information of the visible light image even when the first pixel array 11 and the second pixel array 12 are arranged on the same plane.
  • the ranging system 100a includes a light source 101 and an imaging device 1a.
  • a light source 101 emits infrared light.
  • the imaging device 1a captures an image of a subject irradiated with infrared light, and measures the distance to the subject based on the captured image.
  • the imaging device 1a has a semiconductor substrate 66, a first pixel array 11, a second pixel array 12, and a controller 40a.
  • the first pixel array 11 is provided on the semiconductor substrate 66; in it, first light-receiving pixels 13, each having a laminated structure in which a first electrode 77, a photoelectric conversion layer 79, and a second electrode 80 are laminated in order, photoelectrically convert light in a first wavelength region including a visible light region and are arranged two-dimensionally.
  • the second pixel array 12 is provided in the semiconductor substrate 66 at a position overlapping the first light-receiving pixels 13 in the thickness direction of the semiconductor substrate 66; in it, second light-receiving pixels 14 that photoelectrically convert light in a second wavelength region including an infrared light region are arranged two-dimensionally.
  • the control unit 40a drives and controls the second pixel array 12 and the light source 101 based on the signals photoelectrically converted by the first pixel array 11, and measures the distance to the subject based on the signals photoelectrically converted by the second light-receiving pixels 14.
  • the distance measurement system 100a can improve the distance measurement accuracy by effectively using the information of the visible light image and appropriately driving and controlling the light source 101 .
  • the present technology can also take the following configuration.
  • a semiconductor substrate A first light-receiving pixel that is provided on the semiconductor substrate, has a laminated structure in which a first electrode, a photoelectric conversion layer, and a second electrode are laminated in order, and photoelectrically converts light in a first wavelength region including a visible light region.
  • a first pixel array in which are arranged two-dimensionally;
  • a second light-receiving pixel provided in the semiconductor substrate at a position overlapping with the first light-receiving pixel in the thickness direction of the semiconductor substrate and photoelectrically converting light in a second wavelength region including an infrared light region two-dimensionally.
  • a second array of pixels a second array of pixels; and a controller that drives and controls the second pixel array based on signals photoelectrically converted by the first pixel array.
  • a data processing unit that detects the first light-receiving pixels having a change from the image data captured by the first pixel array
  • the control unit The imaging device according to (1), wherein a signal photoelectrically converted by the second light-receiving pixel corresponding to the changed first light-receiving pixel is output.
  • the control unit The imaging device according to (2), wherein among all signals read from the second pixel array, signals read from the second light-receiving pixels corresponding to the changed first light-receiving pixels are output.
  • the signals photoelectrically converted by the second pixel array are read out and output.
  • a first light-receiving pixel that is provided on the semiconductor substrate has a laminated structure in which a first electrode, a photoelectric conversion layer, and a second electrode are laminated in order, and photoelectrically converts light in a first wavelength region including a visible light region.
  • a first pixel array in which are arranged two-dimensionally; provided in the semiconductor substrate at a position overlapping with the first light-receiving pixels in the thickness direction of the semiconductor substrate; a second pixel array in which second light-receiving pixels that photoelectrically convert light in the area are arranged two-dimensionally; and a controller that drives and controls the second pixel array based on signals photoelectrically converted by the first pixel array.
  • a first light-receiving pixel that is provided on the semiconductor substrate has a laminated structure in which a first electrode, a photoelectric conversion layer, and a second electrode are laminated in order, and photoelectrically converts light in a first wavelength region including a visible light region.
  • a first pixel array in which are arranged two-dimensionally;
  • a second pixel array provided on the same plane as the first pixel array, in which second light-receiving pixels photoelectrically converting light in a second wavelength region including an infrared light region are arranged two-dimensionally; and a controller that drives and controls the second pixel array based on signals photoelectrically converted by the first pixel array.
  • A ranging system comprising: a light source that emits infrared light; and
  • an imaging device that captures an image of a subject irradiated with the infrared light and measures the distance to the subject on the basis of the captured image,
  • wherein the imaging device includes: a semiconductor substrate;
  • a first pixel array in which first light-receiving pixels are arranged two-dimensionally, each first light-receiving pixel being provided on the semiconductor substrate, having a laminated structure in which a first electrode, a photoelectric conversion layer, and a second electrode are laminated in this order, and photoelectrically converting light in a first wavelength region including a visible light region;
  • a second pixel array in which second light-receiving pixels that photoelectrically convert light in a second wavelength region including an infrared light region are arranged two-dimensionally; and a controller that drives and controls the second pixel array on the basis of signals photoelectrically converted by the first pixel array.
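The claims above share one control relationship: a controller drives the second (infrared) pixel array on the basis of signals photoelectrically converted by the first (visible-light) pixel array. The sketch below illustrates that relationship only in software terms; the row-wise policy, function names, and threshold are hypothetical assumptions for illustration, not details taken from the patent.

```python
def rows_to_drive(visible_frame, threshold=128):
    """Pick the IR-array rows whose corresponding visible-light rows
    carry an average signal above the threshold."""
    selected = []
    for row_index, row in enumerate(visible_frame):
        if sum(row) / len(row) > threshold:
            selected.append(row_index)
    return selected


def drive_ir_array(visible_frame, read_ir_row, threshold=128):
    """Read out the second (IR) pixel array only at rows flagged by the
    first (visible) pixel array, as the claimed controller does in
    hardware; `read_ir_row` stands in for the IR row-readout path."""
    return {r: read_ir_row(r) for r in rows_to_drive(visible_frame, threshold)}
```

For example, with a stub readout function, only the visible rows bright enough to clear the threshold trigger an IR readout of the matching row.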

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • Spectroscopy & Molecular Physics (AREA)
  • Transforming Light Signals Into Electric Signals (AREA)
  • Solid State Image Pick-Up Elements (AREA)
PCT/JP2022/007398 2021-03-12 2022-02-22 Imaging device and ranging system Ceased WO2022190867A1 (ja)

Priority Applications (6)

Application Number Priority Date Filing Date Title
EP22766828.2A EP4307378A4 (en) 2021-03-12 2022-02-22 Imaging device and distance measurement system
DE112022001551.9T DE112022001551T5 (de) 2021-03-12 2022-02-22 Imaging device and distance measuring system
US18/546,684 US12237348B2 (en) 2021-03-12 2022-02-22 Imaging device and ranging system
JP2023505277A JPWO2022190867A1 (ja) 2021-03-12 2022-02-22 Imaging device and ranging system
CN202280019409.0A CN116941040A (zh) 2021-03-12 2022-02-22 Imaging device and ranging system
KR1020237029303A KR20230156694A (ko) 2021-03-12 2022-02-22 Imaging device and ranging system

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2021040358 2021-03-12
JP2021-040358 2021-03-12

Publications (1)

Publication Number Publication Date
WO2022190867A1 (ja) 2022-09-15

Family

ID=83227724

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/007398 Ceased WO2022190867A1 (ja) 2021-03-12 2022-02-22 Imaging device and ranging system

Country Status (8)

Country Link
US (1) US12237348B2 (en)
EP (1) EP4307378A4 (en)
JP (1) JPWO2022190867A1 (ja)
KR (1) KR20230156694A (ko)
CN (1) CN116941040A (zh)
DE (1) DE112022001551T5 (de)
TW (1) TW202245464A (zh)
WO (1) WO2022190867A1 (ja)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2025094517A1 (ja) * 2023-10-30 2025-05-08 Sony Semiconductor Solutions Corporation Information processing device, imaging device, and program

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2015050331A * 2013-09-02 2015-03-16 Sony Corporation Solid-state imaging element, method of manufacturing the same, and electronic device
JP2016005189A * 2014-06-18 2016-01-12 Olympus Corporation Imaging element and imaging device
JP2018136123A * 2015-06-24 2018-08-30 Murata Manufacturing Co., Ltd. Distance sensor and user interface device
JP2018142838A 2017-02-27 2018-09-13 Japan Broadcasting Corporation (NHK) Imaging element, imaging device, and photographing device
JP2021022875A * 2019-07-29 2021-02-18 Canon Inc. Photoelectric conversion device, photoelectric conversion system, and moving body

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20130100524A 2012-03-02 2013-09-11 Samsung Electronics Co., Ltd. Method of driving a three-dimensional image sensor
CN105164610B (zh) * 2013-04-30 2018-05-25 Hewlett-Packard Development Company, L.P. Depth sensor
US10863098B2 (en) * 2013-06-20 2020-12-08 Microsoft Technology Licensing, LLC Multimodal image sensing for region of interest capture
US10593055B2 (en) * 2018-03-23 2020-03-17 Capsovision Inc Method and apparatus for capturing images and associated 3D model based on a single image sensor and structured-light patterns in the visible spectrum
JP2020170966A (ja) * 2019-04-04 2020-10-15 Canon Inc. Imaging device and method of controlling imaging device
KR20250004388A (ko) * 2019-06-21 2025-01-07 Sony Semiconductor Solutions Corporation Photoelectric conversion element, photodetection device, photodetection system, electronic apparatus, and moving body
US12185018B2 (en) * 2019-06-28 2024-12-31 Apple Inc. Stacked electromagnetic radiation sensors for visible image sensing and infrared depth sensing, or for visible image sensing and infrared image sensing

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP4307378A4

Also Published As

Publication number Publication date
TW202245464A (zh) 2022-11-16
KR20230156694A (ko) 2023-11-14
EP4307378A1 (en) 2024-01-17
DE112022001551T5 (de) 2024-01-18
US12237348B2 (en) 2025-02-25
US20240145496A1 (en) 2024-05-02
CN116941040A (zh) 2023-10-24
EP4307378A4 (en) 2024-08-14
JPWO2022190867A1 (ja) 2022-09-15

Similar Documents

Publication Publication Date Title
US10903279B2 (en) Solid state image sensor pixel electrode below a photoelectric conversion film
US20220359587A1 (en) Solid-state imaging device and electronic apparatus
CN102201419B (zh) 固体摄像元件、其制造方法和电子装置
CN102804754B (zh) 固体摄像装置和相机
US20060181629A1 (en) Photoelectric conversion layer-stacked solid-state imaging element
US11594568B2 (en) Image sensor and electronic device
US10536659B2 (en) Solid-state image capturing element, manufacturing method therefor, and electronic device
WO2016104177A1 (ja) 固体撮像素子およびその製造方法、並びに電子機器
US20130063631A1 (en) Solid-state imaging apparatus and camera
WO2018105334A1 (ja) 固体撮像素子及び電子機器
US20250259952A1 (en) Semiconductor element, apparatus, and chip
US10804303B2 (en) Image sensors comprising an organic photo-detector, a photo-detector array and dual floating diffusion nodes and electronic devices including the same
WO2015198876A1 (ja) 撮像素子、電子機器
US20090294815A1 (en) Solid state imaging device including a semiconductor substrate on which a plurality of pixel cells have been formed
US20240192054A1 (en) Light detection apparatus and electronic device
WO2022190867A1 (ja) 撮像装置および測距システム
US20240178254A1 (en) Light-receiving element and electronic apparatus
US12336312B2 (en) Light receiving element and electronic apparatus
US20240162271A1 (en) Image sensor arrangement, image sensor device and method for operating an image sensor arrangement

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22766828

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2023505277

Country of ref document: JP

Kind code of ref document: A

WWE Wipo information: entry into national phase

Ref document number: 18546684

Country of ref document: US

WWE Wipo information: entry into national phase

Ref document number: 202280019409.0

Country of ref document: CN

WWE Wipo information: entry into national phase

Ref document number: 112022001551

Country of ref document: DE

Ref document number: 2022766828

Country of ref document: EP

ENP Entry into the national phase

Ref document number: 2022766828

Country of ref document: EP

Effective date: 20231012

WWG Wipo information: grant in national office

Ref document number: 18546684

Country of ref document: US