WO2022190867A1 - Imaging device and ranging system - Google Patents
Imaging device and ranging system
- Publication number
- WO2022190867A1 (application PCT/JP2022/007398)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- light
- pixel array
- pixel
- semiconductor substrate
- imaging device
Classifications
- H—ELECTRICITY
- H01—ELECTRIC ELEMENTS
- H01L—SEMICONDUCTOR DEVICES NOT COVERED BY CLASS H10
- H01L27/00—Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate
- H01L27/14—Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate including semiconductor components sensitive to infrared radiation, light, electromagnetic radiation of shorter wavelength or corpuscular radiation and specially adapted either for the conversion of the energy of such radiation into electrical energy or for the control of electrical energy by such radiation
- H01L27/144—Devices controlled by radiation
- H01L27/146—Imager structures
- H01L27/14601—Structural or functional details thereof
- H01L27/14603—Special geometry or disposition of pixel-elements, address-lines or gate-electrodes
- H01L27/14605—Structural or functional details relating to the position of the pixel elements, e.g. smaller pixel elements in the center of the imager compared to pixel elements at the periphery
- H01L27/14609—Pixel-elements with integrated switching, control, storage or amplification elements
- H01L27/1462—Coatings
- H01L27/14621—Colour filter arrangements
- H01L27/14643—Photodiode arrays; MOS imagers
- H01L27/14645—Colour imagers
- H01L27/14649—Infrared imagers
- H01L27/14665—Imagers using a photoconductor layer
- H01L27/14669—Infrared imagers
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N25/00—Circuitry of solid-state image sensors [SSIS]; Control thereof
- H04N25/10—Circuitry of solid-state image sensors [SSIS]; Control thereof for transforming different wavelengths into image signals
- H04N25/11—Arrangement of colour filter arrays [CFA]; Filter mosaics
- H04N25/13—Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements
- H04N25/131—Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements including elements passing infrared wavelengths
- H04N25/17—Colour separation based on photon absorption depth, e.g. full colour resolution obtained simultaneously at each pixel location
- H04N25/70—SSIS architectures; Circuits associated therewith
- H04N25/703—SSIS architectures incorporating pixels for producing signals other than image signals
- H04N25/705—Pixels for depth measurement, e.g. RGBZ
- H04N25/707—Pixels for event detection
- H04N25/76—Addressed sensors, e.g. MOS or CMOS sensors
- H04N25/77—Pixel circuitry, e.g. memories, A/D converters, pixel amplifiers, shared circuits or shared components
Definitions
- The present disclosure relates to imaging devices and ranging systems.
- There is an imaging device in which a pixel array that captures a visible light image and a pixel array that captures an infrared light image are stacked.
- Among such imaging devices, there is one that performs imaging control of the visible light image using information from the infrared light image (see, for example, Patent Document 1).
- The present disclosure proposes an imaging device and a ranging system that can effectively use visible light image information.
- An imaging device according to the present disclosure has a semiconductor substrate, a first pixel array, a second pixel array, and a control unit.
- The first pixel array is provided on the semiconductor substrate, has a laminated structure in which a first electrode, a photoelectric conversion layer, and a second electrode are laminated in this order, and first light-receiving pixels that photoelectrically convert light in a first wavelength region including the visible light region are arranged two-dimensionally in it.
- The second pixel array is provided in the semiconductor substrate at a position overlapping the first light-receiving pixels in the thickness direction of the semiconductor substrate, and second light-receiving pixels that photoelectrically convert light in a second wavelength region including the infrared light region are arranged two-dimensionally in it.
- The control unit drives and controls the second pixel array based on signals photoelectrically converted by the first pixel array.
- FIG. 1 is an explanatory diagram showing a configuration example of a ranging system according to the present disclosure
- FIG. 2 is an explanatory diagram showing a configuration example of an imaging device according to the present disclosure
- FIG. 2 is a cross-sectional explanatory diagram of a pixel array according to the present disclosure
- 1 is a circuit diagram showing an example of a first readout circuit according to the present disclosure
- FIG. 4 is an explanatory diagram showing an example arrangement of pixels in a plan view according to the present disclosure
- FIG. 4 is an explanatory diagram showing an example arrangement of pixels in a plan view according to the present disclosure
- FIG. 4 is an explanatory diagram showing an example arrangement of pixels in a plan view according to the present disclosure
- FIG. 4 is an explanatory diagram showing an example arrangement of pixels in a plan view according to the present disclosure
- FIG. 4 is an explanatory diagram showing an example arrangement of pixels in a plan view according to the present disclosure
- FIG. 4 is an explanatory diagram showing an example arrangement of pixels in a plan view according to the present disclosure
- FIG. 4 is an explanatory diagram showing an example arrangement of pixels in a plan view according to the present disclosure
- FIG. 4 is an explanatory diagram showing an operation example of an imaging device according to the present disclosure
- 4 is a flowchart showing an example of processing executed by an imaging device according to the present disclosure
- 7 is a flowchart showing a modified example of processing executed by an imaging device according to the present disclosure
- FIG. 11 is an explanatory diagram showing a configuration example of a ranging system according to a modification of the present disclosure
- FIG. 11 is a flowchart showing an example of processing executed by an imaging device according to a modification of the present disclosure
- FIG. 1 is an explanatory diagram showing a configuration example of a ranging system according to the present disclosure.
- The distance measurement system 100 shown in FIG. 1 captures an infrared light image (hereinafter sometimes referred to as a "distance image") of a subject irradiated with infrared light, and measures the distance to the subject based on the light flight time (Time Of Flight, TOF); it realizes a d(direct)ToF sensor.
- The ranging system 100 may instead be a system that realizes i(indirect)ToF.
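As a hedged illustration (not part of the claimed disclosure), the dToF principle above reduces to a single relation: the measured distance is half the round-trip path that light travels during the flight time.

```python
# Illustrative sketch of the dToF relation; not taken from the patent itself.
C = 299_792_458.0  # speed of light in m/s


def dtof_distance(round_trip_time_s: float) -> float:
    """Distance to the subject: half the round-trip path of the light pulse."""
    return C * round_trip_time_s / 2.0
```

For example, a 10 ns round trip corresponds to roughly 1.5 m.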
- the ranging system 100 includes a light source 101, an imaging optical system 102, an imaging device 1, an ISP (Image Signal Processor) 103, an input device 104, a display device 105, and a data storage unit 106.
- The light source 101 is an infrared laser that emits infrared light toward the subject whose distance is to be measured.
- The light source 101 is driven and controlled by the ISP 103, and emits infrared light while blinking at high speed at a predetermined frequency.
- The imaging optical system 102 includes a lens or the like that forms an image of the infrared light reflected by the subject on the light receiving unit of the imaging device 1.
- The imaging device 1 is a device that captures a distance image, which is an infrared light image of the subject, and outputs image data of the distance image to the ISP 103.
- The imaging device 1 can capture not only a distance image but also a visible light image. When capturing a visible light image, the imaging device 1 outputs image data of the visible light image to the ISP 103 as necessary.
- The ISP 103 drives and controls the light source 101, measures the distance to the subject based on the light flight time of the infrared light using the image data of the distance image input from the imaging device 1 and the phase shift information of the received light, and outputs the measurement result and the distance image to the display device 105 and the data storage unit 106. Further, when image data of a visible light image is input from the imaging device 1, the ISP 103 outputs it to the display device 105 and the data storage unit 106 as necessary. Furthermore, the ISP 103 can output colored three-dimensional information obtained by synthesizing the distance information and RGB information to the display device 105 and the data storage unit 106 as needed.
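The "colored three-dimensional information" mentioned above can be pictured as an RGB-D back-projection. The following sketch assumes a simple pinhole camera model with intrinsics fx, fy, cx, cy and a 1:1 alignment of depth and RGB pixels; all names are illustrative and not taken from the disclosure.

```python
# Hypothetical sketch: combine a depth map and an RGB image into colored
# 3D points via pinhole back-projection. Intrinsics are assumed, not given.
def depth_to_colored_points(depth, rgb, fx, fy, cx, cy):
    points = []
    for v, row in enumerate(depth):
        for u, z in enumerate(row):
            if z <= 0:                 # skip pixels with no valid distance
                continue
            x = (u - cx) * z / fx      # back-project along the pixel ray
            y = (v - cy) * z / fy
            points.append((x, y, z, rgb[v][u]))  # attach the RGB value
    return points
```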
- The display device 105 is, for example, a liquid crystal display, and displays the distance measurement result measured by the ISP 103, the infrared light image, or the visible light image.
- The data storage unit 106 is, for example, a memory, and stores the distance measurement result measured by the ISP 103, the image data of the infrared light image, or the image data of the visible light image.
- The input device 104 receives user operations for performing various settings of the ranging system 100 and user operations for causing the ranging system 100 to perform ranging, and outputs signals corresponding to the user operations to the ISP 103.
- The imaging device 1 also includes the pixel array 10, an AD (Analog to Digital) conversion unit 20, and a data processing unit 30. Note that FIG. 1 shows part of the components of the imaging device 1. A specific configuration example of the imaging device 1 will be described later with reference to FIG. 2.
- The pixel array 10 photoelectrically converts the incident light from the imaging optical system 102 into distance image pixel signals and visible light image pixel signals, and outputs the pixel signals to the AD conversion unit 20.
- The AD conversion unit 20 AD-converts the pixel signals of the distance image and the visible light image and outputs them to the data processing unit 30.
- The data processing unit 30 performs various types of image processing and image analysis on the image data of the distance image and the visible light image, and outputs the image data after image processing and image analysis to the ISP 103.
- When the distance measuring system 100 is an iToF sensor, for example, it is necessary to calculate one distance image (distance data) from four (times of) phase data.
- The phase data for the four times are once stored in a memory and then combined by calculation to create one distance image (distance data).
- In addition to the data storage unit 106 (for example, flash memory), the ranging system 100 therefore uses a memory unit (for example, SRAM (Static Random Access Memory)) for temporarily storing the phase data.
- SRAM is a memory in which information can be written and read at high speed, so it is suitable for distance calculation processing that must be performed at high speed. Such a memory can hold information only while the power is on.
- In the case of an iToF sensor, the ranging system 100 uses the memory built into the ISP 103 for temporary storage of the phase data.
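The four-phase calculation referred to above is commonly expressed as a phase estimate from four correlation samples taken at 0°, 90°, 180°, and 270° of the modulation period. The sketch below shows that standard formulation; the exact processing of the disclosure may differ, and the modulation frequency is an assumed parameter.

```python
import math

# Sketch of the standard four-phase iToF distance calculation (assumed form,
# not taken verbatim from the patent).
C = 299_792_458.0  # speed of light in m/s


def itof_distance(q0, q90, q180, q270, mod_freq_hz):
    # Phase shift of the returned light relative to the emitted modulation.
    phase = math.atan2(q90 - q270, q0 - q180)
    if phase < 0:
        phase += 2 * math.pi  # keep the phase in [0, 2*pi)
    # One full phase cycle corresponds to half a modulation wavelength.
    return C * phase / (4 * math.pi * mod_freq_hz)
```

Note the ambiguity inherent in this formula: distances beyond half the modulation wavelength wrap around, which is one reason multiple phase captures are buffered before the distance is finalized.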
- Here, the distance measurement system 100 needs to operate the light source 101 and the pixel array 10 that captures the distance image at high frequencies, and therefore consumes more power than a general visible light sensor. The ranging system 100 thus effectively utilizes visible light image data in the imaging device 1 to reduce power consumption.
- the imaging device 1 first captures a visible light image that consumes relatively little power for imaging, and does not capture a range image until a moving object is detected in the visible light image. After that, when the imaging device 1 detects a moving object in the visible light image, the imaging device 1 causes the light source 101 to emit light, captures a distance image that consumes relatively large power for imaging, and performs distance measurement. Thereby, the ranging system 100 can appropriately reduce power consumption.
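As a minimal sketch of the power-saving control flow described above (illustrative only; function names and the threshold are assumptions, not from the disclosure), the cheap visible-light capture runs continuously, and the light source and distance capture are enabled only when motion is detected.

```python
# Hypothetical sketch: gate the power-hungry distance capture on motion
# detected in the low-power visible-light image.
def any_pixel_changed(prev_frame, cur_frame, threshold):
    """True if any pixel differs from the previous frame by more than threshold."""
    return any(
        abs(cur - prev) > threshold
        for prev_row, cur_row in zip(prev_frame, cur_frame)
        for prev, cur in zip(prev_row, cur_row)
    )


def control_step(prev_frame, cur_frame, threshold=10):
    if any_pixel_changed(prev_frame, cur_frame, threshold):
        return "capture_distance_image"  # drive light source + second pixel array
    return "visible_only"                # stay in low-power visible capture
```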
- FIG. 2 is an explanatory diagram showing a configuration example of an imaging device according to the present disclosure.
- The imaging device 1 includes the pixel array 10, a global control circuit 41, a first row control circuit 42, a second row control circuit 43, a first readout circuit 21, a first data processing section 22, a second readout circuit 23, and a second data processing section 24.
- Global control circuit 41, first row control circuit 42, and second row control circuit 43 are included in control unit 40 shown in FIG.
- the pixel array 10 includes a first pixel array 11 and a second pixel array 12 .
- In the first pixel array 11, a plurality of first light-receiving pixels (hereinafter referred to as "first pixels") 13 corresponding to the pixels of the visible light image are arranged two-dimensionally (in a matrix).
- In the second pixel array 12, a plurality of second light-receiving pixels (hereinafter referred to as "second pixels") 14 corresponding to the pixels of the infrared light image are arranged two-dimensionally (in a matrix).
- the first pixel 13 photoelectrically converts light in the first wavelength range including the visible light range into signal charges according to the amount of received light.
- the first pixel 13 photoelectrically converts red light, green light, and blue light.
- the second pixel 14 photoelectrically converts light in the second wavelength range including the infrared light range.
- the second pixels 14 photoelectrically convert infrared light into signal charges corresponding to the amount of received light.
- the first pixel array 11 is laminated on the second pixel array 12 .
- the first pixel array 11 photoelectrically converts incident visible light to capture a visible light image.
- the second pixel array 12 photoelectrically converts the infrared light transmitted through the first pixel array to capture an infrared light image.
- the imaging device 1 can capture a visible light image and an infrared light image with one pixel array 10 .
- the global control circuit 41 controls the first row control circuit 42 and the ranging pulse generator 50 based on the control signal from the ISP 103 .
- the first row control circuit 42 drives and controls the first pixel array 11 to capture a visible light image.
- the first pixel array 11 photoelectrically converts visible light into signal charges and accumulates them.
- the first readout circuit 21 reads out a signal corresponding to the signal charge from the first pixel array 11 and outputs it to the first data processing section 22 .
- the first data processing unit 22 AD-converts the signal input from the first reading circuit 21 to acquire image data of a visible light image, and performs predetermined image processing and image analysis on the image data.
- the first data processing unit 22 outputs image data after image processing and image analysis to the ISP 103 as necessary.
- the first data processing unit 22 outputs the result of image analysis to the second row control circuit 43 to operate the second row control circuit 43 .
- In the image analysis, the first data processing unit 22 determines, for example, whether or not the subject in the visible light image is moving, and operates the second row control circuit 43 only when the subject is moving.
- the first data processing unit 22 detects the changed first pixels 13 from the image data captured by the first pixel array 11 .
- The first pixel 13 having a change here means, for example, a first pixel 13 in which the difference between the signal charge amount accumulated in the image data of the previous frame and that accumulated in the image data of the current frame exceeds a threshold value.
- the second row control circuit 43 starts driving the second pixel array 12 when the first data processing unit 22 detects the changed first pixels 13 . Thereby, the imaging device 1 can appropriately reduce the power consumption required for driving the second pixel array 12 .
- the first data processing unit 22 may be configured to control the second readout circuit 23 so as to read out signals only from the second pixels 14 corresponding to the changing first pixels 13 . Thereby, the imaging device 1 can also reduce the power consumption of the second readout circuit 23 .
- the first data processing unit 22 can cause the second row control circuit 43 to drive only the second pixels 14 corresponding to the first pixels 13 having a change in the first pixel array 11 .
- the imaging device 1 can further reduce power consumption required for driving the second pixel array 12 .
- The first data processing unit 22 may also set a threshold for the amount of change in the first pixel array 11, instead of the binary decision of whether the subject is moving, and cause the second row control circuit 43 to drive only the second pixels 14 corresponding to the first pixels 13 whose amount of change due to the movement of the subject exceeds the threshold. Thereby, the imaging device 1 can minimize the number of second pixels 14 to be driven.
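The row-selective driving just described can be sketched as follows. This is a hedged illustration assuming the first and second pixel arrays overlap 1:1 in a plan view and that driving is granted per row, as the row control circuits suggest; all names and the threshold are illustrative.

```python
# Hypothetical sketch: pick only the rows of the second (infrared) pixel
# array whose corresponding visible-light rows changed beyond a threshold.
def rows_to_drive(prev_frame, cur_frame, threshold):
    rows = []
    for r, (prev_row, cur_row) in enumerate(zip(prev_frame, cur_frame)):
        change = sum(abs(c - p) for p, c in zip(prev_row, cur_row))
        if change > threshold:  # only rows with enough motion are driven
            rows.append(r)
    return rows
```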
- the first data processing section 22 may be configured so that the operating frequency of the second row control circuit 43 can be changed according to changes in the first pixel array 11 .
- the first data processing unit 22 increases the operating frequency of the second row control circuit 43 when the change in the first pixel array 11 (moving speed of the object) is large.
- the imaging device 1 can capture a clear image of a subject moving at a high speed using the second pixel array.
- the second row control circuit 43 drives and controls the second pixel array 12 to capture an infrared light image.
- the second pixel array 12 photoelectrically converts infrared light into signal charges and accumulates them.
- the second readout circuit 23 reads a signal corresponding to the signal charge from the second pixel array 12 and outputs it to the second data processing section 24 .
- The second data processing unit 24 AD-converts the signal input from the second readout circuit 23 to obtain image data of a distance image, which is an infrared light image, performs predetermined image processing on the image data, and outputs it to the ISP 103.
- The range of the second pixels 14 to be driven may be determined by the control unit 40 (the second row control circuit 43), by a dedicated determination circuit, or by the ISP 103.
- the second row control circuit 43, determination circuit, or ISP 103 acquires the image data of the visible light image from the first data processing unit 22, and detects the first pixels 13 with changes from the image data. Then, the second row control circuit 43, the determination circuit, or the ISP 103 determines the range of the second pixels 14 arranged at the position overlapping the detected first pixels 13 as the range of the second pixels 14 to be driven.
- When the distance measurement system 100 cannot calculate the distance from a single distance image, it is also possible to calculate the distance to the subject from a plurality of distance images stored internally.
- FIG. 3 is a cross-sectional explanatory diagram of a pixel array according to the present disclosure.
- FIG. 3 schematically shows a cross section of the pixel array 10 corresponding to one second pixel 14 .
- the positive direction of the Z-axis in the orthogonal coordinate system shown in FIG. 3 is referred to as "up”
- the negative direction of the Z-axis is referred to as "down” for convenience.
- the pixel array 10 includes a so-called longitudinal spectral imaging element having a structure in which the second pixels 14 and the first pixels 13 are stacked in the Z-axis direction, which is the thickness direction.
- the pixel array 10 includes an intermediate layer 60 provided between the second pixels 14 and the first pixels 13, and a multilayer wiring layer 61 provided on the opposite side of the first pixels 13 as viewed from the second pixels 14. Prepare.
- a sealing film 62 is arranged on the side opposite to the intermediate layer 60 when viewed from the first pixels 13 .
- the second pixel 14 is an indirect TOF (hereinafter referred to as "iTOF") sensor that acquires a distance image (distance information) by the TOF method.
- the second pixel 14 has a semiconductor substrate 66, a photoelectric conversion region 67, a fixed charge layer 68, a pair of gate electrodes 69A and 69B, floating diffusions 70A and 70B, and a through electrode 71.
- the semiconductor substrate 66 is, for example, an n-type silicon substrate, and has a P-well in a predetermined internal region.
- a lower surface of the semiconductor substrate 66 faces the multilayer wiring layer 61 .
- the upper surface of the semiconductor substrate 66 faces the intermediate layer 60 and has a fine uneven structure. As a result, the upper surface of the semiconductor substrate 66 appropriately scatters the incident infrared light, thereby increasing the optical path length.
- the semiconductor substrate 66 may also have a fine uneven structure formed on the bottom surface.
- the photoelectric conversion region 67 is a photoelectric conversion element composed of a PIN (Positive Intrinsic Negative) type photodiode.
- the photoelectric conversion region 67 mainly receives light having wavelengths in the infrared region (infrared light) among the light incident on the pixel array 10, photoelectrically converts the light into signal charges according to the amount of received light, and accumulates the signal charges.
- the fixed charge layer 68 is provided so as to cover the top surface and side surfaces of the semiconductor substrate 66 .
- the fixed charge layer 68 contains negative fixed charges that suppress the generation of dark current due to the interface state of the upper surface of the semiconductor substrate 66 that is the light receiving surface.
- the fixed charge layer 68 forms a hole accumulation layer in the vicinity of the upper surface of the semiconductor substrate 66, and suppresses generation of electrons from the upper surface of the semiconductor substrate 66 by the hole accumulation layer.
- the fixed charge layer 68 also extends between the inter-pixel area light shielding wall 72 and the photoelectric conversion area 67 .
- the gate electrodes 69A and 69B extend from the lower surface of the semiconductor substrate 66 to a position reaching the photoelectric conversion region 67.
- the gate electrodes 69A, 69B transfer the signal charge accumulated in the photoelectric conversion region 67 to the floating diffusions 70A, 70B when a predetermined voltage is applied.
- the floating diffusions 70A and 70B are floating diffusion regions that temporarily hold the signal charges transferred from the photoelectric conversion region 67.
- the signal charges held in the floating diffusions 70A and 70B are read out as pixel signals by the second readout circuit 23 (see FIG. 2).
- a wiring 74 is provided inside an insulating layer 73 in the multilayer wiring layer 61 .
- the insulating layer 73 is made of, for example, silicon oxide.
- the wiring 74 is made of metal such as copper or gold, for example.
- a first readout circuit 21 and a second readout circuit 23 are also provided inside the insulating layer 73 .
- the intermediate layer 60 has an optical filter 75 embedded in the insulating layer 73 and an inter-pixel area light shielding film 76 .
- the optical filter 75 is made of, for example, an organic material, and mainly selectively transmits light with wavelengths in the infrared region.
- the inter-pixel region light shielding film 76 reduces color mixture between adjacent pixels.
- the first pixel 13 has a first electrode 77, a semiconductor layer 78, a photoelectric conversion layer 79, and a second electrode 80, which are stacked in order from a position closer to the photoelectric conversion region 67. Furthermore, the first pixel 13 has a charge storage electrode 82 provided below the semiconductor layer 78 so as to face the semiconductor layer 78 with the insulating film 81 interposed therebetween.
- the charge storage electrode 82 and the first electrode 77 are separated from each other, and are provided on the same layer, for example.
- the first electrode 77 is connected to the upper end of the through electrode 71, for example.
- the first electrode 77, the second electrode 80, and the charge storage electrode 82 are formed of, for example, a light-transmitting conductive film such as ITO (indium tin oxide) or IZO (indium zinc oxide).
- the photoelectric conversion layer 79 converts light energy into electrical energy, and is formed of, for example, two or more kinds of organic materials that function as a p-type semiconductor and an n-type semiconductor.
- a p-type semiconductor functions as an electron donor.
- the n-type semiconductor functions as an electron acceptor.
- the photoelectric conversion layer 79 has a bulk heterojunction structure.
- a bulk heterojunction structure is a p/n junction formed by mixing a p-type semiconductor and an n-type semiconductor.
- the photoelectric conversion layer 79 separates incident light into electrons and holes at the p/n junction.
- the charge storage electrode 82 forms a capacitor together with the insulating film 81 and the semiconductor layer 78, and accumulates the signal charge generated in the photoelectric conversion layer 79 in the region of the semiconductor layer 78 facing the charge storage electrode 82 via the insulating film 81.
- the charge storage electrode 82 is provided at a position corresponding to each color filter 63 and on-chip lens 65 .
- excitons generated by light absorbed in the photoelectric conversion layer 79 move to the interface between the electron donor and the electron acceptor that constitute the photoelectric conversion layer 79 and are separated into electrons and holes.
- the electrons and holes generated here move to the second electrode 80 or the semiconductor layer 78 and are accumulated due to the difference in carrier concentration and the internal electric field due to the potential difference between the first electrode 77 and the second electrode 80 .
- the first electrode 77 is set at a positive potential and the second electrode 80 is set at a negative potential.
- holes generated in the photoelectric conversion layer 79 move to the second electrode 80 side. Electrons generated in the photoelectric conversion layer 79 are attracted to the charge storage electrode 82 and stored in the region of the semiconductor layer 78 corresponding to the charge storage electrode 82 via the insulating film 81 .
- the electrons accumulated in the region of the semiconductor layer 78 corresponding to the charge storage electrode 82 via the insulating film 81 are read as follows.
- the potential V1 is applied to the first electrode 77 and the potential V2 is applied to the charge storage electrode 82 .
- the potential V1 is set higher than the potential V2.
- the electrons accumulated in the region of the semiconductor layer 78 corresponding to the charge accumulation electrode 82 via the insulating film 81 are transferred to the first electrode 77 and read out.
- the semiconductor layer 78 is provided below the photoelectric conversion layer 79, and charges (e.g., electrons) are accumulated in regions of the semiconductor layer 78 corresponding to the charge accumulation electrodes 82 via the insulating film 81.
- compared to the case where charges (for example, electrons) are accumulated in the photoelectric conversion layer 79 without providing the semiconductor layer 78, this structure prevents recombination of holes and electrons during charge accumulation and increases the transfer efficiency of the accumulated charges (for example, electrons) to the first electrode 77.
- although reading of electrons has been described here, holes may be read instead. When reading holes, the relationship between the potentials V1 and V2 described above is reversed.
- FIG. 4 is a circuit diagram illustrating an example of a first readout circuit according to the present disclosure.
- the first readout circuit 21 has, for example, a floating diffusion FD, a reset transistor RST, an amplification transistor AMP, and a selection transistor SEL.
- a floating diffusion FD is connected between the first electrode 77 and the amplification transistor AMP.
- the floating diffusion FD converts the signal charge transferred from the first electrode 77 into a voltage signal and outputs the voltage signal to the amplification transistor AMP.
- the reset transistor RST is connected between the floating diffusion FD and the power supply.
- when a drive signal is applied to the gate electrode of the reset transistor RST and the reset transistor RST turns on, the potential of the floating diffusion FD is reset to the power supply level.
- the amplification transistor AMP has a gate electrode connected to the floating diffusion FD and a drain electrode connected to a power supply.
- a source electrode of the amplification transistor AMP is connected to the vertical signal line via the selection transistor SEL.
- the selection transistor SEL is connected between the source electrode of the amplification transistor AMP and the vertical signal line.
- when a drive signal is applied to the gate electrode of the selection transistor SEL and the selection transistor SEL turns on, the pixel signal output from the amplification transistor AMP is output to the AD conversion section 20 via the selection transistor SEL and the vertical signal line.
- the AD conversion section 20 AD-converts the pixel signal based on the control signal input from the control section 40 and outputs the result to the data processing section 30 .
- FIGS. 5 to 8 are explanatory diagrams showing layout examples of pixels according to the present disclosure in plan view.
- a plurality of on-chip lenses 65 are arranged two-dimensionally (in rows and columns).
- a plurality of color filters 63 are two-dimensionally (in rows and columns) arranged in a layer below the layer where the on-chip lenses 65 are arranged, as shown in FIG. 6.
- the color filters 63 include filters R selectively transmitting red light, filters G selectively transmitting green light, and filters B selectively transmitting blue light.
- Each color filter 63 (one filter R, G, B) is provided at a position corresponding to one on-chip lens 65 .
- the corresponding positions here are, for example, positions that overlap each other in the Z-axis direction.
- the filters R, G, and B are arranged, for example, according to a color arrangement method called Bayer arrangement. Note that the arrangement of the filters R, G, and B is not limited to the Bayer arrangement.
- in FIG. 7, each color filter 63 is indicated by a dashed line in order to clarify the positional relationship between each charge storage electrode 82 and each color filter 63.
- Each charge storage electrode 82 is provided at a position corresponding to one filter R, G, B, respectively. The corresponding positions here are, for example, positions that overlap each other in the Z-axis direction.
- the plurality of photoelectric conversion regions 67 in the second pixel array 12 are arranged two-dimensionally (in a matrix) in a layer below the layer in which the charge storage electrodes 82 in the first pixel array 11 are arranged. A plurality of through electrodes 71 are provided around each photoelectric conversion region 67.
- in FIG. 8, each color filter 63 is indicated by a broken line, and each on-chip lens 65 is indicated by a dotted line.
- Each photoelectric conversion region 67 is provided at a position corresponding to 16 on-chip lenses 65 and 16 color filters 63 arranged in a 4 ⁇ 4 matrix. The corresponding positions here are, for example, positions that overlap each other in the Z-axis direction.
- in this example, one photoelectric conversion region 67 is provided for 16 on-chip lenses 65 and 16 color filters 63, but this is just an example. The pixel array 10 may instead be provided with one photoelectric conversion region 67 for four on-chip lenses 65 and four color filters 63, or with one photoelectric conversion region 67 for one on-chip lens 65 and one color filter 63.
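Because one photoelectric conversion region 67 overlaps a whole block of on-chip lenses in the Z-axis direction, finding the second pixel 14 that "corresponds to" a given first pixel 13 reduces to integer division. A minimal sketch, assuming the 4×4 grouping of FIG. 8 and a hypothetical helper name:

```python
def ir_pixel_for(visible_x, visible_y, block=4):
    """Index of the photoelectric conversion region 67 that overlaps the
    given first pixel 13 in the Z-axis direction, assuming one region
    per block x block group of on-chip lenses (hypothetical helper)."""
    return visible_x // block, visible_y // block

# The visible pixel at column 5, row 10 overlaps IR pixel (1, 2)
pair = ir_pixel_for(5, 10)
```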
- FIG. 9 is an explanatory diagram showing an operation example of the imaging device according to the present disclosure.
- the imaging apparatus 1 first performs first pixel control by the control unit 40 (step S1), and drives the first pixel array 11 (step S2).
- the first pixel array 11 photoelectrically converts incident light into signal charges by the first pixels 13 and accumulates the signal charges.
- the imaging device 1 reads out pixel signals corresponding to the signal charges from the first pixel array 11 by the first readout circuit 21 (step S3), and AD-converts the pixel signals by the first data processing unit 22 .
- the first data processing unit 22 outputs the AD-converted pixel signal (image data of the visible light image) as necessary (step S4).
- the imaging device 1 evaluates the visible light image by the first data processing unit 22 (step S5).
- the first data processing unit 22 detects the first pixels 13 that have changed in the visible image.
- the first data processing unit 22 detects the first pixels 13 in which the difference between the signal charge amount accumulated in the image data of the previous frame and the signal charge amount accumulated in the image data of the current frame exceeds a threshold.
- the threshold is set by the ISP 103, for example (step S6).
- the first data processing unit 22 outputs the detection result of the changed first pixel 13 to the second row control circuit 43 and the ISP 103 .
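The change detection of steps S5 and S6 can be sketched as a per-pixel frame difference against a threshold; the function below is an illustrative, unoptimized rendering with hypothetical names, not the circuit actually disclosed.

```python
def changed_pixels(prev_frame, curr_frame, threshold):
    """Return (x, y) coordinates of first pixels 13 whose accumulated
    signal differs from the previous frame by more than the threshold
    (steps S5/S6); illustrative sketch only."""
    return [
        (x, y)
        for y, (prev_row, curr_row) in enumerate(zip(prev_frame, curr_frame))
        for x, (p, c) in enumerate(zip(prev_row, curr_row))
        if abs(c - p) > threshold
    ]

prev = [[10, 10], [10, 10]]   # previous visible-light frame
curr = [[10, 80], [12, 10]]   # current frame: one pixel changed strongly
detected = changed_pixels(prev, curr, threshold=20)
```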
- when the ISP 103 detects the changed first pixel 13, the ISP 103 performs sensor control (step S7) and drives the light source 101 (step S8). As a result, the light source 101 blinks at a high frequency to irradiate the subject 110 with infrared light. As described above, in the distance measuring system 100, the light source 101 emits light only after a changed first pixel 13 is detected, so the power consumption of the light source 101 can be reduced compared to the case where the light source 101 emits light all the time.
- alternatively, the ISP 103 can cause the light source 101 to emit light at a relatively low luminance before a changed first pixel 13 is detected, and raise the emission intensity of the light source 101 so that it emits light at a high luminance once a changed first pixel 13 is detected.
- the imaging device 1 performs second pixel control by the second row control circuit 43 (step S9), and drives the second pixel array 12 (step S10).
- the second pixel array 12 photoelectrically converts the incident infrared light transmitted through the first pixel array 11 into signal charges by the second pixels 14, and accumulates the signal charges.
- since the second row control circuit 43 drives the second pixel array 12 only after a changed first pixel 13 is detected, the power consumption of the second pixel array 12 can be reduced.
- the second row control circuit 43 can drive all the second pixels 14, but can also selectively drive the second pixels 14 corresponding to the first pixels 13 that have changed.
- the second row control circuit 43 can reduce power consumption by selectively driving the second pixels 14 corresponding to the changed first pixels 13 .
- when the amount of change in a changed first pixel 13 is large, for example, when the amount of change exceeds a threshold value, the second row control circuit 43 can make the drive frequency of the corresponding second pixels 14 higher than normal.
- the imaging device 1 reads pixel signals corresponding to the signal charges from the second pixel array 12 by the second readout circuit 23 (step S11), AD-converts the pixel signals by the second data processing unit 24, and outputs the AD-converted pixel signals (image data of an infrared light image) (step S12).
- the second data processing unit 24 can read pixel signals from all the second pixels 14, AD-convert only the pixel signals of the second pixels 14 corresponding to the changed first pixels 13, and output the AD-converted pixel signals (image data of an infrared light image). Thereby, the imaging device 1 can reduce the power consumption required for AD conversion.
- alternatively, the second data processing unit 24 can read pixel signals only from the second pixels 14 corresponding to the changed first pixels 13, AD-convert the read pixel signals, and output the AD-converted pixel signals (image data of an infrared light image). As a result, the imaging device 1 can reduce the power consumption required for reading pixel signals.
- as described above, the imaging device 1 effectively uses the information of the visible light image and drives and controls the second pixel array 12 based on the signals photoelectrically converted by the first pixel array 11, so the power consumption of the light source 101 and the imaging device 1 can be reduced.
- FIG. 10 is a flowchart illustrating an example of processing executed by an imaging device according to the present disclosure.
- the image capturing apparatus 1 performs various settings, drives the first pixel array 11 under image capturing conditions based on the external light conditions, and captures a visible light image as a first image (step S101).
- the imaging device 1 photoelectrically converts visible light by the first pixel array 11 and accumulates signal charges (step S102). Subsequently, the imaging device 1 reads signal charges (signals) from the first pixel array 11 and AD-converts the signals (step S103). Subsequently, the imaging device 1 outputs a first image that is a visible light image. At this time, the imaging device 1 can also thin out and output the first pixels 13 from the first pixel array 11 .
- the imaging device 1 analyzes the information of the first image and determines imaging conditions for the second image, which is an infrared light image (step S105). Specifically, the imaging device 1 detects the first pixel 13 with a change, and determines the coordinates (X1, Y1 to X2, Y2) of the scanning range of the second pixel 14 .
- the scanning range determined at this time may be a single pixel, an aggregated area of a plurality of pixels, or a combination thereof.
- for example, when the imaging device 1 detects the position of the subject's face in the visible light image by the first data processing unit 22, it can also determine only the face portion of the image as the scanning range. In addition, since the position of the subject may change over time, the imaging device 1 can track the position of the subject in the visible light image and set a correspondingly larger scanning range.
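The determination of the scanning range coordinates (X1, Y1 to X2, Y2) in step S105 can be sketched as a bounding box over the changed pixels, enlarged by a margin to track a moving subject; the function name and margin parameter are illustrative assumptions.

```python
def scanning_range(changed, margin, width, height):
    """Bounding box (X1, Y1, X2, Y2) of the changed first pixels,
    enlarged by a margin and clipped to the array (illustrative)."""
    xs = [x for x, _ in changed]
    ys = [y for _, y in changed]
    return (max(min(xs) - margin, 0),
            max(min(ys) - margin, 0),
            min(max(xs) + margin, width - 1),
            min(max(ys) + margin, height - 1))

# Two changed pixels, with an 8-pixel margin for subject tracking
roi = scanning_range([(100, 50), (120, 80)], margin=8, width=640, height=480)
```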
- the imaging device 1 performs various settings, drives the second pixel array 12 under imaging conditions based on the external light conditions, and captures an infrared light image (step S106). After that, the imaging device 1 photoelectrically converts the infrared light by the second pixel array 12 and accumulates signal charges (step S107).
- the imaging device 1 reads signal charges (signals) from the second pixels 14 and AD-converts the signals (step S108). Subsequently, the imaging device 1 outputs the signals of the first pixels 13 and/or the second pixels 14 corresponding to the scanning range (X1, Y1 to X2, Y2) to the ISP 103 (step S109), and ends the process.
- FIG. 11 is a flowchart illustrating a modification of the processing executed by an imaging device according to the present disclosure. As shown in FIG. 11, in the process according to the modification, the processes of steps S208 and S209 differ from the processes of steps S108 and S109 shown in FIG. 10. Since the processing of steps S101 to S107 shown in FIG. 11 is the same as that shown in FIG. 10, redundant description will be omitted here.
- the imaging device 1 accumulates signal charges (signals) in the second pixels 14 (step S107), then reads out and AD-converts the signals of the second pixels 14 corresponding to the scanning range (X1, Y1 to X2, Y2) determined in step S105 (step S208).
- the imaging device 1 outputs the signal of the first pixel 13 and/or the second pixel 14 to the ISP 103 (step S209), and ends the process.
- FIG. 12 is an explanatory diagram showing a configuration example of a ranging system according to a modification of the present disclosure.
- the distance measuring system 100a according to the modification differs from the distance measuring system 100 described above in the drive control of the light source 101 by the control unit 40a, and is otherwise similar to the distance measuring system 100. Therefore, the drive control of the light source 101 by the control unit 40a will be described here, and redundant description of the other configurations will be omitted.
- the control unit 40a controls the distance measurement pulse generation unit 50 to raise the emission intensity of the light source 101 above normal, or to raise the operating frequency of the light source 101 above normal.
- the control unit 40 a also adjusts the drive frequency of the second pixel array 12 in accordance with the change in the operating frequency of the light source 101 .
- when image analysis by the first data processing unit 22 reveals that the subject of the visible light image is a face and the face is located closer to the imaging optical system 102 than a preset position, the control unit 40a makes the emission intensity of the light source 101 lower than usual.
- in this way, the control unit 40a can irradiate the subject with appropriate infrared light according to the moving speed of the subject and the distance to the subject, so the second pixel array 12 can capture an infrared light image appropriate for distance measurement.
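As a rough illustration of this drive control, the control unit 40a might select the emission intensity and operating frequency as below; the specific factors (0.5x, 1.5x, 2x) and function names are hypothetical placeholders, not values from the disclosure.

```python
def light_source_drive(face_detected, subject_near,
                       normal_intensity=1.0, normal_freq_hz=20e6):
    """Hypothetical selection of emission intensity and operating
    frequency by the control unit 40a; factors are placeholders."""
    if face_detected and subject_near:
        # A face closer than the preset position: lower than usual
        return 0.5 * normal_intensity, normal_freq_hz
    # Otherwise raise intensity / operating frequency above normal
    return 1.5 * normal_intensity, 2 * normal_freq_hz

intensity, freq = light_source_drive(face_detected=True, subject_near=True)
```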
- FIG. 13 is a flowchart illustrating an example of processing executed by an imaging device according to a modification of the present disclosure.
- as shown in FIG. 13, the processing of steps S305 to S310 differs from the processing of steps S105 to S109 shown in FIG. 10. Since the processing of steps S101 to S104 shown in FIG. 13 is the same as the processing of steps S101 to S104 shown in FIG. 10, redundant description will be omitted here.
- after causing the first pixel array 11 to accumulate signal charges (signals) (step S104), the imaging device 1a analyzes the information of the first image to determine the imaging conditions for the second image, which is an infrared light image.
- the imaging device 1a detects the first pixels 13 with changes, and determines the coordinates (X1, Y1 to X2, Y2) of the scanning range of the second pixels 14. Further, the imaging device 1a determines the emission intensity and emission time (operating frequency) of the light source 101 (step S305). After that, the imaging device 1a causes the light source 101 to emit light with the determined emission intensity and emission time (operating frequency) (step S306).
- the imaging device 1a opens the gate of the second pixel 14 at a timing corresponding to the determined light emitting time (operating frequency) (step S307), closes the gate of the second pixel 14 (step S308), Infrared light received by the second pixels 14 is photoelectrically converted to accumulate charges.
- the imaging device 1a reads the signals of the pixels corresponding to the scanning range (X1, Y1 to X2, Y2) and AD-converts them (step S309). Subsequently, the imaging device 1a outputs the signal of the first pixel 13 and/or the second pixel 14 to the ISP 103 (step S310), and ends the process.
- the imaging device may omit the optical filter 75 shown in FIG. 3. If the optical filter 75 is not provided, both the first pixel array 11 and the second pixel array 12 capture visible light images.
- if the thickness of the photoelectric conversion layer 79 is made thinner than the thickness shown in FIG. 3, the light-receiving sensitivity of the first pixel array 11 is lowered.
- in this case, the imaging device captures a low-sensitivity first image with the first pixel array 11, captures a second image with higher sensitivity than the first image with the second pixel array 12, and outputs the first image and the second image to the ISP 103.
- the ISP 103 can generate an HDR image by HDR (High Dynamic Range) synthesis of the first image and the second image.
- the imaging device can capture a low-sensitivity image and a high-sensitivity image having the same number of pixels. Further, if a filter that transmits infrared light is added to the imaging optical system 102, it becomes possible to generate an infrared light HDR image.
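The HDR synthesis in the ISP 103 can be sketched as choosing the high-sensitivity sample where it is unsaturated and the gain-scaled low-sensitivity sample where it is saturated; the gain ratio and names below are illustrative assumptions, not parameters from the disclosure.

```python
def hdr_merge(low_sens, high_sens, saturation=255):
    """Minimal HDR-synthesis sketch: keep the high-sensitivity second
    image where unsaturated, fall back to the scaled low-sensitivity
    first image elsewhere (illustrative)."""
    gain = 4  # assumed sensitivity ratio between the two pixel arrays
    return [
        [h if h < saturation else l * gain for l, h in zip(lr, hr)]
        for lr, hr in zip(low_sens, high_sens)
    ]

low = [[10, 100]]    # low-sensitivity first image
high = [[40, 255]]   # high-sensitivity second image (one pixel saturated)
merged = hdr_merge(low, high)
```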
- the imaging apparatus can effectively use the information of the visible light image and appropriately reduce the power consumption by performing the same control as described above by the control units 40 and 40a.
- the imaging device 1 has a semiconductor substrate 66 , a first pixel array 11 , a second pixel array 12 and a control section 40 .
- in the first pixel array 11, which is provided on the semiconductor substrate 66 and has a laminated structure in which a first electrode 77, a photoelectric conversion layer 79, and a second electrode 80 are laminated in order, first light-receiving pixels 13 that photoelectrically convert light in a first wavelength region including the visible light region are arranged two-dimensionally.
- in the second pixel array 12, which is provided in the semiconductor substrate 66 at positions overlapping the first light-receiving pixels 13 in the thickness direction of the semiconductor substrate 66, second light-receiving pixels 14 that photoelectrically convert light in a second wavelength region including the infrared light region are arranged two-dimensionally.
- the control unit 40 drives and controls the second pixel array 12 based on the signals photoelectrically converted by the first pixel array 11 . Thereby, the imaging device 1 can effectively use the information of the visible light image.
- the imaging device 1 includes a data processing unit 30 that detects the first pixels 13 with changes from the image data captured by the first pixel array 11 .
- the control unit 40 outputs a signal photoelectrically converted by the second light-receiving pixel 14 corresponding to the first pixel 13 having a change detected by the data processing unit 30 .
- thus, the imaging device 1 outputs only the second light-receiving pixels 14 corresponding to the ROI (Region Of Interest) in the visible light image, so the power consumption required for outputting information can be appropriately reduced.
- the control unit 40 outputs the signal read from the second light-receiving pixel 14 corresponding to the first pixel 13 with a change among all the signals read from the second pixel array 12 .
- the imaging device 1 outputs only the second light-receiving pixels 14 corresponding to the ROI region of interest in the visible image, so power consumption required for outputting information can be appropriately reduced.
- the control unit 40 reads out and outputs photoelectrically converted signals from the second light receiving pixels 14 corresponding to the first pixels 13 with change.
- the imaging device 1 reads out and outputs from the second pixel array 12 only the second light-receiving pixels 14 corresponding to the ROI (region of interest) in the visible light image, so the power consumption required for reading information can be appropriately reduced.
- the control unit 40 starts driving the second pixel array 12 when the data processing unit 30 detects the changed first pixels 13 .
- the imaging device 1 does not drive the second pixel array 12 until the changed first pixels 13 are detected, so that the power consumption of the second pixel array 12 can be reduced.
- the control unit 40 thins out some of the first light-receiving pixels 13 to output photoelectrically converted signals from the first pixel array 11 .
- the imaging device 1 can reduce power consumption required for outputting signals photoelectrically converted by the first pixel array 11 .
- the imaging device 1 has a semiconductor substrate 66 , a first pixel array 11 , a second pixel array 12 and a control section 40 .
- in the first pixel array 11, which is provided on the semiconductor substrate 66 and has a laminated structure in which a first electrode 77, a photoelectric conversion layer 79, and a second electrode 80 are laminated in order, first light-receiving pixels 13 that photoelectrically convert light in a first wavelength region including the visible light region are arranged two-dimensionally.
- in the second pixel array 12, which is provided in the semiconductor substrate 66 at positions overlapping the first light-receiving pixels 13 in the thickness direction of the semiconductor substrate 66, second light-receiving pixels 14 that have a light-receiving sensitivity different from that of the first light-receiving pixels 13 and photoelectrically convert the transmitted light in the first wavelength region are arranged two-dimensionally.
- the control unit 40 drives and controls the second pixel array 12 based on the signals photoelectrically converted by the first pixel array 11 . Thereby, the imaging device 1 can capture a high-sensitivity image and a low-sensitivity image with one pixel array 10 .
- the imaging device 1 has a semiconductor substrate 66 , a first pixel array 11 , a second pixel array 12 and a control section 40 .
- in the first pixel array 11, which is provided on the semiconductor substrate 66 and has a laminated structure in which a first electrode 77, a photoelectric conversion layer 79, and a second electrode 80 are laminated in order, first light-receiving pixels 13 that photoelectrically convert light in a first wavelength region including the visible light region are arranged two-dimensionally.
- the second pixel array 12 is provided on the same plane as the first pixel array 11, and the second light receiving pixels 14 that photoelectrically convert light in the second wavelength region including the infrared light region are arranged two-dimensionally.
- the control unit 40 drives and controls the second pixel array 12 based on the signals photoelectrically converted by the first pixel array 11 .
- the imaging device 1 can effectively use the information of the visible light image even when the first pixel array 11 and the second pixel array 12 are arranged on the same plane.
- the ranging system 100a includes a light source 101 and an imaging device 1a.
- a light source 101 emits infrared light.
- the imaging device 1a captures an image of a subject irradiated with infrared light, and measures the distance to the subject based on the captured image.
- the imaging device 1a has a semiconductor substrate 66, a first pixel array 11, a second pixel array 12, and a controller 40a.
- in the first pixel array 11, which is provided on the semiconductor substrate 66 and has a laminated structure in which a first electrode 77, a photoelectric conversion layer 79, and a second electrode 80 are laminated in order, first light-receiving pixels 13 that photoelectrically convert light in a first wavelength region including the visible light region are arranged two-dimensionally.
- in the second pixel array 12, which is provided in the semiconductor substrate 66 at positions overlapping the first light-receiving pixels 13 in the thickness direction of the semiconductor substrate 66, second light-receiving pixels 14 that photoelectrically convert light in a second wavelength region including the infrared light region are arranged two-dimensionally.
- the control unit 40a drives and controls the second pixel array 12 and the light source 101 based on the signals photoelectrically converted by the first pixel array 11, and measures the distance to the subject based on the signals photoelectrically converted by the second light-receiving pixels 14.
- the distance measurement system 100a can improve the distance measurement accuracy by effectively using the information of the visible light image and appropriately driving and controlling the light source 101 .
- the present technology can also take the following configuration.
- a semiconductor substrate;
- a first pixel array in which first light-receiving pixels are arranged two-dimensionally, each first light-receiving pixel being provided on the semiconductor substrate, having a laminated structure in which a first electrode, a photoelectric conversion layer, and a second electrode are laminated in this order, and photoelectrically converting light in a first wavelength region including a visible light region;
- a second pixel array in which second light-receiving pixels are arranged two-dimensionally, each second light-receiving pixel being provided in the semiconductor substrate at a position overlapping the first light-receiving pixels in the thickness direction of the semiconductor substrate and photoelectrically converting light in a second wavelength region including an infrared light region; and
- a controller that drives and controls the second pixel array based on signals photoelectrically converted by the first pixel array.
- a data processing unit detects, from the image data captured by the first pixel array, first light-receiving pixels exhibiting a change.
- the control unit causes output of the signals photoelectrically converted by the second light-receiving pixels corresponding to the changed first light-receiving pixels (the imaging device according to (1)).
- the control unit causes output of, among all signals read from the second pixel array, the signals read from the second light-receiving pixels corresponding to the changed first light-receiving pixels (the imaging device according to (2)).
- among the signals photoelectrically converted by the second pixel array, the signals from the second light-receiving pixels corresponding to the changed first light-receiving pixels are read out and output.
- a first pixel array in which first light-receiving pixels are arranged two-dimensionally, each first light-receiving pixel being provided on the semiconductor substrate, having a laminated structure in which a first electrode, a photoelectric conversion layer, and a second electrode are laminated in this order, and photoelectrically converting light in a first wavelength region including a visible light region;
- a second pixel array in which second light-receiving pixels are arranged two-dimensionally, each being provided in the semiconductor substrate at a position overlapping the first light-receiving pixels in the thickness direction of the semiconductor substrate and photoelectrically converting light in a second wavelength region including an infrared light region; and a controller that drives and controls the second pixel array based on signals photoelectrically converted by the first pixel array.
- a first pixel array in which first light-receiving pixels are arranged two-dimensionally, each first light-receiving pixel being provided on the semiconductor substrate, having a laminated structure in which a first electrode, a photoelectric conversion layer, and a second electrode are laminated in this order, and photoelectrically converting light in a first wavelength region including a visible light region;
- a second pixel array provided on the same plane as the first pixel array, in which second light-receiving pixels that photoelectrically convert light in a second wavelength region including an infrared light region are arranged two-dimensionally; and a controller that drives and controls the second pixel array based on signals photoelectrically converted by the first pixel array.
- a light source that emits infrared light
- an imaging device that captures an image of a subject irradiated with the infrared light and measures the distance to the subject based on the captured image
- the imaging device has a semiconductor substrate;
- a first pixel array in which first light-receiving pixels are arranged two-dimensionally, each first light-receiving pixel being provided on the semiconductor substrate, having a laminated structure in which a first electrode, a photoelectric conversion layer, and a second electrode are laminated in this order, and photoelectrically converting light in a first wavelength region including a visible light region;
- and a controller; the ranging system comprises these elements.
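The change-triggered readout summarized in the bullets above — detect changed visible-light pixels, then output only the corresponding infrared-pixel signals — can be illustrated with a short sketch. This is an illustration under assumptions, not the patent's implementation: the array shapes, the threshold value, and the function names are invented for the example.

```python
import numpy as np

def changed_pixel_mask(prev_visible, curr_visible, threshold=16):
    """Flag first light-receiving pixels whose value changed between two frames."""
    return np.abs(curr_visible.astype(int) - prev_visible.astype(int)) > threshold

def selective_ir_readout(ir_frame, mask):
    """Output only the second-pixel (infrared) signals at the flagged positions."""
    rows, cols = np.nonzero(mask)
    return [(r, c, int(ir_frame[r, c])) for r, c in zip(rows, cols)]

prev = np.zeros((4, 4), dtype=np.uint8)
curr = prev.copy()
curr[1, 2] = 200                       # one visible-light pixel changes
ir = np.arange(16, dtype=np.uint16).reshape(4, 4)

mask = changed_pixel_mask(prev, curr)
print(selective_ir_readout(ir, mask))  # only the IR sample at the changed position
```

Reading out only the changed positions is what lets such a scheme reduce the data volume and power of the infrared array compared with always reading the full frame.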
Description
FIG. 1 is an explanatory diagram illustrating a configuration example of a ranging system according to the present disclosure. The ranging system 100 shown in FIG. 1 is, for example, a system that implements a d(direct)TOF sensor, which captures an infrared image (hereinafter sometimes referred to as a "distance image") of a subject irradiated with infrared light and measures the distance to the subject based on the time of flight (Time Of Flight: TOF) of the light. The ranging system 100 may instead be a system that implements i(indirect)ToF.
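As a back-of-the-envelope illustration of the TOF principle mentioned above (general physics, not code from the patent): direct TOF halves the round-trip time of the emitted infrared pulse, while indirect TOF recovers distance from the phase shift of modulated light. The modulation frequency below is an arbitrary example value.

```python
import math

SPEED_OF_LIGHT = 299_792_458.0  # m/s

def dtof_distance(round_trip_time_s: float) -> float:
    """Direct TOF: the pulse travels to the subject and back, so halve the path."""
    return SPEED_OF_LIGHT * round_trip_time_s / 2.0

def itof_distance(phase_rad: float, mod_freq_hz: float) -> float:
    """Indirect TOF: distance from the phase shift of the modulated infrared light."""
    return SPEED_OF_LIGHT * phase_rad / (4.0 * math.pi * mod_freq_hz)

# A pulse returning after 20 ns corresponds to roughly 3 m.
print(f"{dtof_distance(20e-9):.3f} m")
print(f"{itof_distance(math.pi, 10e6):.3f} m")
```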
FIG. 2 is an explanatory diagram illustrating a configuration example of an imaging device according to the present disclosure. As shown in FIG. 2, the imaging device 1 includes a pixel array 10, a global control circuit 41, a first row control circuit 42, a second row control circuit 43, a first readout circuit 21, a first data processing unit 22, a second readout circuit 23, and a second data processing unit 24. The global control circuit 41, the first row control circuit 42, and the second row control circuit 43 are included in the control unit 40 shown in FIG. 1.
Next, the cross-sectional structure of the pixel array according to the present disclosure will be described with reference to FIG. 3. FIG. 3 is an explanatory cross-sectional view of the pixel array according to the present disclosure, schematically showing a cross section of the portion of the pixel array 10 corresponding to one second pixel 14. Here, for convenience, the positive Z-axis direction of the orthogonal coordinate system shown in FIG. 3 is referred to as "up" and the negative Z-axis direction as "down".
FIG. 4 is a circuit diagram showing an example of the first readout circuit according to the present disclosure. The first readout circuit 21 has, for example, a floating diffusion FD, a reset transistor RST, an amplification transistor AMP, and a selection transistor SEL. The floating diffusion FD is connected between the first electrode 77 and the amplification transistor AMP; it converts the signal charge transferred by the first electrode 77 into a voltage signal and outputs it to the amplification transistor AMP.
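The charge-to-voltage conversion performed at the floating diffusion FD follows the textbook relation V = Q / C_FD. The patent gives no numbers; the 1 fF node capacitance below is an illustrative assumption used only to show the order of magnitude of the conversion gain.

```python
ELEMENTARY_CHARGE = 1.602176634e-19  # coulombs per electron

def fd_voltage(num_electrons: int, fd_capacitance_f: float = 1e-15) -> float:
    """Voltage step at the floating diffusion for a transferred signal charge."""
    return num_electrons * ELEMENTARY_CHARGE / fd_capacitance_f

# 1000 electrons on a 1 fF node give about 160 mV before amplification.
print(f"{fd_voltage(1000) * 1e3:.1f} mV")
```

Equivalently, a 1 fF floating diffusion yields a conversion gain of roughly 160 µV per electron, which the amplification transistor AMP then buffers onto the column line.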
Next, examples of the arrangement of the pixels in plan view according to the present disclosure will be described with reference to FIGS. 5 to 8, which are explanatory diagrams showing these arrangement examples. As shown in FIG. 5, a plurality of on-chip lenses 65 are arranged two-dimensionally (in a matrix) in the uppermost layer of the pixel array 10. In a layer below the on-chip lenses 65, as shown in FIG. 6, a plurality of color filters 63 are likewise arranged two-dimensionally (in a matrix).
Next, an operation example of the imaging device according to the present disclosure will be described with reference to FIG. 9, which is an explanatory diagram of that operation example. As shown in FIG. 9, the imaging device 1 first performs first pixel control by the control unit 40 (step S1) and drives the first pixel array 11 (step S2). The first pixel array 11 thereby photoelectrically converts incident light into signal charges in the first pixels 13 and accumulates the signal charges.
Next, an example of processing executed by the imaging device according to the present disclosure will be described with reference to FIG. 10, which is a flowchart of that processing. As shown in FIG. 10, the imaging device 1 performs various settings and drives the first pixel array 11 under imaging conditions based on the ambient light conditions to capture a visible light image as the first image (step S101).
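The flow of FIGS. 9 and 10 — drive the first (visible-light) array, analyze its output, and only then drive the second (infrared) array — can be summarized in a short control-loop sketch. The class and function names are illustrative assumptions, not identifiers from the patent.

```python
def capture_cycle(first_array, second_array, detect_change):
    """One cycle: visible capture first; infrared capture only when a change is seen."""
    visible = first_array.capture()             # drive the first pixel array (S1-S2 / S101)
    if detect_change(visible):                  # data processing unit inspects the visible frame
        return visible, second_array.capture()  # control unit then drives the second array
    return visible, None

class FakeArray:
    """Stand-in for a pixel array that returns a fixed frame."""
    def __init__(self, frame):
        self.frame = frame
    def capture(self):
        return self.frame

vis, ir = capture_cycle(FakeArray("visible frame"), FakeArray("ir frame"),
                        detect_change=lambda frame: True)
print(vis, ir)
```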
Next, a modification of the processing executed by the imaging device according to the present disclosure will be described with reference to FIG. 11, which is a flowchart of the modified processing. As shown in FIG. 11, the processing according to the modification differs from the processing shown in FIG. 10 in that steps S208 and S209 replace steps S108 and S109. Since steps S101 to S107 in FIG. 11 are identical to steps S101 to S107 in FIG. 10, redundant description is omitted here.
Next, a ranging system according to a modification will be described. FIG. 12 is an explanatory diagram illustrating a configuration example of the ranging system according to the modification of the present disclosure.
Next, an example of processing executed by the imaging device according to the modification of the present disclosure will be described with reference to FIG. 13, which is a flowchart of that processing example.
The embodiment described above is merely an example, and various modifications are possible. For example, the imaging device according to the present disclosure may omit the optical filter 75 shown in FIG. 3. When the optical filter 75 is not provided, both the first pixel array 11 and the second pixel array 12 capture visible light images. In that case, if, for example, the photoelectric conversion layer 79 is made thinner than the thickness shown in FIG. 3, the light-receiving sensitivity of the first pixel array 11 decreases.
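The trade-off noted above — a thinner photoelectric conversion layer 79 absorbs less light and therefore lowers sensitivity — follows the Beer-Lambert absorption law, a general optics relation rather than a formula given in the patent. The absorption coefficient below is an arbitrary example value.

```python
import math

def absorbed_fraction(thickness_um: float, alpha_per_um: float = 0.5) -> float:
    """Fraction of incident light absorbed in a layer of the given thickness."""
    return 1.0 - math.exp(-alpha_per_um * thickness_um)

# Halving the layer thickness from 2 um to 1 um reduces the absorbed fraction,
# which is why the first pixel array's light-receiving sensitivity would drop.
print(f"{absorbed_fraction(2.0):.3f} -> {absorbed_fraction(1.0):.3f}")
```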
The imaging device 1 has a semiconductor substrate 66, a first pixel array 11, a second pixel array 12, and a control unit 40. The first pixel array 11 is provided on the semiconductor substrate 66, has a laminated structure in which a first electrode 77, a photoelectric conversion layer 79, and a second electrode 80 are laminated in this order, and includes first light-receiving pixels 13, arranged two-dimensionally, that photoelectrically convert light in a first wavelength region including the visible light region. The second pixel array 12 is provided in the semiconductor substrate 66 at a position overlapping the first light-receiving pixels 13 in the thickness direction of the semiconductor substrate 66, and includes second light-receiving pixels 14, arranged two-dimensionally, that photoelectrically convert light in a second wavelength region including the infrared light region. The control unit 40 drives and controls the second pixel array 12 based on the signals photoelectrically converted by the first pixel array 11. The imaging device 1 can thereby effectively use the information of the visible light image.
(1)
A semiconductor substrate;
a first pixel array in which first light-receiving pixels are arranged two-dimensionally, each first light-receiving pixel being provided on the semiconductor substrate, having a laminated structure in which a first electrode, a photoelectric conversion layer, and a second electrode are laminated in this order, and photoelectrically converting light in a first wavelength region including a visible light region;
a second pixel array in which second light-receiving pixels are arranged two-dimensionally, each second light-receiving pixel being provided in the semiconductor substrate at a position overlapping the first light-receiving pixels in the thickness direction of the semiconductor substrate and photoelectrically converting light in a second wavelength region including an infrared light region; and
a control unit that drives and controls the second pixel array based on signals photoelectrically converted by the first pixel array;
an imaging device having the above.
(2)
The imaging device according to (1) above, further comprising a data processing unit that detects, from image data captured by the first pixel array, first light-receiving pixels exhibiting a change,
wherein the control unit causes output of signals photoelectrically converted by the second light-receiving pixels corresponding to the changed first light-receiving pixels.
(3)
The imaging device according to (2) above, wherein the control unit causes output of, among all signals read from the second pixel array, the signals read from the second light-receiving pixels corresponding to the changed first light-receiving pixels.
(4)
The imaging device according to (2) above, wherein, among the signals photoelectrically converted by the second pixel array, the signals photoelectrically converted by the second light-receiving pixels corresponding to the changed first light-receiving pixels are read out and output.
(5)
The imaging device according to any one of (2) to (4) above, wherein the control unit starts driving the second pixel array when the data processing unit detects a changed first light-receiving pixel.
(6)
The imaging device according to any one of (1) to (5) above, wherein the control unit causes signals photoelectrically converted by the first pixel array to be output while thinning out some of the first light-receiving pixels.
(7)
An imaging device having: a semiconductor substrate; a first pixel array in which first light-receiving pixels are arranged two-dimensionally, each first light-receiving pixel being provided on the semiconductor substrate, having a laminated structure in which a first electrode, a photoelectric conversion layer, and a second electrode are laminated in this order, and photoelectrically converting light in a first wavelength region including a visible light region; a second pixel array in which second light-receiving pixels are arranged two-dimensionally, each second light-receiving pixel being provided in the semiconductor substrate at a position overlapping the first light-receiving pixels in the thickness direction of the semiconductor substrate, differing from the first light-receiving pixels in light-receiving sensitivity, and photoelectrically converting light in the first wavelength region that has passed through the first pixel array; and a control unit that drives and controls the second pixel array based on signals photoelectrically converted by the first pixel array.
(8)
An imaging device having: a semiconductor substrate; a first pixel array in which first light-receiving pixels are arranged two-dimensionally, each first light-receiving pixel being provided on the semiconductor substrate, having a laminated structure in which a first electrode, a photoelectric conversion layer, and a second electrode are laminated in this order, and photoelectrically converting light in a first wavelength region including a visible light region; a second pixel array provided on the same plane as the first pixel array, in which second light-receiving pixels that photoelectrically convert light in a second wavelength region including an infrared light region are arranged two-dimensionally; and a control unit that drives and controls the second pixel array based on signals photoelectrically converted by the first pixel array.
(9)
A ranging system including: a light source that emits infrared light; and an imaging device that captures an image of a subject irradiated with the infrared light and measures the distance to the subject based on the captured image, the imaging device having: a semiconductor substrate; a first pixel array in which first light-receiving pixels are arranged two-dimensionally, each first light-receiving pixel being provided on the semiconductor substrate, having a laminated structure in which a first electrode, a photoelectric conversion layer, and a second electrode are laminated in this order, and photoelectrically converting light in a first wavelength region including a visible light region; a second pixel array in which second light-receiving pixels are arranged two-dimensionally, each second light-receiving pixel being provided in the semiconductor substrate at a position overlapping the first light-receiving pixels in the thickness direction of the semiconductor substrate and photoelectrically converting light in a second wavelength region including an infrared light region; and a control unit that drives and controls the second pixel array and the light source based on signals photoelectrically converted by the first pixel array, and measures the distance to the subject based on signals photoelectrically converted by the second light-receiving pixels.
1, 1a imaging device
10 pixel array
11 first pixel array
12 second pixel array
13 first pixel
14 second pixel
20 AD conversion unit
21 first readout circuit
22 first data processing unit
23 second readout circuit
24 second data processing unit
30 data processing unit
40 control unit
41 global control circuit
42 first row control circuit
43 second row control circuit
50 ranging pulse generation unit
63 color filter
65 on-chip lens
66 semiconductor substrate
67 photoelectric conversion region
75 optical filter
77 first electrode
78 semiconductor layer
79 photoelectric conversion layer
80 second electrode
81 insulating film
82 charge accumulation electrode
101 light source
102 imaging optical system
103 ISP
104 input device
105 display device
106 data storage unit
Claims (9)
- An imaging device comprising: a semiconductor substrate; a first pixel array in which first light-receiving pixels are arranged two-dimensionally, each first light-receiving pixel being provided on the semiconductor substrate, having a laminated structure in which a first electrode, a photoelectric conversion layer, and a second electrode are laminated in this order, and photoelectrically converting light in a first wavelength region including a visible light region; a second pixel array in which second light-receiving pixels are arranged two-dimensionally, each second light-receiving pixel being provided in the semiconductor substrate at a position overlapping the first light-receiving pixels in the thickness direction of the semiconductor substrate and photoelectrically converting light in a second wavelength region including an infrared light region; and a control unit that drives and controls the second pixel array based on signals photoelectrically converted by the first pixel array.
- The imaging device according to claim 1, further comprising a data processing unit that detects, from image data captured by the first pixel array, first light-receiving pixels exhibiting a change, wherein the control unit causes output of signals photoelectrically converted by the second light-receiving pixels corresponding to the changed first light-receiving pixels detected by the data processing unit.
- The imaging device according to claim 2, wherein the control unit causes output of, among all signals read from the second pixel array, the signals read from the second light-receiving pixels corresponding to the changed first light-receiving pixels.
- The imaging device according to claim 2, wherein the control unit causes, among the signals photoelectrically converted by the second pixel array, the signals photoelectrically converted by the second light-receiving pixels corresponding to the changed first light-receiving pixels to be read out and output.
- The imaging device according to claim 2, wherein the control unit starts driving the second pixel array when the data processing unit detects a changed first light-receiving pixel.
- The imaging device according to claim 1, wherein the control unit causes signals photoelectrically converted by the first pixel array to be output while thinning out some of the first light-receiving pixels.
- An imaging device comprising: a semiconductor substrate; a first pixel array in which first light-receiving pixels are arranged two-dimensionally, each first light-receiving pixel being provided on the semiconductor substrate, having a laminated structure in which a first electrode, a photoelectric conversion layer, and a second electrode are laminated in this order, and photoelectrically converting light in a first wavelength region including a visible light region; a second pixel array in which second light-receiving pixels are arranged two-dimensionally, each second light-receiving pixel being provided in the semiconductor substrate at a position overlapping the first light-receiving pixels in the thickness direction of the semiconductor substrate, differing from the first light-receiving pixels in light-receiving sensitivity, and photoelectrically converting light in the first wavelength region that has passed through the first pixel array; and a control unit that drives and controls the second pixel array based on signals photoelectrically converted by the first pixel array.
- An imaging device comprising: a semiconductor substrate; a first pixel array in which first light-receiving pixels are arranged two-dimensionally, each first light-receiving pixel being provided on the semiconductor substrate, having a laminated structure in which a first electrode, a photoelectric conversion layer, and a second electrode are laminated in this order, and photoelectrically converting light in a first wavelength region including a visible light region; a second pixel array provided on the same plane as the first pixel array, in which second light-receiving pixels that photoelectrically convert light in a second wavelength region including an infrared light region are arranged two-dimensionally; and a control unit that drives and controls the second pixel array based on signals photoelectrically converted by the first pixel array.
- A ranging system comprising: a light source that emits infrared light; and an imaging device that captures an image of a subject irradiated with the infrared light and measures the distance to the subject based on the captured image, wherein the imaging device has: a semiconductor substrate; a first pixel array in which first light-receiving pixels are arranged two-dimensionally, each first light-receiving pixel being provided on the semiconductor substrate, having a laminated structure in which a first electrode, a photoelectric conversion layer, and a second electrode are laminated in this order, and photoelectrically converting light in a first wavelength region including a visible light region; a second pixel array in which second light-receiving pixels are arranged two-dimensionally, each second light-receiving pixel being provided in the semiconductor substrate at a position overlapping the first light-receiving pixels in the thickness direction of the semiconductor substrate and photoelectrically converting light in a second wavelength region including an infrared light region; and a control unit that drives and controls the second pixel array and the light source based on signals photoelectrically converted by the first pixel array, and measures the distance to the subject based on signals photoelectrically converted by the second light-receiving pixels.
Priority Applications (6)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP22766828.2A EP4307378A1 (en) | 2021-03-12 | 2022-02-22 | Imaging device and ranging system |
DE112022001551.9T DE112022001551T5 (de) | 2021-03-12 | 2022-02-22 | Abbildungsvorrichtung und Entfernungsmesssystem |
JP2023505277A JPWO2022190867A1 (ja) | 2021-03-12 | 2022-02-22 | |
KR1020237029303A KR20230156694A (ko) | 2021-03-12 | 2022-02-22 | 촬상 장치 및 측거 시스템 |
CN202280019409.0A CN116941040A (zh) | 2021-03-12 | 2022-02-22 | 成像设备和测距系统 |
US18/546,684 US20240145496A1 (en) | 2021-03-12 | 2022-02-22 | Imaging device and ranging system |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2021-040358 | 2021-03-12 | ||
JP2021040358 | 2021-03-12 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022190867A1 true WO2022190867A1 (ja) | 2022-09-15 |
Family
ID=83227724
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2022/007398 WO2022190867A1 (ja) | 2021-03-12 | 2022-02-22 | 撮像装置および測距システム |
Country Status (8)
Country | Link |
---|---|
US (1) | US20240145496A1 (ja) |
EP (1) | EP4307378A1 (ja) |
JP (1) | JPWO2022190867A1 (ja) |
KR (1) | KR20230156694A (ja) |
CN (1) | CN116941040A (ja) |
DE (1) | DE112022001551T5 (ja) |
TW (1) | TW202245464A (ja) |
WO (1) | WO2022190867A1 (ja) |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
- JP2015050331A (ja) * | 2013-09-02 | 2015-03-16 | Sony Corporation | Solid-state imaging element, method for manufacturing same, and electronic apparatus |
- JP2016005189A (ja) * | 2014-06-18 | 2016-01-12 | Olympus Corporation | Imaging element and imaging device |
- JP2018136123A (ja) * | 2015-06-24 | 2018-08-30 | Murata Manufacturing Co., Ltd. | Distance sensor and user interface device |
- JP2018142838A (ja) | 2017-02-27 | 2018-09-13 | Japan Broadcasting Corporation (NHK) | Imaging element, imaging device, and photographing device |
- JP2021022875A (ja) * | 2019-07-29 | 2021-02-18 | Canon Inc. | Photoelectric conversion device, photoelectric conversion system, and moving body |
-
2022
- 2022-02-22 KR KR1020237029303A patent/KR20230156694A/ko unknown
- 2022-02-22 US US18/546,684 patent/US20240145496A1/en active Pending
- 2022-02-22 EP EP22766828.2A patent/EP4307378A1/en active Pending
- 2022-02-22 JP JP2023505277A patent/JPWO2022190867A1/ja active Pending
- 2022-02-22 DE DE112022001551.9T patent/DE112022001551T5/de active Pending
- 2022-02-22 CN CN202280019409.0A patent/CN116941040A/zh active Pending
- 2022-02-22 WO PCT/JP2022/007398 patent/WO2022190867A1/ja active Application Filing
- 2022-03-04 TW TW111107905A patent/TW202245464A/zh unknown
Also Published As
Publication number | Publication date |
---|---|
CN116941040A (zh) | 2023-10-24 |
KR20230156694A (ko) | 2023-11-14 |
JPWO2022190867A1 (ja) | 2022-09-15 |
TW202245464A (zh) | 2022-11-16 |
EP4307378A1 (en) | 2024-01-17 |
US20240145496A1 (en) | 2024-05-02 |
DE112022001551T5 (de) | 2024-01-18 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10903279B2 (en) | Solid state image sensor pixel electrode below a photoelectric conversion film | |
US11411034B2 (en) | Solid-state imaging device and electronic apparatus | |
- JP5651976B2 (ja) | Solid-state imaging element, method for manufacturing same, and electronic apparatus | |
US8294880B2 (en) | Distance measuring sensor including double transfer gate and three dimensional color image sensor including the distance measuring sensor | |
US7741689B2 (en) | Photoelectric conversion layer-stacked solid-state imaging element | |
US10536659B2 (en) | Solid-state image capturing element, manufacturing method therefor, and electronic device | |
US7498624B2 (en) | Solid-state imaging device | |
- JP2002134729A (ja) | Solid-state imaging device and driving method thereof | |
- WO2018105334A1 (ja) | Solid-state imaging element and electronic device | |
US11594568B2 (en) | Image sensor and electronic device | |
US20090294815A1 (en) | Solid state imaging device including a semiconductor substrate on which a plurality of pixel cells have been formed | |
- WO2015198876A1 (ja) | Imaging element and electronic device | |
- WO2022190867A1 (ja) | Imaging device and ranging system | |
- JP2020027937A (ja) | Solid-state imaging device, method for manufacturing solid-state imaging device, and electronic apparatus | |
US20230420482A1 (en) | Light receiving element and electronic apparatus | |
US10804303B2 (en) | Image sensors comprising an organic photo-detector, a photo-detector array and dual floating diffusion nodes and electronic devices including the same | |
US20240178254A1 (en) | Light-receiving element and electronic apparatus | |
US20240162271A1 (en) | Image sensor arrangement, image sensor device and method for operating an image sensor arrangement | |
US20240192054A1 (en) | Light detection apparatus and electronic device | |
- JP2011204991A (ja) | Solid-state imaging element, method for manufacturing same, and electronic apparatus | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 22766828 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2023505277 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 18546684 Country of ref document: US |
|
WWE | Wipo information: entry into national phase |
Ref document number: 202280019409.0 Country of ref document: CN |
|
WWE | Wipo information: entry into national phase |
Ref document number: 112022001551 Country of ref document: DE Ref document number: 2022766828 Country of ref document: EP |
|
ENP | Entry into the national phase |
Ref document number: 2022766828 Country of ref document: EP Effective date: 20231012 |